The surprising lessons and truths of a nightmarish decade
On May 1, 2011, U.S. Navy SEALs put one bullet through the chest and one through the head of Osama bin Laden—nine years, seven months, and 20 days after al-Qaeda killed nearly 3,000 people in the name of Islam. Historical eras are rarely framed as neatly as this. Though not precisely a decade after 9/11, the secret mission in Pakistan on May 1 was close enough to impose some poetic shape on the period in which the United States first fought back against Islamist terrorism.
Within minutes, discussion of Bin Laden’s death was dominated by a term not common to war-making or foreign policy, but one crucial to wellness and pop-psychology spheres: closure. “New Yorkers have waited nearly 10 years for this news,” said New York City Mayor Michael Bloomberg. “It is my hope that it will bring some closure and comfort to all those who lost loved ones on September 11, 2001.” New York Senator Chuck Schumer sounded a similar note: “This at least brings some measure of closure and consolation to the victims and their families.” Across the Hudson, in New Jersey, Governor Chris Christie commented on the “extraordinary sense of closure” brought about by the killing.
The “closure” meme soon spread beyond the realm of tristate politicians. “Former U.S. President Bill Clinton says Osama bin Laden’s death is an opportunity for closure,” reported Reuters the next day. Pundits and correspondents were also on the same page. “Osama bin Laden’s Death Brings Closure” read the May 2 headline of David Paul Kuhn’s opinion article on the agenda-setting website realclearpolitics.com. Wire stories from overseas outlets, such as France’s AFP, declared, “Bin Laden’s death brings closure.” Mental-health experts were brought before national television audiences to explain the psychological implications of killing the most wanted terrorist on the planet. During a primetime segment on CNN Headline News, above a digital banner reading “Emotional Reactions to Bin Laden’s Death,” a news anchor asked her guest, a clinical psychologist, “Do you believe in closure?”
It is telling that much of the discussion concerned the nation’s feelings. To be sure, the emotional response to 9/11 has helped define the past decade—thousands dead who were loved by tens or hundreds of thousands in turn, a sense of national vulnerability to foreign attack entirely new for Americans to grapple with, and the immortal bravery of the passengers and crew of United Flight 93. Perhaps we could have done without the psychobabble, but the fact that we discussed the killing of Bin Laden as a means of providing a national catharsis is evidence of a notable American achievement. We could afford to concentrate on the state of our psyches—rather than the fear of instant reprisal—because American policies and actions had kept the homeland safe from attack for a decade.
Over the course of the 10 years, American authorities foiled more than two dozen al-Qaeda plots. Those averted tragedies were not foremost on the minds of revelers who gathered to celebrate Bin Laden’s demise on May 1 at Ground Zero, Times Square, and in front of the White House. But if a mere few of the plots had materialized, those spaces might not even have been open to public assembly.
Not only have U.S. authorities managed to keep America safe from al-Qaeda for a decade; by the time he was killed, Osama bin Laden was barely a leader. Among the items recovered at his compound in Abbottabad were some recent writings, in which the former icon lamented al-Qaeda’s dramatically sinking stock and pondered organizational rebranding as a possible antidote.
His growing insignificance as a global player was not the product of chance. The marginalization of the world’s principal jihadist was the result of audacious American policy—indeed, the most controversial and hotly debated policy undertaken in the wake of 9/11. In the words of Reuel Marc Gerecht writing in the Wall Street Journal, “the war in Iraq was Bin Laden’s great moral undoing.” In his desperate attempt to drive American fighting forces out of Mesopotamia, Bin Laden sanctioned a bloody civil war in Iraq in 2005 and 2006. The carnage failed to repel the United States, but in the end, the countrywide slaughter of Muslims proved too much to bear for al-Qaeda’s own one-time and would-be supporters. The “Sunni awakening” that helped transform Iraq was an awakening out of al-Qaeda jihadism, and the blow it delivered to Bin Laden’s ambitions was stunning.
After the turnaround in Iraq, the landscape of the Muslim world underwent even greater changes—with ordinary Muslims rising to revolt against Persian and Arab tyranny, not against American hegemony. As Fouad Ajami has written: “The Arab Spring has simply overwhelmed the world of the jihadists. In Tunisia, Egypt, Libya, Bahrain, and Syria, younger people—hurled into politics by the economic and political failures all around them—are attempting to create a new political framework, to see if a way could be found out of the wreckage that the authoritarian states have bequeathed them.”
It was the Freedom Agenda of the George W. Bush administration—delineated and formulated as a conscious alternative to jihadism—that showed the way. Indeed, the costly American nation-building in Iraq has now led to the creation of the world’s first and only functioning democratic Arab state. One popular indictment of Bush maintains that he settled on the Freedom Agenda as justification for war after U.S. forces and inspectors found no Iraqi weapons of mass destruction. The record shows otherwise. “A free Iraq can be a source of hope for all the Middle East,” he said before the invasion, in February 2003. “Iraq can be an example of progress and prosperity in a region that needs both.”
And something of the kind has come to pass. “One despot fell in 2003,” Ajami has said. “We decapitated him. Two despots, in Tunisia and Egypt, fell, and there is absolutely a direct connection between what happened in Iraq in 2003 and what’s happening today throughout the rest of the Arab world.”
Thus, there are three intertwined achievements that have proved to be the dispositive features of American success in the war on terror: formulating the Freedom Agenda in the Middle East, reversing the course of the war in Iraq, and establishing a national-security apparatus to foil multiple terrorist attacks. It is no coincidence that they are also the most controversial foreign policies America has implemented since the Vietnam War.
September 11 was a hinge moment in American history. The attacks plunged the nation into a full-scale war against non-state entities. Any adequate American response had to break with the approaches of previous conflicts. War could not be waged on parties inside states in the same way it had been waged on states themselves. Prisoners captured on a battlefield in a country not their own and with no interest in following the rules of conventional war could not be handled as they had been. Getting the edge on Islamist terror would mean fundamentally rethinking our approach to both the blunting of deadly threats and the shuttering of the political hothouses of the Middle East in which such threats thrive.
The adoption of these unprecedented and uncompromising means of war inspired animated debate in the United States. In fighting the war on terror, we have been told, America has become—depending on the accuser—either too dismissive or too enamored of democracy. Some on the left think our national-security apparatus undermines our defining ideals. On the right, outraged voices condemn our naive enthusiasm for helping to secure liberty for Muslims abroad, calling it a form of multicultural self-sabotage. After civil war seized post-invasion Iraq, critics from across the ideological spectrum denounced our misguided effort. The fits and starts and frustrations of the war decade have this one thing in common: we have done battle in an age when spectacular setbacks appear to provide irrefutable evidence of our own baseness and incompetence—a few years before drab good news arrives to refute both expert opinion and common knowledge.
The arguments that we have prosecuted the war on terror immorally and ineffectually are important, and they deserve the respectful hearing they have received, even if many of those making them have resorted to hurling the most abject slanders at those who believe the war on terror is just and has been fought honorably. To be sure, not everything the United States has done in the war on terror has been correct. Far from it. As Winston Churchill said, “War is mainly a catalogue of blunders.” In the fight against Islamist terrorism, American blunders have come in all shapes and sizes, and in truth there are few small wartime miscalculations. This is especially so in an age of instant global headlines.
We continue to suffer for our biggest mistakes. Concerning the failure to catch Bin Laden and make serious efforts to nation-build early in the Afghanistan war, inaccurate intelligence about Saddam Hussein’s weapons, and the Pentagon’s ill-preparedness for the Iraqi insurgency, there can be no absolution. These errors have cost the country tragic sums in money, credibility, and life. They also set our efforts back precious years.
But these blunders, great as they are, have not undone America’s outstanding accomplishments. Ten years ago, the most delusional optimist among us would not have predicted the irrelevancy of Osama bin Laden or a decade without another al-Qaeda attack, let alone a democratic Iraq and a transformative explosion of antiauthoritarianism in the Middle East.
Nor do American achievements in this war mean we are in a position to quit the fight. The notion that America achieved closure with Bin Laden’s killing suggests to some, perhaps even the occupant of the White House, that the war on terror has had its decade and the United States can now move on. “America, it is time to focus on nation-building here at home,” said Barack Obama this summer as he announced a sizable drawdown of troops in Afghanistan for the fall of 2012. The suggestion that our work is done has traction only because resolute American action at home and abroad has provided a sense of security so pervasive it now goes unquestioned.
The United States has fallen prey to false comfort in the past. So before we submit to the siren song of closure, we would do well to recall that such complacency is exactly where this war began—and that our retaining some genuine measure of security has been the result of thinking and acting more boldly than we have in generations.
2. Defining the Threat
When the first plane, American Airlines Flight 11, crashed into the north tower of the World Trade Center, America was savoring the final moments of its post–Cold War repose. President George H.W. Bush had described reality after the Cold War as “a world quite different from the one we’ve known,” one in which countries “East and West, North and South, can prosper and live in harmony.” Bill Clinton had called it “a world warmed by the sunshine of freedom.” Not only had the United States enjoyed a decade as the uncontested global hyperpower, but the very notion of tensions among great powers had undergone a cheerful reassessment. As Robert Kagan wrote in The Return of History and the End of Dreams, “The modern democratic world wanted to believe that the end of the Cold War did not just end one strategic and ideological conflict but all strategic and ideological conflict.”
Why fight when democratic capitalism was already the victor? The best way to open a closed political regime, well-meaning democrats believed, was through trade. Ideological wars were obsolete. In retrospect, back in 2001, we were relatively blissful, our news dominated in the weeks before the attacks by a missing girl in Washington, D.C., and a New York City publicist who had driven her SUV into a crowd outside a nightclub. There was, in the attacks of 9/11, a rebuke aimed at all Americans. “Who did you think you were,” history asked, “to have decided your world was threat-free?”
As the dream disappeared, some inapposite answers to that question emerged, particularly on the left. Less than two weeks after the attack, the late Susan Sontag wrote in the New Yorker, “Where is the acknowledgment that this was not a ‘cowardly’ attack on ‘civilization’ or ‘liberty’ or ‘humanity’ or ‘the free world’ but an attack on the world’s self-proclaimed superpower, undertaken as a consequence of specific American alliances and actions?” She went on:
“How many citizens are aware of the ongoing American bombing of Iraq? And if the word ‘cowardly’ is to be used, it might be more aptly applied to those who kill from beyond the range of retaliation, high in the sky, than to those willing to die themselves in order to kill others.”
Sontag summed it up for an entire class of people: America was guilty, ignorant, and cowardly. Our good years had been a dream, and a bad one at that. We had gone too long without a reminder of our wickedness. Pax Americana was always a sham. And globalization was simply a new form of imperialism. This was the earliest and most thorough voicing of the America-has-dirty-hands argument that is still the default position of today’s progressives.
Sontag came under withering assault in those days, because Americans in overwhelming numbers were unnerved not only by the attacks themselves but by what those attacks said about the mass murderers who designed them.
The 9/11 attacks were a manifestation of a fascistic strain of Sunni Islam known as Wahhabism. It is accurate to assert that al-Qaeda attacked us because it disagreed with U.S. policies. But as its disagreement sprang from a fascistic, theocratic moral and political framework, that explanation hardly satisfies. Here is one that does: “This war is fundamentally religious…. Under no circumstances should we forget this enmity between us and the infidels, for the enmity is based on creed.” That was Osama bin Laden speaking to Al Jazeera soon after 9/11. Here is another bit of exposition from the man responsible for the attacks: “Every Muslim, from the moment he realizes the distinction in his heart, hates Americans, hates Jews, and hates Christians. This is part of our belief and our religion.”
The war on terror is also more than a fight between a fanatically ascetic strain of Islam and a supposedly corrupt and godless West. The seeds of the threat America faces were sown long ago by the forces of history. Islam is one of the world’s great religions, but it was also a great power, one that has suffered a lengthy and painful decline. “For the past 300 years, since the failure of the second Turkish siege of Vienna in 1683 and the rise of the European colonial empires in Asia and Africa,” wrote the great Middle East scholar Bernard Lewis, “Islam has been on the defensive.” Since then, “there has been a rising tide of rebellion against the West’s paramount standing, and a desire to reassert Muslim values and restore Muslim greatness.” In Bin Laden’s eyes, reclaiming Muslim greatness meant establishing Taliban-style governance throughout every land that had ever been under Muslim rule and bending infidels worldwide to Muslim will.
Again, clarity comes from the source of the attacks. Three weeks after 9/11, Al Jazeera ran a video of Bin Laden declaring, “What America is tasting now is something insignificant compared to what we have tasted for scores of years.” In some sense, Islamist terrorism is the assertion of great-power nationalism in slow motion. It is for this reason that Muslim support for al-Qaeda is not limited to adherents of Wahhabism or similarly austere sects of Islam. According to a landmark Gallup poll conducted in 2005 and 2006, al-Qaeda enjoyed the support of fully 100 million Muslims worldwide.
A necessarily abbreviated timeline brings to light the key humiliations that Islam, as a world power, has suffered in the last century. In 1916, the West divided a once great Ottoman Empire. In 1967, Israel delivered a stunning and unexpected military defeat to Egypt. In 1990, American forces—including women—came to Saudi Arabia to rescue Kuwait (and the Saudis) from Saddam Hussein. These affronts were determinative for men like Bin Laden. And for decades before 9/11, radical Islamists had been trying to advance their interests on the Middle East chessboard.
Israel’s defeat of Egypt in the Six Day War inspired an Islamist awakening in that country that produced, among other notable terrorists, al-Qaeda’s former No. 2, and now No. 1, Dr. Ayman al-Zawahiri. In 1979, the world endured Islamism’s most dynamic year. Extremist Shiites scored a major victory by overthrowing the Western-backed shah of Iran. Radical Islamists seized and temporarily held the mosque in Mecca. And when the Soviets invaded Afghanistan in December, the Hindu Kush immediately became jihad central. In 1981, Islamists assassinated Anwar Sadat, who was viewed by many Muslims as a traitor to the faith for signing a peace treaty with Israel.
The many pre-9/11 Islamist terrorist attacks against the United States and American interests include Hezbollah’s 1983 suicide attack on the Marine barracks in Beirut, which killed 241 American servicemen; Hezbollah’s multiple bombings (beginning that same year) of the American embassy in Beirut; the 1993 World Trade Center bombing; the 1996 bombing of Khobar Towers in Saudi Arabia; the 1998 bombings of the U.S. embassies in Tanzania and Kenya; and the attack on the U.S.S. Cole in a Yemeni port in 2000.
To rehearse the whole list is to lose sight of the swamp for the alligators; by 2001, jihadist terrorism was much more than the sum of its attacks. Before 9/11, successive American administrations had answered terrorist attacks on a case-by-case basis. Particularly egregious acts, such as the 1998 embassy bombings, might inspire a flurry of cruise missiles, but there was no comprehensive strategy to combat radical Islam. Al-Qaeda’s mass-casualty attack on targets in New York City and Washington, D.C., meant the case-by-case approach was no longer an option. It was finally clear that Islamist terror was not just an ideology but a global network, incorporating governments, clerics, financiers, NGOs, and scholars. Al-Qaeda itself had become something of a shadow-nation, with a sophisticated bureaucracy and military training camps that churned out thousands of new jihadists each year.
Terrorism had become a necessary element in the regional political order. For generations illiberal kingdoms and autocratic republics had shackled the Muslim body politic but unleashed the fevered Islamist mind. With legitimate avenues of political redress hopelessly barred, only religious extremism provided an available channel for voicing widespread discontent. Islamist terrorists had become either patricidal offspring, as was the Muslim Brotherhood in relation to autocratic Egypt; cherished sons, as was Hezbollah to Iran; or torn spawn, as were the Taliban to Pakistan and al-Qaeda to Saudi Arabia.
Understanding that the growth of Islamic extremism was the result of a lack of political freedom was the first real intellectual accomplishment of the Bush administration. In a September 20, 2001, address before Congress, Bush proclaimed that “this will be an age of liberty, here and across the world.” That was an ambitious objective for a wounded country in crisis-response mode. But Bush saw the promotion of democracy in the Islamic world as the best hope for inoculating Muslims against terrorist ideologies.
Future jihadists could not, Bush and his people knew, be deterred by the threat of annihilation alone. To the contrary, Bin Laden stretched Koranic teachings on martyrdom during holy war into the understanding that suicide attacks on innocents would yield an eternity in paradise with 72 compliant virgins. Thus, the prospect of death in the service of jihad would not deter, but motivate, a would-be terrorist. For America, this meant that ideas would have to do battle where deadly weapons were useless. Democracy would be offered as an opportunity to effect change in this life, set against the offer of jihad as transport to the next.
On September 20, 2001, Bush addressed Congress and all America tuned in. By the end of the speech, everyone listening knew we had entered a new age. “All of this was brought upon us in a single day,” the president said of 9/11, “and night fell on a different world, a world where freedom itself is under attack.” As for what would come next: “Our war on terror begins with al-Qaeda, but it does not end there. It will not end until every terrorist group of global reach has been defeated.” American intelligence operatives were already en route to Afghanistan.
3. Combating the Threat
Bush unveiled the most consequential element of what would come to be known as the Bush Doctrine during his Oval Office speech on the evening of September 11. “We will make no distinction between the terrorists who committed these acts and those who harbor them,” he said. This critical departure from the status quo reflected a new paradigm. Throughout the Cold War, Western countries had not done much about the extensive and intimate relationships between governments and terrorist groups. Threats were deemed to have come primarily from enemy states. If they employed bands of sympathetic terrorists to do some jobs and their own secret forces to do others, it mattered little in the bigger picture. In the decade after the Soviet Union fell, there was no need to rethink this understanding. The old enemy states had melted away.
The new paradigm reflected the recognition that jihadist organizations could act on their own to advance their own interests. Some, such as Hezbollah, still worked for states, such as Iran. But others, notably al-Qaeda, conducted their own foreign policies entirely. In Afghanistan, the Taliban’s relationship with al-Qaeda resembled that of sympathetic landlord and well-paying tenant. Among the Taliban, opinion was mixed regarding al-Qaeda’s terrorist activities. It wasn’t that the fascistic, retrograde regime had any moral qualms about bringing violent death to nonbelievers. It was that there was little, from the Taliban’s perspective, to be gained from provoking the United States. While Bin Laden had declared war on America, the Taliban were content to rule their repressive fiefdom unmolested.
That difference in perspective vanished entirely at the most critical juncture. Days after 9/11, the Bush administration asked senior Taliban officials, both publicly and through back channels, to hand over Bin Laden or face an American attack. But as Taliban leader Mullah Omar explained in a Voice of America radio interview: “We cannot do that. If we did, it means we are not Muslims, that Islam is finished.”
And so the United States launched a brilliant and innovative campaign to topple the Islamist government that was providing al-Qaeda with a safe haven. According to the then deputy national security advisor, Stephen Hadley, military leaders presented Bush with a standard air-power-based Afghanistan plan. “We’re not going to do it that way,” the president responded. “We need to send a whole new message, that we are serious about this.”
With the cash-bought assistance of Afghanistan’s anti-Taliban Northern Alliance, CIA officers helped place clandestine Special Forces teams on the ground. The Special Forces, in turn, guided American air strikes on the Taliban. The “whole new message” sent was one of unimaginable mastery of modern warfare. Air strikes with laser-guided munitions were so accurate and lethal that the Northern Alliance was convinced the United States possessed an invisible death ray. When that rumor leaked to the Taliban, some surrenders followed. “Instantly you could see the guys bend over,” one U.S. sergeant later said. “They put their guns down, they took their cloaks off, and they started marching in, in single file right up into the middle of our perimeter, because they knew that it was over if that death ray was coming out.”
Before the year was over, the Taliban was deposed, some 5,000 Islamist fighters were killed, 20 jihad training camps were destroyed, surviving al-Qaeda members were wounded and dispersed, and Hamid Karzai’s American-backed interim government had international legitimacy.
Those who, still today, suggest there is real affection between the Taliban and the Afghans they rule would do well to read this New York Times report Dexter Filkins filed from an Afghan town in November 2001:
In the 12 hours since the Taliban soldiers left this town, a joyous mood has spread. The people of Taliqan, who lived for two years under the Taliban’s oppressive Islamic rule, burst onto the streets to toss off the restrictions that had burrowed into the most intimate aspects of their lives. Men tossed their turbans into the gutters. Families dug up their long-hidden television sets. Restaurants blared music. Cigarettes flared, and young men talked of growing their hair long.
Did al-Qaeda expect such an overwhelming initial response from the United States? What, after all, did Bin Laden think he was going to accomplish strategically by killing thousands of innocent Americans? About this, respectable opinions differ.
Michael Scott Doran, writing in the January/February 2002 issue of Foreign Affairs, claimed, “Osama Bin Laden sought—and has received—an international military crackdown, one he wants to exploit for his particular brand of revolution.” According to this strategic interpretation, Bin Laden actually wanted the United States to come into the region, guns blazing:
America, cast as the villain, was supposed to use its military might like a cartoon character trying to kill a fly with a shotgun. The media would see to it that any use of force against the civilian population of Afghanistan was broadcast around the world, and the umma would find it shocking how Americans nonchalantly caused Muslims to suffer and die. The ensuing outrage would open a chasm between state and society in the Middle East, and the governments allied with the West—many of which are repressive, corrupt, and illegitimate—would find themselves adrift.
Bin Laden was not quite so brilliant a strategist. He had a talent for operations designed to spotlight their mastermind as an instrument of Allah’s will. But what strategic effect these operations were meant to produce beyond such beatification—and the scoring of a blunt point against the enemy—was not always clear. In the late 1980s, Bin Laden established his own band of Arab fighters in Afghanistan and led them on preposterously daring missions of little strategic value against the Soviets. His 1990 request of the Saudis—that they let him lead his group of jihadists in a fight to expel Saddam Hussein from Kuwait—was another quixotic plan. Bin Laden had a penchant for suicidal odds, mass casualties, grueling fights, and outsize symbolism. Unlike Muhammad, on whose life he modeled his own, Bin Laden had no genius for military matters or statecraft. The point of 9/11 was the blow itself: spectacular, deadly, and high-profile. An accumulation of such blows, Bin Laden believed, would cause a feeble and frightened America to alter its policies in the Muslim world, or even accept defeat.
The assertion that Bin Laden hoped to bait the United States into overreacting is undone foremost by the many accounts of his own words and thoughts on the matter. In the 1990s, he famously described America as a “paper tiger.” His evidence included the U.S. troop withdrawal from Lebanon after the 1983 Marine barracks bombing, the 1993 “Black Hawk Down” debacle in which U.S. Rangers were overrun in Somalia and the subsequent American retreat, and the largely symbolic cruise missile attacks in revenge for al-Qaeda’s bombing of American embassies in Africa. He was aware, moreover, of the defeatist Vietnam Syndrome that shaped popular American perceptions of war.
Bin Laden had little doubt the United States would avenge 9/11 in the largely ineffective fashion that had characterized American policy. He reportedly laughed off warnings of overwhelming American retaliation. Even as it became clear that the United States was going to respond to 9/11 with something more potent than a barrage of cruise missiles, he gave no credence to the notion of America as a formidable war machine. On October 3, Bin Laden sent a letter to Taliban leader Mullah Omar, in which he wrote, “A U.S. campaign against Afghanistan will cause great long-term economic burdens which will force America to resort to the former Soviet Union’s only option: withdrawal from Afghanistan, disintegration, and contraction.” The irony here is pungent. In the end, the Soviet Union was forced out of Afghanistan because the United States provided arms and training to Afghan fighters.
Osama bin Laden’s colossal strategic misjudgment serves to highlight the Bush administration’s strategic courage. Analysis like Doran’s was compelling, and it was fairly commonplace in those days. Among left-wing pundits, it took on the color of a dire warning: by bringing war to Afghanistan, they said, America would play right into al-Qaeda’s hands. But the diminished state of the Taliban and al-Qaeda at the end of the initial campaign in Afghanistan demonstrated the opposite: Bin Laden was wholly unprepared for the American response he elicited.
For all this, the opening campaign in Afghanistan was marked by two tremendous failures.
First, Osama bin Laden survived the American assault in the mountains of Tora Bora and fled to Pakistan. It is likely that he was struck in battle and suffered injuries to the left side of his body. But escape he did, and there are no points earned for a near miss of such enormous consequence.
What could be learned from the first big American blunder in the war on terror? The consensus of those charged with getting Bin Laden in the snowy mountains of northeast Afghanistan is that the United States did not commit enough resources to finish the job. As Peter Bergen writes in his comprehensive history, The Longest War: “The Pentagon’s reluctance to send more soldiers to Tora Bora arose out of a combination of factors: fear of offending the Afghan warlords in eastern Afghanistan; worries about replicating the Soviet debacle in Afghanistan; concerns about the difficult terrain; and an unwillingness to take casualties.” Sensible concerns, all—especially at such an early stage of the war and after such stunning success elsewhere in Afghanistan. But it is clear that Bin Laden’s escape resulted from an American reluctance to let go of old models and standards. Nor would this be the last time the U.S. effort suffered for want of a heavier military footprint and a more sophisticated understanding of how to integrate troops into foreign populations.
The second blunder of Afghanistan had even more to do with shortsightedness and an inability to break with the past. It wouldn’t be evident for another five years, but the Taliban would make an enormous comeback. The American failure to institute a comprehensive counterinsurgency and nation-building approach allowed the Taliban to exploit the ongoing weaknesses at every layer of the anemic Afghan state. It is well known that when Bush took office, he opposed nation-building. During the Clinton years, humanitarian military action in Bosnia, Somalia, and Haiti was famously branded “a branch of social work” by the foreign policy analyst Michael Mandelbaum. It would take some dispiriting years before Bush recognized that in the war on terror, nation-building was not a matter of quixotic do-goodism, but quite simply the difference between victory and defeat.
4. The War at Home
Those in Washington who found themselves charged with the astonishingly heavy burden of keeping America safe within its own borders would undergo crucibles of their own. As John Yoo, part of the Justice Department’s Office of Legal Counsel during Bush’s first term, has written, “the ‘front’ in the war on terrorism would soon move from the battlefields of Afghanistan to the cells of Gitmo and the federal courtrooms.” And that is more or less where it would stay.
The domestic policies of the war on terror have been so successfully mischaracterized that the mere mention of terms like Guantánamo Bay, rendition, and enhanced interrogation produces a reflexive cringe of patriotic shame. In truth, no American approach to the detention and interrogation of terror suspects would have satisfied the liberal press and the activist left, just as there could never have been a domestic security regime that would have satisfied the embattled libertarian right. Reinhold Niebuhr noted ruefully in 1952 that “we take, and must continue to take, morally hazardous actions to preserve our civilization.” In the United States—a nation of laws founded on individual liberty—such “morally hazardous” actions will always elicit outrage from one party or another; and that is as it should be. The line between free and not free becomes fuzziest in emergencies.
Yet the most authentic verdict on the soundness of our national-security apparatus comes not from the media elite or the anti-TSA mobs, but from the American people as a whole. And to Americans kept safe these 10 years, the propaganda storms on the left and right have proved to be little more than white noise. We know that Americans accept as necessary the measures taken since 9/11 because, in our representative democracy, these measures have enjoyed overwhelming bipartisan support—and we have yet to see a political campaign with mudslinging accusations that a candidate was too committed to the war on terror. It is telling that after years of liberal objections to military tribunals for unlawful enemy combatants, the prospect of trying 9/11 mastermind Khalid Shaikh Mohammed in a New York courtroom inspired a wholesale pushback from New York’s Democratic leaders. Such is the political reality of national security.
The Patriot Act, which allows for the freer flow of information between the CIA and the FBI, was passed on October 25, 2001, by a Senate vote of 98 to 1 and a House vote of 357 to 66. In May 2011, the most recent extension of the Patriot Act passed the Senate 72 to 23 and the House 250 to 153. We no longer feel as threatened, so the level of support is not as high as it was six weeks after the attacks. But it is still, relatively speaking, overwhelming.
Interestingly, the official post-9/11 case against America-the-human-rights-abuser was a flop from the start, and it was the weakness of that case that set activists on a more drastic—and more damaging—course. At a December 2001 congressional hearing, members of the Senate Judiciary Committee questioned Attorney General John Ashcroft on American policy regarding military tribunals and other new initiatives. Democrats such as Patrick Leahy, Dianne Feinstein, Ted Kennedy, Charles Schumer, and Russ Feingold found themselves stunned when their assertions that the United States was on a dangerous course elicited a ferociously effective counterattack. The most trenchant refutation came when Ashcroft produced a recovered al-Qaeda training manual in which detained terrorists are advised to lie about abuses so as to exploit the American system of legal protections. As Jeffrey Toobin described the scene in the New Yorker, the Democrats “looked almost physically diminished by Ashcroft’s performance.”
This had two important results. First, legislative opposition to the domestic war on terror was shut down, which gave both Bush and elected politicians the ability to claim they had gotten tough, and the opposition’s demise opened an enormous market opportunity for legal activists and their media supporters. Second, without facts to marshal against the Bush administration, opponents built a case largely on agitprop theatrics and the cartoonish demonization of administration figures. Note that in Toobin’s version, Democrats were not undone by the simple truth of what Ashcroft produced; rather, they were defeated by his “performance.”
The truth exposed by Ashcroft ended up accelerating the activist assault on American detention policies. That assault focused on the U.S. facility at Guantánamo Bay. American forces in Afghanistan had captured hundreds of men they deemed too great a threat to let go; they posed a unique problem, as they were not citizens of Afghanistan and their own native countries had no interest in taking custody of them. And so it was determined they should be housed as prisoners of war in a unique facility—an American military base on foreign soil (ironically, the soil of an American enemy, Cuba). Once those prisoners had been transferred to Gitmo, they commenced lying about abuses and their cause was taken up at once. The first to jump into the breach was Michael Ratner, a leftist lawyer who had previously defended Omar Abdel Rahman, the blind sheikh behind the 1993 World Trade Center bombing. (The American Civil Liberties Union would follow after Ratner softened the field.) Ratner eventually sprang his original three clients; afterward, one of them refused to take a lie detector test regarding his tales of Gitmo abuse and another admitted that he had been through an al-Qaeda training camp.
There exist many poisonous fruits of anti-Gitmo activism. An intelligence assessment released by the Director of National Intelligence’s office at the end of 2010 found that of the 598 detainees released from Gitmo, one in four was either suspected of or confirmed as “reengaging in terrorist or insurgent activities after transfer.” There was no political benefit to be gained by the issuance of such a finding, which came during the Obama administration—an administration that would have preferred every piece of available evidence it could get its hands on in its quest to close Gitmo.
One former detainee, Said Ali al-Shihri, released in 2007, went through a Saudi Arabian jihad rehabilitation program before becoming al-Qaeda’s deputy leader in Yemen. Abdullah Ghulam Rasoul, a close associate of Taliban leader Mullah Omar, said upon being released, “I want to go back home and join my family and work in my land.” Instead, he became a high-ranking Taliban soldier. Ibrahim Shafir Sen, a Turkish prisoner at Gitmo, told an interviewer after being released, “Ninety percent of the soldiers at Guantánamo wore skullcaps. They all had Jewish names. There were also 15 rabbis in Guantánamo that we counted. At least one rabbi was present during interrogations.” He was subsequently arrested in Turkey for being the leader of an al-Qaeda cell.
But as Gitmo was not being attacked on the basis of fact, it could scarcely be defended with factual evidence. Any wild tale told by a detainee, like the absurd stories of Koran desecration published by Newsweek and later retracted, gained instant credibility. When the verifiable—and verifiably punished—prisoner abuse at Iraq’s Abu Ghraib facility was revealed in 2004, the anti-Gitmo crusade got a massive but wholly immaterial boost. The real case of a few rogue sadists working as prison guards in Iraq was taken as evidence of official Bush policy at Guantánamo Bay; the false association was so pervasive as to constitute a mass delusion.
And so, by the time of the 2008 presidential election, not one, but both leading candidates had promised to shutter Gitmo. Indeed, Barack Obama made it his first order of business as president to sign an executive order closing the Guantánamo Bay facility. That this has proved impossible, despite the prodigious heaping of moral instruction with which he announced it, is perhaps the single best defense of Gitmo we have yet seen.
Part of the charge against military tribunals for unlawful combatants asserts that al-Qaeda detainees are due the full rights articulated in Common Article 3 of the Geneva Conventions. This is a matter of willing reality to be something other than it is. Al-Qaeda members fight under no flag, wear no country’s uniform, and are not themselves signatories to the Geneva Conventions. What’s more, in 1977 there were two updates to the Geneva Conventions that added protections to non-state organizations during warfare—updates the United States refused to ratify expressly to deny terrorists the same securities granted those fighting for their countries.
Perhaps the greatest victory for the anti-tribunal movement was the Supreme Court’s 2008 Boumediene v. Bush decision, which essentially granted unlawful enemy combatants the protections of habeas corpus. This is a lesson in the value of sheer tenacity. For Boumediene v. Bush effectively undid the Court’s own 2006 Hamdan v. Rumsfeld decision, which said the military tribunals judging the Gitmo prisoners would be legal once the executive branch came to an agreement with Congress concerning their enactment. That agreement was reached. Then the Court overruled itself. Even now, though, the matter of actual habeas corpus trials for detainees remains in a state of suspended animation—because there are few reasonable courses of action other than the one the Bush administration pursued.
Attempts to normalize the domestic aspect of the war on terror and turn it into a criminal matter to be handled by the conventional court system have mostly backfired. On November 17, 2010, a New York jury acquitted al-Qaeda agent Ahmed Khalfan Ghailani of the murder of 224 people in the 1998 embassy bombings in Kenya and Tanzania. He was convicted—just barely—on one count of conspiracy. The Obama administration’s push to make Khalid Shaikh Mohammed stand trial in a Manhattan federal courtroom was greeted with such absolute outrage that it was shelved completely.
Attorney General Eric Holder’s mission to discredit the Bush administration’s domestic efforts in the war on terror has been lackluster at best and embarrassing for him at worst. He tasked the Justice Department’s Office of Professional Responsibility (OPR) with investigating John Yoo and his fellow official Jay Bybee for writing supposedly unethical memos regarding enhanced interrogations. A review of the office’s efforts by Associate Deputy Attorney General David Margolis vindicated Yoo and Bybee and condemned the OPR for sloppiness. In 2009, Holder appointed prosecutor John Durham to investigate the possible mistreatment of 100 detainees at the hands of the CIA. In July 2011, the Justice Department announced that it would continue with the investigation in only two cases in which prisoners died in custody. Looking into the other 98, it determined, “is not warranted.”
Moving beyond the destructive political theater, an extraordinary set of facts emerges that puts the entire controversy into perspective. The facts are these:
The United States has foiled more than two dozen al-Qaeda plots. American officials have used enhanced interrogation techniques on a total of 28 detainees. One of these was Khalid Shaikh Mohammed, who went from stonewalling interrogators to revealing lifesaving intelligence. That intelligence also led to the capture of more than a dozen terrorists, including Iyman Faris, an al-Qaeda soldier inside the United States, and the single-named Hambali, who perpetrated the 2002 bombing of two nightclubs in Bali. More than this, the enhanced interrogation of Khalid Shaikh Mohammed and Abu Faraj al-Libi revealed the identity of Osama bin Laden’s courier. Surveilling that courier brought U.S. Navy SEALs to Bin Laden’s compound in Pakistan.
If the despised National Security Agency wiretapping program—whose existence was exposed by the New York Times in the most reckless and irresponsible act of investigative reporting of our time—had been in effect before 9/11, we might have learned of those attacks before they were launched. And if something like the Patriot Act had been in place to allow for the flow of information between the CIA and the FBI, we might have been able to act fast enough to stop them.
As a matter of intellectual housekeeping, it is worth noting the following: For all the high dudgeon about rendering terrorists to other nations, that was a policy first, and repeatedly, implemented by the Clinton administration. For all the outrage about waterboarding and the supposed moral stain it placed on the country, Bush ended the interrogation technique in 2003 after it had been used on three—three—suspects. And despite the way Democrats used civil-libertarian outrage over NSA wiretapping, the Patriot Act, and Guantánamo Bay to partisan advantage in the blowout elections of 2006 and 2008, all three policies are still used by the Obama administration.
Incensed civil libertarians on the right, for their part, also fail to acknowledge some extraordinary facts. The TSA pat-downs, no-fly lists, travel restrictions, and legislation aimed at stopping would-be terrorist attacks have in fact worked. Strict airport rules preventing passengers from flying with various items have stopped terrorists from using those items and forced them to resort to less reliable methods. Umar Farouk Abdulmutallab failed to ignite his “underwear bomb” in part because he was using PETN, a tricky plastic explosive not previously used for such operations.
Provisions in the Patriot Act do allow for the tracking of certain chemical purchases, such as hydrogen peroxide and acetone. This has led some civil libertarians to scoff at the ineffective “criminalization of beauty products.” But, in fact, U.S. authorities successfully tracked the hair-bleach purchases of Najibullah Zazi, the al-Qaeda recruit nabbed before he could pull off a spectacular bombing attack on New York City subways. A politically healthy discussion about the domestic war on terror should have been a fact of public life for the past decade. The slew of new policies enacted after 9/11 raised important issues. It is possible to hold the following three propositions in our minds at the same time:
1. America’s post-9/11 national-security architecture is effective.
2. As Americans, we are right to be concerned by the larger and more invasive intelligence superstructure that these policies entail.
3. This larger, more invasive superstructure does not automatically mean the abandonment of our values or the forfeiting of our way of life.
That no widespread discussion along these or similarly sober lines has ever occurred is the 9/11 legacy that belongs exclusively to left-wing activists and civil-liberties fetishists. That no second al-Qaeda attack has hit the United States is the legacy of an administration that took its responsibilities more seriously than its opponents did theirs.
5. The Centrality of Iraq
I began this article with a daring proposition: the prosecution of the war in Iraq was central to the efforts against al-Qaeda. The common riposte—Iraq had nothing to do with al-Qaeda’s strike on the United States on 9/11—is all well and good as a debating point over a dinner table, but it has nothing to do with the deeper argument.
Iraq was always more fundamental in al-Qaeda’s thinking than was widely understood. Telling evidence of this is found in Osama bin Laden’s 1998 declaration of war against the West, translated by Bernard Lewis in the pages of Foreign Affairs. The “Declaration of the World Islamic Front for Jihad Against the Jews and the Crusaders” listed three main offenses against the Islamic world for which the perpetrators would be made to pay. America’s Iraq policy constituted the core of the first two charges:
First—For more than seven years the United States is occupying the lands of Islam in the holiest of its territories, Arabia, plundering its riches, overwhelming its rulers, humiliating its people, threatening its neighbors, and using its bases in the peninsula as a spearhead to fight against the neighboring Islamic peoples…. Second—Despite the immense destruction inflicted on the Iraqi people at the hands of the Crusader-Jewish alliance and in spite of the appalling number of dead, exceeding a million, the Americans nevertheless, in spite of all this, are trying once more to repeat this dreadful slaughter.
The “occupation” was the basing of American troops in Saudi Arabia, which was undertaken with the Saudi government’s express permission. The “fight against the neighboring Islamic peoples” refers to the first Gulf War, in which the United States pushed Saddam’s regime out of Kuwait. That effort came with the blessing not only of the Saudis but also of the Kuwaiti government, and even with the sanction of a few Islamic jurists in the region.
Arabia, or what we now call Saudi Arabia, and Iraq are the two most important lands in Islamic history. They are, respectively, the first and second epicenters of early Islam’s most productive periods. Mohammed’s connections to Mecca and Medina (in Arabia) are well known, but, as Lewis notes, Iraq was “the seat of the caliphate for half a millennium.” It is no wonder, then, that Bin Laden listed his grievance involving Israel last, as a kind of add-on: “Third—While the purposes of the Americans in these wars are religious and economic, they also serve the petty state of the Jews, to divert attention from their occupation of Jerusalem and their killing of Muslims in it.” Note that even this complaint cites the conflict of the first two and notes America’s “eagerness to destroy Iraq, the strongest of the neighboring Arab states.”
During the most trying times of the second Iraq war, detractors who were not part of the progressive antiwar alliance expressed longing for the foreign-policy prudence and wisdom—the calculating realism—of the first President Bush. He understood, they argued, that the United States would suffer militarily if it exceeded its original mandate to push Saddam out of Kuwait and attempted to unseat him. In other words, Bush the elder knew enough to spare America the hell that Bush the son would invite by going into Iraq. But in reading Bin Laden’s declaration of war on America, it becomes clear that the first President Bush was not realist enough. It was precisely in these realist policies that Bin Laden found an expedient for jihad.
There is not and never was a traditional realist detour around the American showdown with Islamist terrorism. The failure of realism to grasp the nature of the new threat to America was central to Bush the son’s revision of his father’s foreign policy. Before an audience of West Point graduates in June 2002, George W. Bush gave voice to the doctrine of preemption: “If we wait for threats to fully materialize, we have waited too long.” The weapons of mass destruction the United States mistakenly believed Saddam to possess justified a preemptive attack, and his regime’s repulsive record and continued illegal conduct violating the terms that allowed him to remain in power constituted a long overdue case for war on grounds of international law.
The bill for our realism had finally come due, and we were to pay heavily, although it didn’t seem so at first. Operation Iraqi Freedom began on March 20, 2003. Coalition troops took Baghdad in a dazzling three weeks. But in May, the Coalition Provisional Authority set up to administer post-Saddam Iraq made two disastrous policy decisions that would give an opening to the forces of anarchy. Paul Bremer, the American administrator, issued an order removing about 30,000 senior Baath Party members from their posts. Then, under the same de-Baathification program, he issued another order putting a half million Iraqi soldiers and intelligence professionals out of work. This new army of unemployed leaders and fighters came mostly from Iraq’s ruling Sunni minority. They were well aware that the sectarian power balance was turning against them and in favor of the country’s long-suffering Shia majority. The Iraqi insurgency took root at once, gaining strength particularly around Baghdad.
Meanwhile, the forces of jihad were focusing their efforts on ensuring that the American expedition in Iraq came to disaster. A Jordanian named Abu Musab al-Zarqawi entered northern Iraq with a small group of fighters in 2002. In 2003, his group bombed the UN headquarters in Baghdad, which sent the UN packing, along with the World Bank and the International Monetary Fund. Zarqawi would soon be named leader of al-Qaeda in Mesopotamia.
He had a bottomless talent for the unthinkable. Zarqawi was a snuff auteur. He regularly recorded his beheadings, bombings, and IED attacks on video for immediate viral distribution on the Internet. Zarqawi’s depraved campaign would forever taint the American effort, no matter what reversals came to pass. Before long, the Iraqi insurgency and the Iraqi jihad fused. It was an extraordinary object lesson in the political and cultural maladies of the Middle East. It was also an unprecedented nightmare, as Americans were fighting a well-connected, suicidal, ex-Baathist-jihadist hybrid enemy.
Zarqawi’s next stroke of evil cunning was to incite a Shia-Sunni civil war. His organization set about bombing Shiite targets in order to elicit retaliations on Sunnis. Then Sunnis, so the plan went, would embrace al-Qaeda as their protectors. It worked. By 2006, approximately 3,000 civilians were being killed each month. That year, the CIA estimated that some 1,300 foreign al-Qaeda members had come to fight in Iraq.
Few U.S. policymakers were willing to admit the extent of the chaos, and even fewer had any idea what to do about it. On the Shia side, death squads proved to be as bloodthirsty as their Sunni counterparts. The Iranians backed the demagogic and murderous Moktada al-Sadr; his Jaish al-Mahdi army attacked coalition forces in devastating battles in Najaf, and Sadr himself became a charismatic political hero among Shiites.
Iraq would not be rescued from the brink when its rescue finally came after three years of horrific war; the brink had already become a distant memory by then. The first break came when Sunni sheiks in Anbar decided they’d had enough of the al-Qaeda killing. In 2006, their anti-jihad “awakening” began. This made the Sunnis and the coalition forces natural allies. Their cooperation proved invaluable, and Zarqawi was soon tracked down and killed. The American-backed Sunni army that became the Sons of Iraq would boast of 100,000 members within three years.
Also, in 2006, following a model that had worked in a town in northern Iraq a year earlier, the United States began to station troops in small outposts in the dangerous area of Ramadi, so that they could more effectively protect the local population. Until then, American forces spent most of their time in enormous fortified forward operating bases, where they scarcely understood the day-to-day concerns of locals. The move into the population not only enabled better protection, but it yielded better intelligence on al-Qaeda once trust was established between troops and locals. The partnerships in Ramadi, and others like them, proved uniquely effective. Once neighborhoods were made safer, the “build” phase of a “clear, hold, and build” strategy would proceed, paving the way for progress on things like schools and utility-service reconstruction. This was, by definition, the nation-building that the Bush administration had hitherto eschewed.
American debate about a surge in troops and a wider application of the new strategy began in earnest. A combination of think-tank scholars, military commanders, and some Pentagon officials started to flesh out various versions of a troop-increase plan and a counterinsurgency strategy to go with it. At the end of 2006, the Army and Marine Corps Counterinsurgency Field Manual was published. Partially the work of General David Petraeus, it laid out the counterinsurgency doctrine that would change Iraq. It was focused on the population and encouraged amnesty for former enemies interested in cooperating. It was, finally, a way to handle the asymmetrical challenge of the fight.
Counterinsurgency is daring, complex, and often counterintuitive. To a war-weary country, it was a near-impossible sell. The mainstream media and the Democratic establishment were opposed. Around the same time that the field manual was released, the congressionally mandated bipartisan Iraq Study Group, headed by James Baker and Lee Hamilton, came out with its own plan. It was a blueprint for timed withdrawal. The idea was to increase U.S. troop levels temporarily in order to “stand up” Iraqi forces to the best of our ability and then get out of the war within two years. President Bush disagreed. In January 2007, he announced a surge of 20,000 soldiers (the number would eventually reach 30,000) and made Petraeus ground commander in Iraq.
Polls showed that most of the country thought the surge was a bad idea. In hindsight, the most important opposing voice was this one: “I am not persuaded that the 20,000 additional troops in Iraq is going to solve the sectarian violence there,” said then Senator Barack Obama. “In fact, it will do the reverse. I think it takes the pressure off the Iraqis to make the sort of political accommodations that every observer believes is the ultimate solution to the problems we face there. So I’m going to actively oppose the president’s proposal.”
The results of the proposal rejected so soundly by Obama and his ilk were nothing less than astonishing. Al-Qaeda was thrown on the defensive, as better intelligence led to increased Special Forces raids. Thousands of terrorist operatives were killed or captured. And because of the foreign al-Qaeda presence in Iraq, intelligence gathered there proved crucial to the war on terror beyond the country’s borders. In Iraq, “the numbers of Iraqi civilians dying in sectarian violence began a sharp decline,” wrote Peter Bergen, “from a high of around 90 every day in December 2006 to single digits two years later.” As al-Qaeda in Iraq was crushed, the appeal of Shia death squads waned. This allowed Iraq’s prime minister, Nouri al-Maliki, a Shiite, to degrade Sadr, ordering Iraqi troops to take him on in a crucial fight in Basra. Over time, diminished violence on both sides allowed for a degree of political reconciliation previously unthinkable.
The two most important American achievements in Iraq cannot be measured in captured terrorists or Iraqi policemen. The first is that the United States came face-to-face with the very worst al-Qaeda could muster and, against all odds, prevailed. Not only did America prove not to be the paper tiger its enemies had counted on, but those enemies were thoroughly discredited among their potential supporters. The war that was supposed to break America’s back broke al-Qaeda’s instead.
The second great achievement is the establishment of the first Arab Muslim democracy. The importance of this cannot be overstated. Although Iraq’s people had gone to the polls three times in 2005, waving purple fingers proudly in a display that supported the notion they wanted normal lives as free citizens, Iraq’s first unquestionably successful election took place only in January 2009. A mere five months later, a rigged presidential election in neighboring Iran sparked a wave of democratic passion that country had not seen since the establishment of the Islamic Republic. Just as Iran helped sow the seeds of discord inside Iraq (and continues to try to do so), so did the successful democratic expression of popular will inside Iraq sow the seeds of expectation in its neighbor, with bloody consequences for the Persian people—and with reverberations that hit Tunisia 18 months later.
Meanwhile, as the noble call for representative government continues to be heard by Muslims around the region, let us not forget that the one existing democratic country among them is the successful American project in Mesopotamia.
6. The Second Decade
That the larger and potentially more enduring and revolutionary impact of our Iraq effort is only now changing the region makes it plain that we are nowhere near done with the war on terror. It will require a committed American effort to keep the countries of the Arab Spring on a democratic course. The last 10 years have seen an intellectual battle royal in the Middle East, with democracy slugging it out against secular tyranny and brutal theocracy. The Arab Spring is an opportunity to deliver knockout blows to the last two. But if we don’t take advantage of this pregnant moment, the region’s poisonous ideological parties will surely regain their footing.
In Afghanistan, continued American commitment is even more desperately needed. The dangers now manifest there due to our previous failures could set the stage for another 9/11. There is no greater threat to the United States than a reconstituted Taliban presiding over Afghanistan. They will once again harbor al-Qaeda because the two groups share a deep and abiding religious bond. Perhaps Americans tune out at the mention of the abstruse-sounding Haqqani Taliban network. But they might not if they understood that its leader, Jalaluddin Haqqani, fought like a brother alongside his dear friend Osama bin Laden in Afghanistan in the 1980s. A premature American withdrawal is scheduled to leave Haqqani in place next year.
No terrorist plot of global significance over the past 10 years is without its ties to al-Qaeda training camps in Afghanistan. It is only with the kind of protection and freedom of movement that the Taliban offers that al-Qaeda can plan and carry out world-altering attacks. After years of American neglect, the Afghan surge was implemented and has just started working to clear the Taliban from neighborhoods in the South. With next year’s scheduled drawdown of all 30,000 surge troops, we will be unable to do the same in the crucial Haqqani-controlled east. If the Taliban could be defeated with drone strikes alone, we’d already have begun a full drawdown in order to let the machines win for us. But it won’t work now, and it won’t work in a year.
The question to ask today is not whether we believe in closure. What matters is whether our enemy is as ready to call it a day as we are. “This clearly is a defeat for the U.S. in Afghanistan, and the start of the return of the Taliban, [its leader] Mullah Omar, and an Islamic sharia state,” said one senior Taliban fighter in response to Obama’s drawdown announcement. “We can’t believe that in the short time of 10 years, the Taliban are forcing the superpower of the century to pull out its troops.”
To a holy army avenging a centuries-old wrong, 10 years is a short time. To a superpower interrupted in the comfort of its unipolar moment, the same 10 years has been an endless, fraught, and painful decade. Indeed there is today a sense among some Americans that the fighting of the last decade was, finally, unnecessary—that it somehow could have been avoided, and that our winding down now will bring an overdue peace. If that delusion prevails, we will have circled fully back to our pre-9/11 state of vulnerability.
But we’ve made more valuable use of these years than our enemy has. As a fighting nation, we have learned precious lessons. In Afghanistan and Iraq we have gained the essential skills for counterinsurgency and nation-building. We have witnessed the power of democracy to transform populations long suspected of being immune to the beauty of consensual governance. At home, we have learned that our own democratic republic can enjoy its unparalleled freedoms and still remain safe from attack. We know that no grotesque ideology, no matter how ruthlessly defended, is a match for American power inspired by American ideals.
President Obama is wrong. For to confirm these truths at such great cost is “to nation-build at home.” It is to make a stronger country, a safer country, and one that need not succumb to the deadly temptations of an illusory peace.
Last year, we asked experts to examine Candidate Trump’s policy proposals. This year, we’ve asked them to examine how he has executed these proposals in office.
On Trade By Scott Lincicome
Last year, economic, legal, and geopolitical calamity lurked in the shadows of almost every trade-policy promise made by presidential candidate Donald Trump. Eight months into the Trump presidency, those problems have—thankfully—not yet materialized. Instead, Trump trade policy has been a mixture of bluster, disappointment, relief, and uncertainty. This last category warrants close attention: In the coming months, Trump’s dangerous trade ambitions could remain in check, thus keeping a global trade system alive. Or politics, legal ambiguity, and Trump’s own emotional impulses could deal that system a fatal blow.
There is no doubt that President Trump has already done serious damage to the United States’ longstanding position as a world leader on trade policy, the American political consensus in favor of trade liberalization, and Republican views of trade and globalization. His constant vituperation has offended U.S. allies and trading partners, causing them to turn to Europe, Asia, or Latin America in search of alternatives to the once-welcoming and predictable U.S. market. He has accelerated (not started) the American retreat from the World Trade Organization, further wounding a multilateral trading system that was a U.S. invention—an invention that has, contrary to popular belief, served U.S. economic and foreign-policy interests well since the 1940s.
Trump’s day-one withdrawal from the Trans-Pacific Partnership—the flawed-yet-deserving Asia-Pacific trade agreement started by President Bush and ultimately signed by President Obama—has left vacuums in both Asia-Pacific trade and international economic law. TPP was far from perfect, but it was widely supported by U.S. trade and foreign-policy experts because of its economic and geopolitical benefits. The deal contained important new rules for 21st-century issues such as e-commerce, GMOs, and state-owned enterprises. Moreover, it would have provided small but significant benefits for U.S. workers and the economy, while cementing the United States’s influence in a region increasingly covered by China’s shadow. Now, TPP parties are working to complete a “TPP-11” deal that excludes the United States, while China is negotiating its own version of the TPP—the Regional Comprehensive Economic Partnership. And many of TPP’s novel provisions are being relitigated in contentious NAFTA renegotiations with Canada and Mexico (both TPP parties).
All of this is disappointing, but it’s probably survivable and hardly the fire and brimstone of the Trump campaign trail (hence, the relief). Trump has repeatedly threatened tariffs and other forms of dangerous unilateral protectionism, but economic, legal, and political realities have intervened. For example, when Trump promised new “national security” tariffs on steel and aluminum under Section 232 of the Trade Expansion Act of 1962, the opposition from Congress, business groups, strategic allies, NGOs, and even members of Trump’s administration was unrelenting. As a result, planned tariffs have quietly been shelved (for now). Other presidential threats have similarly come and gone without major action, giving market participants some heartburn but little long-term pain. Only in the opaque area of trade remedies—antidumping, countervailing duty, and safeguard measures—has there been a marked uptick in U.S. protectionism. But this is the result of long and technical administrative proceedings initiated by U.S. industries or unions that formally petitioned the government under relevant domestic law—hardly the wave-of-the-hand actions that Trump promised.
Some measure of relief is warranted, but we’re not out of the woods just yet. Indeed, in the last eight months, Trump has publicly threatened to
- block steel and aluminum imports for national-security reasons or bring new cases against semiconductors and ships, under the aforementioned Section 232;
- withdraw from the North American Free Trade Agreement and the U.S.-Korea FTA;
- slap tariffs on Chinese imports under Section 301 of the Trade Act of 1974 because of alleged Chinese intellectual-property-rights violations; and
- impose onerous new “Buy American” requirements on U.S. pipelines and government-funded infrastructure projects.
And those are just the public threats. Behind closed doors, Trump has reportedly considered enacting sweeping import restrictions under the International Emergency Economic Powers Act. The president reportedly yelled, “I want tariffs. Bring me some tariffs!” when told by his “globalist” advisers that legal and economic realities prevent him from imposing broad-based protectionism on a whim.
None of the threats on Trump’s wish list is officially off the table, and any one of them would have serious economic consequences: Steel tariffs alone would put more than 1.3 million American jobs at risk; NAFTA withdrawal could destroy 250,000 more; and several nations have promised immediate retaliation against American goods, services, or investment in response to Trumpian protectionism. Trump’s actions would also raise major legal issues. For example, the World Trade Organization’s broad, subjective “national security” exception wasn’t intended to be used as a get-out-of-jail-free card for steel tariffs, and a dispute over a member’s right to invoke it could imperil the multilateral trading system. Meanwhile, Trump’s withdrawal from a free-trade agreement without congressional consent would raise major constitutional questions as to whether the president had that authority and what would happen to the myriad U.S. tariffs and other commitments that were embedded in legislation and passed into law. Lawsuits over these and other issues surrounding presidential trade powers would throw billions of dollars of cross-border trade and investments into legal limbo.
The president’s unpredictability, political weakness, and clear affinity for protectionism, combined with ample (though ambiguous) legal authority to act unilaterally, mean that any one of his trade threats could still materialize in the coming months. The White House’s internationalists may have won the early battles, but the war will rage for as long as Trump is president. Continued vigilance and advocacy for the benefits of freer trade remain critical.
And congressional legislation clarifying and limiting the president’s trade powers might not be a bad idea either…just in case.
Scott Lincicome is an international trade attorney, adjunct scholar at the Cato Institute, and visiting lecturer at Duke University Law School. The views expressed herein are his own and do not necessarily reflect those of his employer.
On Taxes By James Pethokoukis
At some point in his first term, President Donald Trump will likely sign legislation that cuts taxes by some amount for somebody. This modest prediction is based less on reading the political tea leaves than on understanding conservative politics. If any issue made the modern Republican Party, it was tax cuts. Not surprising, then, that candidate Trump promised big cuts for individuals and businesses. And with the GOP now holding the White House and Congress, failure to deliver is almost unimaginable.
Of course it’s almost equally unimaginable that the Trump tax cuts will at all resemble the ambitious plans devised by Trump advisers during the campaign. There were two of those blueprints. The first, rolled out in September 2015, proposed lowering the top personal rate to 25 percent from the current 39.6 percent, and cutting the corporate rate to 15 percent from the current 35 percent. Along with other changes, including eliminating the alternative minimum tax and estate tax, this initial plan might have lowered government revenue by a whopping $1 trillion a year or more (even if one assumes much faster economic growth).
This was, in other words, more a fantasy proposal cooked up by Reagan-era supply-siders than a serious effort to reform the tax code without worsening our historically high federal debt. Indeed, Trump’s sole purpose in signing on to the plan may have been to win over that very same group, still influential among base voters. Trump himself talked little about the plan while on the hustings, especially compared with immigration, trade, and The Wall.
The Trump campaign’s second bite at the apple a year later was a scaled-back plan, but still a colossal one. Instead of losing a trillion bucks a year, maybe the government would be out just a half trillion or so. Again, since the plan was unaccompanied by spending cuts elsewhere in the budget, it was more a set of glorified campaign talking points than a serious proposal. And like the first, Trump didn’t talk much about it.
So after Trump’s shock election, there really was no realistic Trump tax plan. No worries, however, since there was a House Republican tax plan all ready to go, with an enthusiastic House Speaker Paul Ryan ready to push it hard through the lower chamber. It was an ambitious proposal but one within reality, especially with a bit of fiscal tweaking. That plan called for, among other things, lowering the top personal rate to 33 percent and the corporate rate to 20 percent, immediately expensing new capital investment, and expanding the child tax credit.
And more so than the Trump campaign plans, the House plan intended to reform the tax code, not just cut taxes. For example, it eliminated all personal itemized deductions other than mortgage interest and charitable contributions. The House plan also made a stronger attempt to pay for the tax cuts through a border-adjustment tax and limiting business-interest deductibility. All in all, the plan cost a couple of trillion dollars over a decade, not assuming economic feedback. On a dynamic basis, according to Tax Foundation modeling, the House plan would reduce 10-year revenues by just under $200 billion.
So if Republicans really wanted to make their plan revenue neutral, it was certainly doable through relatively minor changes, such as less dramatic corporate or personal rate cuts. Yet the plan would still be a massive improvement over the status quo, both in terms of encouraging more domestic investment and providing middle-class tax relief.
With a detailed plan at the ready and Republicans running Washington, it is easy to understand why many in the GOP thought it reasonable to predict that Trump would be signing a mega tax bill by August of this year, just as Ronald Reagan did in the first year of his first term. Reagan did it from his ranch in Santa Barbara, California. Maybe Trump would repeat the feat from his Trump Tower penthouse in Manhattan.
But that did not happen. Then again, very little of Trump’s ambitious domestic agenda has happened as planned. Repeal and replace was promised by Easter, leaving plenty of time to hash out the fine details of tax reform and move legislation through the House and Senate. But the GOP health reform was a long slog consuming valuable time, attention, and political capital. Also deserving blame was Trump’s inability to focus on pushing policy priorities rather than pounding political opponents on Twitter. As of now, it seems highly unlikely that significant tax reform will occur in 2017. And 2018 looks challenging as well.
Yes, Trump has provided more distraction than leadership on this issue. And trying to pass major legislation in a midterm year only adds to the political difficulties. But the biggest problem is that there is no tax-reform plan for Republicans to push.
What happened to the ready-to-serve House plan? It suffered from not being a fantasy. It acknowledged both political and policy constraints, something the populist president almost never does. For instance: the House plan tried to pay for the tax cuts—a political necessity to placate debt-hawk Republicans. That requires making somebody somewhere unhappy. Ryan knew that without such an effort, it would be extraordinarily difficult to reduce the corporate tax rate to anywhere close to 20 percent. But while exporters supported the border tax, importers hated it, complaining that it would raise costs. Nor was the Trump White House happy about axing business-interest deductibility.
Still, as problematic as those pay-fors were, the alternatives—limiting tax breaks for mortgages, 401(k)s, and state and local taxes—are equally problematic, if not more so. The state and local tax deduction is a case in point. Pushed hard by Republican leaders as the primary revenue generator to replace border adjustment, it seems unlikely to survive criticism from blue-state Republicans. Eventual legislation is likely to be a far smaller and less comprehensive bill than first envisioned—more cut than reform—with some temporary parts designed to satisfy congressional budget rules. Indeed, Senate budget writers cleared room for just a $1.5 trillion tax cut, and even that might be overly ambitious. Expect Trump and his people to call whatever passes a “down payment” on true tax reform. Pro-growth conservatives should call it a missed opportunity.
James Pethokoukis is the DeWitt Wallace Fellow at the American Enterprise Institute. He is also an official CNBC contributor.
On ‘The Wall’ By Linda Chavez
“We’re going to build a wall. That wall will go up so fast, your head will spin.” Donald Trump made this promise on August 23, 2016, repeated it throughout his presidential campaign, and has reiterated it in tweets and at press conferences and rallies ever since. But the only spinning going on lately has been the president’s own efforts to assure his base that he will eventually build a wall, or a fence, or some barrier along the U.S. border with Mexico, except maybe for those areas that don’t need one or already have one. Oh, and someone will pay for it—preferably Mexico, as he promised—but if not, Congress, unless Democrats or even Republicans refuse to go along. A year after winning the presidency, Trump’s most ubiquitous pledge, The Great Wall separating the U.S. from Mexico, remains largely a figment of his imagination and evidence of his supporters’ gullibility.
No issue defined Trump’s campaign more viscerally than immigration, and on none was his position less ambiguous. Trump’s presidential record on immigration enforcement and policy, however, is decidedly more mixed. He continues to promise that construction of the wall is going to start soon: “Way ahead of schedule. Way ahead of schedule. Way, way, way ahead of schedule,” he said in February. But the cost, with estimates as high as $70 billion, and the sheer impracticality of erecting a solid barrier along 1,900 miles make little sense in light of recent trends in illegal immigration. Illegal immigration is at historically low levels today (roughly the same, in absolute numbers, as it was in the early 1970s) and has been falling more or less consistently since the peak in 2000, mostly because fewer people are crossing the border from Mexico. Apprehensions of Mexicans are at a 50-year low, as are all apprehensions along the southern border. Year-to-date in 2017, apprehensions at the Mexican border have dropped 24 percent compared with those in 2016, when a slight uptick occurred as more people tried to cross in advance of a feared Trump victory and border crackdown. The population of undocumented immigrants living in the U.S. is down as well and now stands at roughly 11 million, from a peak of 12.2 million in 2007; and two-thirds of these unauthorized immigrants have lived here a decade or longer. More Mexicans—whom Trump described as “bringing drugs…crime. They’re rapists”—are now leaving the U.S. than arriving. In 2013, for the first time since the 1960s, Mexico fell from its place as the top source of immigrants to the U.S., dropping behind both China and India.
Trump’s pledge to build a wall, of course, wasn’t his only promise on immigration, but he hasn’t lived up to his own hype in other areas either, which is a good thing. He said he’d end on day one the Obama administration’s Deferred Action for Childhood Arrivals (DACA), a program that provided temporary protection from removal for young people who arrived here illegally before age 16. Instead, Trump waited until September 5 to send his beleaguered Attorney General Jeff Sessions out to announce that DACA would end in six months unless Congress acted. Trump then almost immediately backtracked in a series of tweets and offhand statements. Polls show that large majorities of Americans, including some two-thirds of Trump voters, have no interest in deporting so-called Dreamers, half of whom came before they were seven years old and 90 percent of whom are employed and paying taxes. Trump’s own misgivings and the backlash over the policy’s announcement led him into a tentative deal with Democratic leaders Representative Nancy Pelosi and Senator Chuck Schumer in September to support legislation granting legal status for Dreamers who complete school, get jobs, or join the military. Trump’s most nativist supporters have already dubbed him “Amnesty Don” for even suggesting that Dreamers should be allowed to remain and gain temporary legal status, much less earn a path toward citizenship. But whether such legislation will make it through Congress is still uncertain. Similar bills have repeatedly passed one chamber and died in the other over the past 10 years, but the potential threat that the administration might begin deporting many of the 800,000 young adults who signed up for DACA should concentrate the minds of the Republican leadership to allow legislation to move forward. 
One of the complications in the House is the “Hastert Rule,” named after former Speaker Dennis Hastert, an informal agreement that bars the speaker from bringing a bill to the floor unless a majority of the majority party supports it.
To be sure, Trump’s rhetoric and his appointment of hard-line immigration restrictionists to posts in his administration have led to fear among immigrants, as have the administration’s erratic, irrational enforcement policies. Previous administrations, including Barack Obama’s, gave priority to detaining and deporting aliens convicted of serious crimes, but in one of his first executive orders and Department of Homeland Security memoranda, Trump broadened the priorities for detention and removal to include anyone even suspected of committing a crime, with or without charges or conviction. As a result, arrests for immigration offenses have increased under Trump and have swept up hundreds of individuals who pose no threat to safety or security, some picked up outside their children’s schools or when seeking court orders against domestic abuse. Actual deportations, on the other hand, are down slightly in Trump’s first eight months compared with the same period in Obama’s last year. This is largely because the overloaded system isn’t equipped for mass deportation. Trump promised to rid the country of a greatly exaggerated 2 million criminal aliens and “a vast number of additional criminal illegal immigrants who have fled or evaded justice.” But his boasting that “their days on the run will soon be over” has always been aimed less at promoting sensible immigration policy than at stoking nativist anger in pursuit of his own brand of identity politics. Trump’s America will be a less welcoming place for immigrants—legal as well as illegal—if Trump gets his way on proposed legislation to reduce legal immigration by half over the next decade. But labor shortages and an aging population make it unlikely that Trump’s efforts will succeed. The simple fact is that we need more, not fewer, immigrants if the economy is to grow. 
Building walls and deporting workers is exactly the wrong way to go about needed immigration reform, whether Trump and his hard-core base can admit it or not.
Linda Chavez is the president of the Becoming American Institute and a frequent contributor to Commentary.
On Infrastructure By Philip Klein
A massive infrastructure bill was supposed to be one of the early triumphs of President Trump’s administration. Instead, Trump’s inability to advance the ball on one of his signature issues has highlighted the lack of focus, inattention to detail, and difficulties working with Congress that are emblematic of his presidency to date.
The idea of rebuilding the nation’s infrastructure, though overshadowed by daily controversies during the wild 2016 campaign, wove together several elements of the Trump phenomenon.
His experience in building projects such as luxury hotels, resorts, skyscrapers, and golf courses became central to his argument that he had the skills required to get things done in Washington. By touting the economic benefits of infrastructure during his campaign, Trump also signaled that he was an unorthodox Republican, breaking with decades of conservative critiques of Keynesian stimulus projects. Trump also spoke of infrastructure in nationalist terms, integrating it into riffs about how the United States was constantly losing to China. “They have trains that go 300 miles per hour,” he said during the campaign. “We have trains that go: Chug. Chug. Chug.”
When Trump pulled off his election-victory upset, Washington insiders quickly focused on infrastructure as one issue on which he could get a legislative win and box Democrats into a corner. After all, could Democrats really resist passing a major policy priority that had eluded them when one of their own was in the White House?
In his Inaugural Address, Trump threw a jab at Bush-era Republicanism, declaring that the U.S. “spent trillions of dollars overseas while America’s infrastructure has fallen into disrepair and decay.” Going forward, he said, “America will start winning again, winning like never before.” He promised: “We will build new roads, and highways, and bridges, and airports, and tunnels, and railways all across our wonderful nation.”
Now in the fall of the first year of his presidency, any effort to advance infrastructure legislation has been drowned out by daily controversies involving White House intrigue, the investigation into Russian influence in the 2016 election, and Trump’s raucous Twitter feed. Congress, meanwhile, spent much of the year focused on repealing and replacing Obamacare.
This isn’t to say that the Trump administration didn’t try, in fits and starts, to push infrastructure. In May, with the release of his first budget, Trump included $200 billion in funding for infrastructure as the first step in his $1 trillion infrastructure initiative. He also released a six-page fact sheet outlining his vision for infrastructure, which remains the most detailed resource on his infrastructure goals.
The document, broadly speaking, argues that current infrastructure money is spent inefficiently. It proposes greater selectivity in using federal dollars for infrastructure investments that are in the national interest and recommends giving state and local governments more leeway over their own projects. It also calls for more public-private partnerships.
Specifically, the proposal would create a nongovernment entity to manage the nation’s air-traffic-control system. It would also support private rest stops, give states the ability to work with private companies to manage their toll roads, and streamline the environmental-review process. The proposal received little attention, as it was rolled out during a week when Russia hearings took center stage in Congress and Trump was traveling in Europe and the Middle East.
Such inattention was supposed to end in early June, when White House officials announced “Infrastructure Week.” This was a carefully orchestrated campaign in which Trump was supposed to deliver speeches and lead staged events to highlight different aspects of his infrastructure initiative. But during this week, Washington was captivated by the testimony of fired FBI Director James Comey, and Trump veered way off message in his speeches and on his favorite social-media platform.
He went on a Twitter tear. Trump attacked his own Justice Department for pursuing a “watered down” travel ban, took a shot at the mayor of London in the wake of a terrorist attack, unloaded on “fake news” outlets, and hit Comey as a liar. During a speech meant to make the case for both parties to get behind his infrastructure effort, Trump went off on a tangent, blasting Democrats as “obstructionists” on health care.
In truth, any hope of getting Democrats on board for the Trump infrastructure push had been fading even before this implosion. Liberals had already pressured lawmakers to pursue a policy of total resistance to Trump. But during Trump’s big policy push, Senate Minority Leader Chuck Schumer declared overtly that Democrats had no appetite for his infrastructure initiative due to its reliance on privatization.
Before long, the phrase “Infrastructure Week” had become a punch line—an ironic metaphor for a presidency gone off the rails.
Trump has made little progress on infrastructure since then, beyond issuing an executive order in August aimed at making the permitting process for building roads, bridges, and pipelines more efficient. But again, this announcement was overshadowed, as it came during the same news conference in which he blamed “both sides” for the violence in Charlottesville and complained about the slippery slope of removing the Robert E. Lee statue.
On the other hand, by striking a deal with Democratic leaders on the debt ceiling and negotiating with them on immigration, Trump has revived talk about the possibility that he could be ready to compromise with them to get infrastructure legislation passed as well. It is important to note, however, that in both cases—DACA and the debt ceiling—there was a ticking-time-bomb element that forced action. No such urgency exists when it comes to infrastructure.
From the perspective of a limited-government conservative, Trump’s inability thus far to negotiate a trillion-dollar federal infrastructure package with Democrats is nothing to shed tears about. But if we’re looking at the issue through the broader lens of whether or not Trump has been able to deliver on his ambitious campaign promises and make the transition from being a bombastic reality-television star to governing, it’s a case study in failure.
Philip Klein is managing editor of the Washington Examiner.
On NATO By Tod Lindberg
On the campaign trail, Donald Trump was unsparing in his disparagement of U.S. alliances. In a word, allies were freeloaders—complacent in their reliance on the United States to provide them security, contributing nothing like their “fair share” of the cost of their defense, and lavishing the dividend on their domestic needs. Maybe that was acceptable when they were flat on their backs after a war that left the United States on top, but now that they are prospering and the United States has pressing needs of its own, it’s time for the allies to pay up. He also mused about NATO being “obsolete.”
This was alarming (to put it mildly) to most American foreign-policy specialists—to say nothing of the reaction of U.S. allies. The postwar alliance structure in Europe has been the backbone of security on a continent where the United States fought two wars. The North Atlantic Treaty Organization underpinned the postwar revival of Western Europe and subsequently, after the collapse of the Warsaw Pact and the demise of the Soviet Union, of Central and Eastern Europe. The relevance of the alliance has gained renewed salience with Russia’s aggression against its neighbors, first in Georgia in 2008, then in Ukraine in 2014.
At the heart of the alliance is Article 5 of the Washington Treaty of 1949—the commitment of each member to regard an armed attack on any as an attack on all. In practical terms, the meaning of Article 5 is that American power provides a security guarantee for Europe, a commitment upheld and explicitly reiterated by U.S. presidents since Harry S. Truman. The treaty is binding, yet equally in practical terms, it is the American president whose commander-in-chief powers will dictate the response of the U.S. military to any attack—and by extension, the sincerity of his commitment determines the deterrent value of Article 5 against potential aggressors. Would a President Trump abrogate the U.S. commitment? Or hold it hostage to defense-spending increases by allies—perhaps even by demanding the payment of a much larger past-due bill, as the candidate suggested on at least one occasion?
In Asia, the biggest long-term challenge is the rise of China; the U.S. alliances with Japan, South Korea, Australia, and the Philippines (as well as the more complicated commitment enshrined in the Taiwan Relations Act) represent the underpinning of Pacific security. Would this, too, be up for grabs under Trump? Was “America First” shorthand for an isolationist retooling of U.S. relations with the rest of the world?
The short answer to these questions turns out to be no. Trump has no apparent intention to do away with U.S. alliance relationships, however cumbersome and expensive he perceives them to be, and he evinces no intention to try to replace the postwar security architecture with something new and different, whatever that might be. So what happened? Were his many critics sounding the alarm therefore wrong about his intentions? Did he change his mind? Is the question of alliances now settled?
Since Trump has taken office, alliance policy seems to have operated on two tracks within the U.S. government. The first track is the president’s own. He has continued to warn allies that they need to pay up—though his demands have moderated considerably, coalescing around the 2 percent of GDP that allies have pledged to spend on defense (though very few do). And although he has reaffirmed the U.S. Article 5 commitment on some occasions, on others when it would have been appropriate for him to do so, he has declined, apparently intentionally. Still, he has never repudiated the commitment. There seem to be two possibilities here: either a deliberate exercise in ambiguity, or incompetence and confusion of the kind his critics have long diagnosed.
I think the evidence points distinctly toward the former. That evidence is the second track of policy within the government. Vice President Mike Pence, Secretary of State Rex Tillerson, and Secretary of Defense James Mattis—as well as officials junior to them—have been on something close to a nonstop reassurance tour of U.S. allies and partners since the beginning of the administration. National Security Adviser H.R. McMaster has joined the chorus since he stepped in to replace the ousted Michael Flynn. Their message has been unambiguous: The United States stands by its security and alliance commitments, and allies must contribute more to collective defense. True, some allies continue to harbor doubts centered on the persona of Trump. Yet—therefore?—many are moving to spend more on defense.
Now, the simple fact is that Trump could order his Cabinet members and senior staff to desist from repeating the first half of their message—the reassurance. Trump might have had some resignations to cope with, but it is well within his power to issue such an edict, and he hasn’t done so. The most likely reason he hasn’t is that he has concluded that too much is riding on these alliances. To continue in this speculative vein, what Trump knew to be true about U.S. allies during the campaign season was that they weren’t contributing enough; that’s a message that Washington has been sending with little effect for decades. What he didn’t know on the campaign trail and has since determined is how central these alliances are to U.S. national security. U.S. alliances aren’t quite so fragile as some feared. The case for them, competently made by the likes of Mattis, must be compelling, including to the skeptic in chief.
It’s here that we may be getting a little lesson in the cunning of history. From his skeptical premise, Trump sparked a very broad debate over alliances. Senior officials of his administration have probably devoted more time and energy to making the public case for NATO and our Pacific alliances during his first 10 months in office than their predecessors did in the previous 10 years. The latter had taken the utility of alliances to U.S. national security as a given.
All this attention has had an effect on public opinion. But the effect has not been, as many feared, a groundswell of support for isolationist or anti-alliance sentiment. Just the opposite. For the past three years, the Chicago Council Survey has asked, “How effective do you think [maintaining effective alliances is] to achieving the foreign-policy goals of the United States?” In 2015, 32 percent of all respondents responded “very effective.” In 2016, the figure was 40 percent. In 2017? Forty-nine percent. Specifically on NATO, 69 percent say the alliance is “essential” to U.S. security, a slight increase from 65 percent in 2016 and well above the 57 percent who said the same when the Chicago Council first asked the question in 2002.
For the first time in the history of the survey, a majority of Americans, 52 percent, say they would support “the use of U.S. troops…if Russia invades a NATO ally like Latvia, Lithuania, or Estonia.” The Trump administration has had little to say about the Russian threat to the Baltics but a great deal to say about the danger of North Korea’s nuclear weapons and missile program. A year ago, 47 percent said they would favor “the use of U.S. troops…if North Korea invaded South Korea.” That was the view of 26 percent of Americans in 1990. Today, it’s what 62 percent think.
Finally, on the question of allies paying up, the survey asked which comes closer to the respondent’s views: “The United States should encourage greater allied defense spending through persuasion and diplomatic means” or “The United States should withhold its commitment to defend NATO members” until they actually spend more. Overall, 59 percent said persuasion and diplomacy; 38 percent (including 51 percent of Republicans) would put Article 5 at risk. Maybe I’m hearing things, but that sounds to me more like a warning to our allies to take seriously American insistence that they spend more on defense starting now than it does an abrogation of the commitments at the center of U.S. national-security strategy for 70 years.
Tod Lindberg is a member of the Chicago Council Survey’s foreign policy advisory board.
On Asia
By Michael Auslin
Despite continued Russian threats in Eastern Europe and the lurking danger of an Iranian race to a nuclear bomb, it is Asia that has vaulted to the top of the national-security agenda. Barack Obama had warned Donald Trump that North Korea would be the major national-security threat he would face, and North Korean dictator Kim Jong Un has proved him right. Kim is on the threshold of fielding a reliable intercontinental ballistic missile (ICBM) that can reach U.S. territory in the Pacific and even the American homeland. He is within striking distance of achieving his family’s long-held dream of possessing the ultimate weapon. Not since 1994, when Bill Clinton initially ordered and then called back an air strike on Pyongyang’s nascent nuclear facilities, has the region seemed so close to war.
Beyond the Korean peninsula, Asia has arguably been Trump’s central foreign preoccupation since his entry into politics. He talked during his campaign about a 45 percent tariff on Chinese goods. And despite his noninterventionist affect, he began his transition phase by getting tough on China for its increasingly assertive actions during the Obama years, including the successful building and militarization of islands in contested waters in the South China Sea.
Then Trump retreated from his tough stance toward Beijing, initiating a period of seesawing between cooperation and confrontation and mixing together trade and economic concerns with security and diplomatic issues. His explicit linkage of the two, carefully separated by previous presidents, has been particularly unnerving to Beijing. China’s regime has warned of the risks of a larger trade war if Trump continues to threaten economic retaliation for disagreement on security issues. Of equal concern to Beijing has been his recent willingness to permit more frequent freedom-of-navigation operations by the U.S. Navy in the disputed South China Sea waters off the Spratly and Paracel Islands.
Trump’s initial hard line, including an unprecedented transition-period phone call to Taiwan’s president, put Beijing on its back foot. But his subsequent inconstancy has led to a reassertion of Chinese activism on economic and diplomatic issues. His withdrawal from the Trans-Pacific Partnership and general anti-free-trade stance have allowed Chinese President Xi Jinping to claim the mantle of global economic leadership—promoting free-trade alternatives and grandiose policies such as the “Belt and Road Initiative,” in which Xi has promised more than $1 trillion of infrastructure investment to link the world in a trading network centered in China.
In contrast, Trump’s relations with America’s Asian allies, particularly Japan and South Korea, have been surprisingly smooth. Again backing down from campaign rhetoric, Trump early on reaffirmed the importance of both alliances, and buried talk of making the two pay more for hosting U.S. forces on their territory. His bond with Japanese Prime Minister Shinzo Abe has been particularly close, and his conversations with South Korea’s new left-leaning president, Moon Jae In, have gone better than some expected. Far from scaling back the alliances, Trump and his top officials, including Secretary of Defense James Mattis, have put them at the center of American strategy in the Pacific, especially with respect to North Korea.
It is North Korea, however, that remains the first great test of the Trump administration. Trump clearly inherited a failed policy, stretching over past Democratic and Republican administrations alike, and was doubly cursed in coming to office on the eve of Kim Jong Un’s nuclear and ICBM breakout.
Yet despite Trump’s heated rhetoric, he and his team have actually moved cautiously on North Korea. Like its predecessors, the administration has combined shows of force, such as flying B-1 bombers over the peninsula, with appeals to the United Nations for further sanctions on Pyongyang. Two new rounds of sanctions, in July and September, may indeed have been harder than those previously levied, but, just as in the past, the administration had to settle for less than it wanted.

More worrying, Trump appears to be adopting the long-held goal of presidents past: North Korean denuclearization. This is a strategic mistake that threatens to lock him into an unending series of negotiations that have served over the past quarter-century to buy time for Pyongyang to develop its nuclear and missile capabilities. I believe it would be a far more realistic move for Trump to drop the chimera of denuclearization and instead tacitly acknowledge that North Korea is a nuclear-weapons-capable state. This would free up the administration to focus on the far more important job of deterring and containing a nuclear North Korea. Since Trump is almost certain to avoid a preventive war to remove Kim’s nuclear weapons, given the associated military and political risks, he will be forced in the end to accept them. That then mandates a credible and comprehensive policy to restrict North Korea’s actions abroad while making clear that any nuclear use will result in a devastating counterstrike.

Washington has been deterring North Korea ever since the end of the Korean War. This new approach explicitly makes deterrence the center of U.S. policy, dropping the unobtainable goal of denuclearization and the imprudent goal of normalizing relations with North Korea. To be successful, Trump will need to get the support of both Seoul and Tokyo, which is a tall order. The alternative, however, is another round of Kabuki negotiations and the diversion of U.S. attention from the far more necessary task of ensuring that Kim Jong Un is kept in his nuclear box.
Michael Auslin is the Williams-Griffis Fellow in Contemporary Asia at the Hoover Institution, Stanford University, and the author of The End of the Asian Century (Yale).
On Israel
By Daniella J. Greenbaum
As a candidate, Donald Trump staked out positions on Israel that were a blend of incoherence and inconsistency. He was an isolationist, except he was also Israel’s biggest supporter; he would enforce the Iran deal, except he wanted to rip it up on day one; he was the most pro-Israel candidate on the stage, except that he wanted to be “the neutral guy”; he wouldn’t commit to a policy on Jerusalem, except he declared his plan to immediately move the American Embassy to Israel’s eternal and undivided capital.
Words—especially a president’s—matter, but until Trump took office, it was impossible to predict how his administration would treat the Jewish state. Some Israel advocates became convinced that Trump’s victory would lead to the fulfillment of their bucket list of Middle East dreams—in particular, resolution of the long-simmering issue involving the location of the U.S. Embassy in Israel. The Jerusalem Embassy Act, which became law in 1995, recognized that “each sovereign nation, under international law and custom, may designate its own capital” and that “since 1950, the city of Jerusalem has been the capital of the State of Israel.” It ordered that “the United States Embassy in Israel should be established in Jerusalem no later than May 31, 1999.”
And yet, despite all that, the American Embassy has remained in Tel Aviv. (Presidents were given the power to push the date back on national-security grounds.) Much like then-candidates Bill Clinton and George W. Bush, Trump pledged to move the embassy if elected president. In a March 2016 speech to the American Israel Public Affairs Committee’s Policy Conference, Trump said unequivocally: “We will move the American Embassy to the eternal capital of the Jewish people, Jerusalem.”
The American Embassy belongs in Jerusalem, and Trump’s evolution on the issue was, for the most part, encouraging. (Early on in his candidacy, he was booed at the Republican Jewish Coalition’s annual meeting after refusing to take a position on Jerusalem’s status.) But for Israelis, who face myriad threats on a daily basis—both physically, from their many hostile neighbors, and economically, through an international boycott, divestment, and sanctions campaign—the location of the embassy ranks low on the list of urgent political matters. Even the most ardent proponents of this policy shift acknowledge it has the potential to inflame tensions in the region. Like his predecessors, Trump signed the waiver and suspended the move.
Next on the bucket list: discarding Barack Obama’s cataclysmic Iran deal. When Trump was a candidate, his intentions for the Joint Comprehensive Plan of Action (JCPOA) were anything but clear. He told AIPAC, “My number-one priority is to dismantle the disastrous deal with Iran.” But he also said, “We will enforce it like you’ve never seen a contract enforced before folks, believe me.” It’s hard to know which part of his schizophrenic speech the audience—and the country—was supposed to believe. The schizophrenia has continued during his tenure, with Trump certifying the Iran deal twice before announcing in October his decision not to recertify a third time. Despite signaling his extreme displeasure with the deal, Trump has so far opted not to terminate it. But, by refusing to recertify, he has instead left to Congress the decision whether or not to reimpose sanctions.
Most important, perhaps, to pro-Israel forces was Trump’s choice of foreign-policy team. While Jared Kushner’s lack of political experience made him an odd choice for Middle East maven—Trump exclaimed at an inauguration event: “if [he] can’t produce peace in the Middle East, nobody can”—there is no denying that Kushner is a Zionist. Along with Jason Greenblatt, Trump’s envoy to the Israeli–Palestinian peace process, Kushner visited Israel this summer to determine whether restarting peace talks was a viable course of action. The duo have articulated their desire to refrain from repeating the mistakes of previous administrations: “It is no secret that our approach to these discussions departs from some of the usual orthodoxy. … Instead of working to impose a solution from the outside, we are giving the parties space to make their own decisions about the future,” Greenblatt explained. Maybe that’s why Benjamin Netanyahu seems so elated. Bibi’s friction with Obama was well documented, and the prime minister has expressed his jubilation at the changed nature of his relationship to Washington. During the United Nations General Assembly, he tweeted: “Under your leadership, @realDonaldTrump, the alliance between the United States and Israel has never been stronger.”
During the campaign, it was hard to imagine that might be the case. Trump’s repeated use of the phrase “America First,” a classic isolationist trope with anti-Semitic overtones, was deeply concerning to pro-Israel voters. He continually insisted that foreign governments were a drain on the American economy: “I want to help all of our allies, but we are losing billions and billions of dollars. We cannot be the policemen of the world. We cannot protect countries all over the world…where they’re not paying us what we need.” According to a 2016 report from the Congressional Research Service, “Israel is the largest cumulative recipient of U.S. foreign assistance since World War II.” The report calculates that the United States has, over the years, provided Israel with more than $127 billion in bilateral assistance. If words and campaign promises meant anything to Trump, the candidate who insisted that Israel could pay “big league” would have metamorphosed into the president who ensured that it did.
But Trump’s campaign promises seem to have had no bearing on his actions. In an appropriations bill, Congress pledged an extra $75 million in aid to Israel, on top of the annual $3.1 billion already promised for this year. As part of negotiations for the 2016 Memorandum of Understanding, the Israeli government promised to return any funds that surpassed the pre-negotiated aid package. In what was doubtlessly a major disappointment to Trump’s America-first base, the State Department confirmed it will not be asking the Israelis to return the additional funds.
His behavior toward Israel during his eight months in office has confirmed what was evident throughout the campaign: Donald Trump’s words and actions have, at best, a haphazard relationship to each other. So far Israel has benefited. That may not always be the case.
Daniella J. Greenbaum is assistant editor of Commentary.
Of Hobbes and Harvey Weinstein
In man’s natural state, with no social or religious order to impose limits upon his hungers and passions, “notions of right and wrong, justice and injustice have there no place. Where there is no common power, force and fraud are…the cardinal virtues.” Thus did Thomas Hobbes, in 1651, anticipate and describe the sordid story of the film producer Harvey Weinstein.
The reason Weinstein’s three decades of monstrous personal and professional conduct are so appalling and fascinating in equal measure is that he was clearly functioning outside the “social compact” Hobbes said was necessary to save men from a perpetual state of war they would wage against one another in the state of nature. For that is what Weinstein was doing, in his own way: waging Hobbesian war against the women he abused and finding orgasmic pleasure in his victories.
And Weinstein did so while cleverly pretending to leadership within the social compact and disingenuously advocating for its improvement both through political change and artistic accomplishment. Hobbes said the life of man in the state of nature was nasty, brutish, and short, but he did not say the warrior could not be strategic. Rochefoucauld’s immortal declaration that hypocrisy is the tribute vice pays to virtue is entirely wrong in this case. Weinstein paid off feminists and liberals to extend his zone of protection and seduction, not to help support the virtues he was subverting with his own vices.
Hobbes said that in the state of nature there was “no arts; no letters; no society.” But if the man in the state of nature, the nihilistic warrior, coexists with people who live within the social compact, would it not be a brilliant strategy to use the arts, letters, and society as cover, and a means of infiltrating and suborning the social compact? Harvey Weinstein is a brutal thug, a man of no grace, more akin to a mafioso than a maker of culture. And yet as a movie producer he gravitated toward respectable, quality, middlebrow, elevated and elevating fare. People wanted to work with him because of the kinds of movies he made. I think we can see that was the whole point of the exercise: It was exciting to be called into his presence because you knew you would do better, more socially responsible, more praiseworthy work under his aegis than you would with another producer.
And then, garbed only in a bathrobe, Weinstein would strike.
Weinstein was universally known to be a terrible person long before the horrifying tales of his sexual predation, depredation, and assault were finally revealed. And—this is important—known to be a uniquely terrible person. His specific acts of repugnant public thuggishness were detailed in dozens of articles and blog items over the decades, and were notable precisely because they were and are not common currency in business or anywhere else. It was said of him after the latest revelations that he had mysterious abilities to suppress negative stories about himself, and perhaps he did; even so, it was a matter of common knowledge that he was the most disgusting person in the movie business, and that’s saying a lot. And that’s before we get to sex.
To take one example, Ken Auletta related a story in the New Yorker in 2001 about the director Julie Taymor and her husband, the composer Eliot Goldenthal. She had helmed a movie about Frida Kahlo produced by Weinstein. There was a preview screening at the Lincoln Square theater in Manhattan. The audience liked it, but some of its responses indicated that the plotline was confusing. Weinstein, whose hunger to edit the work of others had long since earned him the name “Harvey Scissorhands,” wanted to recut it to clarify the picture. Taymor didn’t, citing the audience’s favorable reaction. Then this happened:
He saw Taymor’s agent…and yelled at him, “Get the fuck out of here!” To Goldenthal, who wrote the score for Frida, Weinstein said, “I don’t like the look on your face.” Then, according to several witnesses, he moved very close to Goldenthal and said, “Why don’t you defend her so I can beat the shit out of you?” Goldenthal quickly escorted Taymor away. When asked about this incident, Weinstein insisted that he did not threaten Goldenthal, yet he concedes, “I am not saying I was remotely hospitable. I did not behave well. I was not physically menacing to anybody. But I was rude and impolite.” One member of Taymor’s team described Weinstein’s conduct as actually bordering on “criminal assault.”
Weinstein told the late David Carr in 2002 that his conduct in such cases had merely been the result of excess glucose in his system, that he was changing his diet, and he was getting better. That glucose problem was his blanket explanation for all the bad stories about him, like this one:
“You know what? It’s good that I’m the fucking sheriff of this fucking lawless piece-of-shit town.” Weinstein said that to Andrew Goldman, then a reporter for the New York Observer, when he took him out of a party in a headlock last November after there was a tussle for Goldman’s tape recorder and someone got knocked in the head.
Goldman’s then-girlfriend, Rebecca Traister, asked Weinstein about a controversial movie he had produced. Traister provided the predicate for this anecdote in a recent piece: “Weinstein didn’t like my question about O, there was an altercation…[and] he called me a c—.”
Auletta also related how Weinstein physically threatened the studio executive Stacey Snider. She went to Disney executive Jeffrey Katzenberg and told him the story. Katzenberg, “one of his closest friends in the business,” told Weinstein he had to apologize. He did, kind of. Afterward, Katzenberg told Auletta, “I love Harvey.”
These anecdotes are 15 years old. And there were anecdotes published about Weinstein’s behavior dating back another 15 years. What they revealed then is no different from what they reveal now: Weinstein is an out-and-out psychopath. And apparently this was fine in his profession…as long as he was successful and important, and the stories involved only violence and intimidation.
Flash-forward to October 2017. Katzenberg—the man who loved Harvey—publicly released an email he had sent to Weinstein after he was done for: “You have done terrible things to a number of women over a period of years. I cannot in any way say this is OK with me…There appear to be two Harvey Weinsteins…one that I have known well, appreciated, and admired and another that I have not known at all.”
So which Weinstein, pray tell, was the one from whom Katzenberg had had to protect Stacey Snider? The one he knew or the one he didn’t know? Because they are, of course, the same person. We know that sexual violence is more about power than sex—about the ultimate domination and humiliation. In these anecdotes and others about Weinstein, we see that his great passions in life were dominating and humiliating. Even if the rumors hadn’t been swirling around his sexual misconduct for decades, could anyone actually have been surprised he sought to secure his victory over the social compact in the most visceral way possible outside of murder?
The commentariat’s reaction to the Weinstein revelations has been desperately confused, and for once, the confusion is constructive, because there are strange ideological and moral convergences.
The most extreme argument has it that he’s really not a unique monster, that every working woman in America has encountered a Weinstein, and that the problem derives from a culture of “toxic masculinity.” This attitude is an outgrowth of the now-fashionable view that there have been no real gains for women and minorities over the past half-century, that the gains are illusory or tokenish, and that something more revolutionary is required to level the playing field.
In the Weinstein case, as a matter of fact, this view is false. Women have indeed encountered boors and creeps in their workplaces. But a wolf-whistler is not a rapist. Someone who leers at a woman isn’t the same as someone who masturbates in front of her. Coping with grotesque and inappropriate co-workers and bosses is something every human being, regardless of gender, has had to do, and will have to do until we are all replaced by robots. It’s worse for women, to be sure. Still, no one should have to go through such experiences. But we all have and we all do. It’s one of the many unpleasant aspects of being human.
Still, the extreme view of “toxic masculinity” contains a deeper truth that is anything but revolutionary. It takes us right back to Hobbes. His central insight—indeed, the insight of civilization itself—is that every man is a potential Weinstein. This clear-eyed, even cold-eyed view of man’s nature is the central conviction of philosophical conservatism. Without limits, without having impressed upon us a fear of the legal sanction of punishment or the social sanction of shame and ostracism, we are in danger of seeking our earthly rewards in the state of nature.
The revolutionary and the conservative also seem to agree there’s something viscerally disturbing about sex crimes that sets them apart. But here is where the consensus between us breaks down. Logically, if the problem is that we live in a toxic culture that facilitates these crimes, then the men who commit them are, at root, cogs in an inherently unjust system. The fault ultimately is the system’s, not theirs.
Harvey Weinstein is an exceptionally clever man who spent decades standing above and outside the system, manipulating it and gaming it for his own ends. He’s no cog. Tina Brown once ran Weinstein’s magazine and book-publishing line. She wrote that “strange contracts pre-dating us would suddenly surface, book deals with no deadline attached authored by attractive or nearly famous women, one I recall was by the stewardess on a private plane.” Which means he didn’t get into book publishing, or magazine publishing, to oversee the production of books and articles. He did it because he needed entities through which he could pass payoffs both to women he had harassed and molested and to journalists whose silence he bought through options and advances. His primary interest wasn’t in the creation of culture. It was the creation of conditions under which he could hunt.
Which may explain his choice of the entertainment industry in the first place. In how many industries is there a specific term for demanding sexual favors in exchange for employment? There’s a “casting couch”; there’s no “insurance-adjustor couch.” In how many industries do people conduct meetings in hotel rooms at off hours anyway? And in how many industries could that meeting in a hotel room end up with the dominant player telling a young woman she should feel comfortable getting naked in front of him because the job for which she is applying will require her to get naked in front of millions?
Weinstein is entirely responsible for his own actions, but his predatory existence was certainly made easier by the general collapse of most formal boundaries between the genders. Young women were told to meet him in private at night in fancy suites. Half a century earlier, no young woman would have been permitted to travel alone in a hotel elevator to a man’s room. The world in which that was the norm imposed unacceptable limitations on the freedoms of women. But it did place serious impediments in the paths of predators whose despicable joy in life is living entirely without religious, spiritual, cultural, or moral impediment.
Hobbes was the great philosopher of limits. We Americans don’t accept his view of things; we tend to think better of people than he did. We tend to believe in the greater good, which he resolutely did not. We believe in self-government, which he certainly did not. But what our more optimistic outlook finds extraordinarily difficult to reckon with is behavior that challenges this complacency about human nature. We try to find larger explanations for it that place it in a more comprehensible context: It’s toxic masculinity! It’s the residue of the 1960s! It’s the people who enabled it! The truth is that, on occasion—and this is one such occasion—we are forced to come face to face with the worst of what any of us could be. And no one explanation suffices save Hamlet’s: “Use every man after his desert, and who should ’scape whipping?”
The education-reform outfit’s hard-left shift
In remaking itself, TFA has subtly downgraded the principles that had won it allies across the spectrum. George W. Bush, Mitch McConnell, John Cornyn, Chris Christie, and Meg Whitman are a few of the Republicans who championed TFA. The group attracted such boldface names, and hundreds of millions of dollars from some of the largest American firms and philanthropies, because it stood for a simple but powerful idea: that teacher quality is the decisive factor in the educational outcomes produced by schools.
Judging by its interventions in recent debates, it isn’t all that clear that senior TFA executives still believe this. These days, TFA’s voice on charters, accountability, and curricular rigor is decidedly muffled. Such education-reform essentials have been eclipsed in TFA’s discourse by immigration, policing, “queer” and transgender-identity issues, and other left-wing causes. TFA’s message seems to be that until numerous other social ills are cured—until immigration is less restricted, policing becomes more gentle, and poverty is eliminated—an excellent education will elude the poor. That was the status-quo defeatism TFA originally set out to challenge.
Wendy Kopp conceived TFA when she was a senior at Princeton in 1989. Unable to get a New York City teaching job without a graduate degree and state certification, Kopp wrote a thesis calling for the creation of a nontraditional recruitment pipeline that would bring America’s most promising young people to its neediest classrooms. TFA members would teach for two years, applying their energy and ambition to drive achievement at the classroom level. She speculated that some would stay in education, while others would go on to careers in law, medicine, business, journalism, etc. But all would remain “lifelong leaders in the effort to end educational inequity.”
The following year, Kopp launched TFA with a corps of 489 new teachers who were dispatched to schools in six regions—a virtuoso feat of social entrepreneurship. Since then some 50,000 teachers have completed the program. This year’s corps counts around 6,400 members, serving 53 regions from coast to coast.
By the time I joined, in 2005, TFA had distilled the experience of its best corps members into a theory of educational transformation called “Teaching as Leadership.” Most people, it said, aren’t natural-born educators. But they could rise to classroom greatness by setting “big goals” for all students, planning engaging lessons, continually assessing their students, maintaining tough discipline, and investing parents and the wider community in their goals.
Mostly, great teachers work hard—really hard. TFA brought the work habits usually associated with large law firms and high-end management consultancies to America’s K–12 failure factories. Its “summer institute” for new recruits was a grueling ordeal of tears, sweat, and 16-hour days. When I was a corps member, we were told that this is what it would take to overcome the forces of the status quo, which were chronically low expectations; broken homes and criminality in the streets; messy, undisciplined classrooms; and bloated bureaucracies that put the needs of adults above those of children.
The TFA worldview diverged sharply from the one that predominated in the education industry. The leading lights of the profession held that the achievement gap was a product of inadequate funding and larger social inequalities. Thus they transferred blame for classroom outcomes from teachers to policymakers and society at large. Teachers’ unions were particularly fond of this theory, since it provided cover for resisting accountability and high expectations.
TFA raged against all this. The assumption that some kids were doomed to underachievement was wrong and, indeed, bigoted. Ditto for the notion that inner-city children couldn’t be expected to behave like young scholars. These children could pull themselves up, provided they had dedicated educators who believed in them. This wasn’t to say that external factors were discounted altogether. But TFA concentrated on the things that educators and school leaders could control. It would emphasize self-help and uplift. And it would accept friends and allies across political divides to fulfill the promise of educational equality.

Today’s Teach for America is a different story. TFA’s leaders have now fully enlisted the organization in the culture war—to the detriment of its mission and the high-minded civic sensibility that used to animate its work.
This has been most visible in TFA’s response to the 2016 election. TFA chief executive Elisa Villanueva Beard, who took over from Kopp four years ago, doesn’t bother to mask either her progressivism or her revulsion at the new administration. When, a couple of weeks after the election, the president-elect announced his choice of Betsy DeVos to lead the Department of Education, Beard’s response was swift and cold.
A November 23 TFA news release began by decrying Trump’s “indisputably hostile and racially charged campaign” and called on DeVos to uphold “diversity, equity, and inclusiveness.” The statement went on to outline 11 TFA demands. Topping the litany was protection of the previous administration’s Deferred Action for Childhood Arrivals, or DACA, program, which granted legal status to certain illegal immigrants brought into the country as children. Then came the identity-politics checklist: “SAFE classrooms for LGBTQ youth and teachers,” “safe classrooms for students and teachers with disabilities,” “safe classrooms for Muslim students and teachers,” “culturally responsive teaching,” and so on.
Of the 11 demands, only three directly touched core education-reform areas—high expectations, accountability, and data-driven instruction—and these were couched in the broadest terms possible. Most notably, there wasn’t a single kind word for DeVos: no well wishes, no hope of “working together to achieve common goals,” no call for dialogue, nothing but angry demands. This, even though the secretary-designee was a passionate charter advocate and came from the same corporate philanthropy and activism ecosystem that TFA had long inhabited.
It is true that inner-city educators were horrified at the election of a candidate who winked at David Duke and suggested that a federal judge’s Mexican heritage was disqualifying. TFA’s particular concern about DACA makes sense, since many corps members work with illegal-immigrant children in border states. (My own stint took me to the Rio Grande Valley region of South Texas.)
Even so, TFA’s allergic reaction to the Trump phenomenon reflects faulty strategic thinking. Beard isn’t Rachel Maddow, and TFA isn’t supposed to be an immigration-reform outfit, still less a progressive think tank. With Republicans having swept all three branches of the federal government, as well as a majority of statehouses and governors’ mansions, TFA must come to terms with the GOP. Condemning the new education secretary as barely legitimate wasn’t wise.
Beard is also making a grave mistake by attempting to banish legitimate conservative positions from the reform movement. In the wake of the bloody white-nationalist protests in Charlottesville, Virginia, she blasted an email to the organization that denounced in one breath opposition to affirmative action and “racist and xenophobic violence.” Some two-thirds of Americans oppose race-based affirmative action. Will these Americans give TFA a fair hearing on educational reform when the organization equates them with alt-right thugs? In a phone interview, Beard said she didn’t intend to link white nationalism with opposition to affirmative action.
As for DACA, the amount of attention TFA devotes to the fate of those affected is out of all proportion. TFA has a full-time director for DACA issues. A search of its website reveals at least 31 news releases, statements, and personal blogs on DACA—including a 2013 call for solidarity with “UndocuQueer students” that delved into the more exotic dimensions of intersectionality. As one education reformer told me in an interview, “They are super-concerned with ‘can’t wait’ issues—DACA and so on—and so much of their mental space [is filled up] by that kind of thing that less of their attention and time is being spent” on central priorities. “Personally, I think that’s such a shame.” (This reformer, and others I interviewed for this article, declined to speak on the record.)
By contrast, TFA didn’t call out Mayor Bill de Blasio on his attempts to roll back charter schools in New York. The organization has rarely targeted teachers’ unions the way it has ripped into Trump. But it is the National Education Association and the American Federation of Teachers that pose the main obstacle to expanding school choice and dismissing ineffective teachers. It is the unions that are bent on snuffing out data-driven instruction. It was a teachers’ union boss (Karen Lewis of Chicago), not the 45th president, who in 2012 accused TFA of supporting policies that “kill and disenfranchise children.”

Teach for America’s turn to the harder left predated Trump’s ascent, and it isn’t mainly about him. Rather, it tracks deeper shifts within American liberalism, from the meritocratic Clintonian ideas of the 1990s and early aughts to today’s socialist revival and the fervid politics of race, gender, and sexuality.
Culturally, TFA was always more liberal than conservative. Educators tend to be liberal Democrats, regardless of the path that brings them to the classroom. But education reformers are unwanted children of American liberalism. They are signed up for the Democratic program, but they clash with public-sector labor unions, the most powerful component of the party base.
As TFA went from startup to corporate-backed giant, it sustained withering attacks from leftist quarters. On her influential education blog, New York University’s Diane Ravitch (a one-time education reformer who changed sides) relentlessly hammered corps members as “woefully unprepared,” as scabs “used to take jobs away from experienced teachers,” as agents of “privatization” and the “neoliberal attack on the public sector.” It was Ravitch who publicized Lewis’s claim that TFAers “kill” kids.
Michelle Rhee, the Korean-American alumna who in 2007 was tapped as chancellor of the District of Columbia system, became a lightning rod for anti-TFA sentiment on the left. Rhee’s no-nonsense approach to failing schools was summed up in a Time magazine cover that showed her holding a broom in the middle of a classroom. When D.C. Mayor Adrian Fenty didn’t win reelection in 2010, it was seen as a popular verdict against this image of TFA-style reform.
In 2013, one university instructor, herself a TFA alumna, urged college professors not to write letters of recommendation for students seeking admission to the organization. Liberal pundits took issue with TFA’s alleged elitism and lack of diversity, portraying it as the latest in a long line of “effete” white reformist institutions that invariably let down the minorities they try to help. TFA, argued a writer in the insurgent leftist magazine Jacobin, is “another chimerical attempt in a long history of chimerical attempts to sell educational reform as a solution to class inequality. At worst, it’s a Trojan horse for all that is unseemly about the contemporary education-reform movement.” By “unseemly,” the writer meant conservative and corporate.
The assaults have had an effect. Applications to TFA dropped to 37,000 last year, down from 57,000 in 2013. Thus ended a growth spurt that had seen the organization increase the size of its corps by about a fifth each year since 2000. Partly this was due to more jobs and better salaries on offer to elite graduates in a rebounding private sector. But as Beard conceded in a statement in April 2016, partly it was the “toxic debate surrounding education” that was “pushing future leaders away from considering education as a space where they can have real impact.”
The temptation for any successful nonprofit crusade is to care more about viability and growth than the original cause. Wounded by the union-led attacks, TFA leaders have apparently concluded that identity politics and a progressive public presence can revive recruitment. With its raft of corporate donors and the massive Walton-family endowment, TFA would never fit in comfortably with an American liberalism moving in the direction of Bernie Sanders and Elizabeth Warren. But talk of Black Lives and “UndocuQueers” might help it reconnect with younger millennials nursed on race-and-gender theory.
Thus, TFA leads its current pitch by touting its diversity. Beard opened her keynote at last year’s 25th-anniversary summit in Washington by noting: “We are more diverse than we have ever been. . . . We are a community that is black, that is Latino, that is white, that is American Indian, that is Asian and Pacific Islander, that is multiracial. We are a community that is lesbian, gay, bisexual, queer and trans.” The organization’s first priority, Beard went on, will always be “to build an inclusive community.”
It makes sense to recruit diverse teachers to lead classrooms in majority-minority regions, to be sure. But one can’t help detecting a certain liberal guilt behind this rhetoric, as if TFA had taken all the attacks against it to heart: We aren’t elite, we swear! Yet the 90 percent of black children who don’t reach math proficiency by eighth grade need good math teachers, period. Their parents don’t care how teachers worship (if at all), what they look like, or what they get up to in the bedroom. They want teachers who will put their children on a trajectory out of poverty.
Minority parents, moreover, fear for their kids’ well-being in chaotic schools and gang-infested streets. Yet to hear many of the speakers at TFA’s summit, you would have thought that police and other authority figures represent the main threat to black and Hispanic children. At a session titled “#StayWoke,” a TFA teacher railed against the police:
I teach 22 second-graders in Southeast D.C., all of them students of color. Sixteen of them are beautiful, carefree black and brown boys, who, despite their charm and playfulness, could be slain in the streets by the power that be [sic], simply because of the color of their skin, what clothes they wear, or the music they choose to listen to.
Educators must therefore impart “a racial literacy, a literacy of resistance.” Their students “must grow up woke.” Another teacher-panelist condemned anti-gang violence initiatives that
come from the same place as the appetite to charge black and brown people with charges of self-destruction. The tradition of blaming black folk keeps us from aiming at real sources of violence. If we were really interested in ending violence, we would be asking who pulled the trigger to underfund schools in Philadelphia? Who poisoned our brothers and sisters in Flint, Michigan? Who and what made New Orleans the incarceration capital of the world? We would teach our students to raise these questions.
Throughout, he led the assembly in chants of “Stay Woke!”
Talk of teaching “resistance” represented a reversion to the radical pedagogy and racial separatism that left a legacy of broken inner-city schools in the previous century. TFA’s own experience, and that of TFA-linked charter networks such as the Knowledge Is Power Program, had taught reformers that, to thrive academically, low-income students need rigid structure and order. Racial resentment won’t set these kids up for success but for alienation and failure—and prison.
Another session, on “Academic Rigor, Social and Political Consciousness, and Culturally Relevant Pedagogy,” pushed similar ideas. Jeff Duncan-Andrade, an associate professor of “Raza studies” at San Francisco State University, urged teachers to develop an ultra-localized race-conscious curriculum:
Don’t even essentialize Oakland’s culture! If you’re from the town, you know it’s a big-ass difference between the west and the east [sic]. We talk differently, we walk differently, we dress differently, we speak differently. The historical elements are different. So if you use stuff from the west [of Oakland] you have to really figure out, ‘How do I modify this to be relevant to the communities I’m serving in East Oakland?’ Develop curriculum, pedagogy, assessment that is responsive to the community you serve. You gotta become an ethnographer. You gotta get on the streets, get into the neighborhoods and barrios…talk to the ancestors…
If your curriculum is not building pathways to self-love for kids who at every turn of their day are taught to hate themselves, hate the color of their skin, hate the texture of their hair, hate the color of their eyes, hate the language they speak, hate the culture they come from, hate the ‘hood that they come from, hate the countries that their people come from, then what’s the purpose of your schooling?
Other sessions included “Native American Community Academy: A Case Study in Culturally Responsive Pedagogy”; “What Is the Role of White Leaders?”; “Navigating Gender Dynamics”; “Beyond Marriage Equality: Safety and Empowerment in the Education of LGBTQ Youth”; “A Chorus of Voices: Building Power Together,” featuring the incendiary Black Lives Matter activist and TFA alumnus DeRay McKesson; “Every Student Counts: Moving the Equity Agenda Forward for Asian American and Pacific Islander Students”; “Intentionally Diverse Learning Communities”; and much more of the kind.
Lost amid all this talk of identitarian self-love was the educator’s role in leading poor children toward things bigger and higher than Oakland, with its no doubt edifying east–west street rivalries—toward the glories of the West and the civic and constitutional bonds that link Americans of all backgrounds. You can be sure that the people who participate in TFA see to it that their own children learn to appreciate Caravaggio and Shakespeare and The Federalist. The whole point of the organization was to ensure that kids from Oakland could do the same.
Twenty-seven years after Teach for America was founded, the group’s mission remains vital. Today fewer than 1 in 10 children growing up in low-income communities graduate from college. The basic political dynamics of education reform haven’t changed: Teach for America, and the other reform efforts it has inspired, have shown what works. The question is whether Teach for America is still determined to reform schools and fight for educational excellence for all—or whether it wants to become a cash-flush and slick vehicle for the new politics of identity.
Review of 'iGen' by Jean Twenge
In 1954, scientists James Olds and Peter Milner ran some experiments on rats in a laboratory at McGill University. What they found was remarkable and disturbing. They discovered that if electrodes were implanted into a particular part of the rat brain—the lateral hypothalamus—rats would voluntarily give themselves electric shocks. They would press a lever several thousand times per hour, for days on end, and even forgo food so that they could keep pressing. The scientists discovered that the rats were even prepared to endure torture in order to receive these shocks: The animals would run back and forth over an electrified grid if that’s what it took to get their fix. They enjoyed the shocks so much that they endured charring on the bottoms of their feet to receive them. For a long time afterward, Olds and Milner thought that they had discovered the “bliss center” of the brain—but this was wrong. They had discovered the reward center. They had found the part of the brain that gives us our drives and our desires. These scientists assumed that the rats must have been in a deep state of pleasure while receiving these electric shocks, but in reality they were in a prolonged state of acute craving.
Jean Twenge’s important new book, iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy—and Completely Unprepared for Adulthood, talks about a new form of electronic stimulation that appears to be driving young people to extreme distraction. A professor of psychology at San Diego State University, Twenge has built her career on looking at patterns in very large samples of people across long periods of time. She takes data from the General Social Survey, which has examined adults 18 years and older since 1972; the American Freshman Survey, which has questioned college students since 1966; the Youth Risk Behavior Surveillance System; and the Monitoring the Future databases. She looks to see whether there have been any changes in behavior and personality across time for people the same age but from different generations. Prior to iGen, she was the author of The Narcissism Epidemic (2009), co-written with psychologist W. Keith Campbell, and Generation Me (2013), a book about self-entitled Millennials. Twenge knows whereof she speaks.
The rising narcissism of earlier cohorts appears to have petered out: the trends of self-regard and self-entitlement do not continue among those born after 1995. What Twenge finds instead is that narcissism has given way to sharp increases in anxiety. Rates of anxiety and depression are spiking rapidly in young people, while at the same time their engagement with adult behaviors is declining. Using dozens of graphs, Twenge shows the reader how teenagers today drink less, go out less, socialize less, are less motivated to get their driver’s licenses, work less, date less, and even have sex less.
At first glance, the data seem counterintuitive, because the social pressures to abstain from alcohol and casual sex have never been more relaxed. But, on further reading, it appears that young people’s avoidance of adult behaviors has at least something to do with the addictive and distracting nature of smartphones and social media. Of course, Twenge is careful to point out that this is all “correlational.” She does not have a smoking gun and cannot prove causality. But the speculation seems plausible. All of the changes she observes started accelerating after 2007, when smartphones became ubiquitous. She writes:
I asked my undergraduate students what I thought was a very simple question: “What do you do with your phone while you sleep? Why?” Their answers were a profile in obsession. Nearly all slept with their phones, putting them under their pillows, on the mattress, or at the very least within arm’s reach of the bed. They checked social media websites and watched videos right before they went to bed and reached for their phones again as soon as they woke up in the morning (they had to—all of them used it as their alarm). Their phone was the last thing they saw before they went to bed and the first thing they saw when they woke up. If they woke up in the middle of the night they often ended up looking at their phones. They talked about their phones the way an addict would talk about crack: “I know I shouldn’t, but I just can’t help it.”
Recent experiments also lend support to the hypothesis. In an experiment carried out in 2013, psychologists Larry Rosen and Nancy Cheever brought 163 university students into a room. Some students had their phones unexpectedly taken away and others were told to put their phones on silent and out of sight. All students were then asked to fill out a brief anxiety questionnaire at 20-minute intervals. Those who were the heaviest smartphone users and heaviest social-media users recorded anxiety levels that kept climbing over the 90-minute period. The kids who used their smartphones the least did not have any increase in anxiety. This experiment lends strong support to the hypothesis that smartphones, by their propensity to promote constant use, do in fact cause agitation.
Twenge’s chapter on mental health in the generation born after 1995 makes for the book’s most disturbing reading. Heavy smartphone and social-media use correlates with higher anxiety and increased feelings of loneliness, particularly in girls. Social media seems to allow girls to bully one another in much more subtle and effective ways than were previously available. They constantly include or exclude one another from online activities such as group “chats,” and they are forever surveilling their peers’ presentation and appearance. This means that if girls aren’t vigilantly checking their social-media accounts, they won’t know if they’re being gossiped about or excluded from some fun activity. Like the electrodes placed on Olds and Milner’s rats, this new technology seems to activate the reward center—but it does not induce states of contentment, satisfaction, or meaning. It also takes time away from other activities such as sports and in-person socializing that would induce feelings of contentment and satisfaction. For a young person who is developing his personality and his competencies in the real world, this could have a profound and long-lasting effect.
Twenge tries not to be alarmist, and she presents her findings in a cautious, conscientious manner. She takes care to make caveats and eschew emotionally laden language. But it’s hard not to be alarmed by what she has found. In the six years between 2009 and 2015, the number of high-school girls who attempted suicide increased by 43 percent and the number of college students who “seriously considered” ending their lives rose by 51 percent. Suicides in young people are carefully tracked—there can be no ambiguity in this data—and increasing rates of children killing themselves are strong evidence that something is seriously amiss. From 2007 (the year smartphones became omnipresent) to 2015, suicide among 15- to 19-year-olds rose by 46 percent, and among those aged 12 to 14, it rose by half. And this rise is particularly pronounced for young girls. Three times as many 12- to 14-year-old girls killed themselves in 2015 as in 2007; among boys that age, suicide doubled in the same period. The suicide rate is always higher for boys (partly because they use more violent methods), but girls are now beginning to close this gender gap.
Another startling chapter in Twenge’s book focuses on sex, relationships, and family formation. We all know that young people are putting off marriage and child-rearing until later years, often for sensible reasons. But what is less well known is that young people are dating a lot less and spending a lot more time alone. It appears that old-fashioned romance and courtship norms are out the window, and so too is sex among young people. Twenge writes:
[M]ore young adults are not having sex at all. More than twice as many iGen’ers and late Millennials (those born in the 1990s) in their early twenties (16 percent) had not had sex at all since age 18 compared to GenX’ers at the same age (6 percent). A more sophisticated statistical analysis that included all adults and controlled for age and time period confirmed twice as many “adult virgins” among those born in the 1990s than among those born in the 1960s.
But if 16 percent are virgins, that means 84 percent of young people are having sex. Perhaps, then, there’s only a small segment bucking the trend toward more libertine lifestyles? Not so. Twenge writes:
Even with age controlled [in samples], Gen X’ers born in the 1970s report having an average of 10.05 sexual partners in their lifetimes, whereas Millennials and iGen’ers born in the 1990s report having sex with 5.29 partners. So Millennials and iGen’ers, the generations known for quick, casual sex, are actually having sex with fewer people.
For decades, conservatives have worried about loosened social and sexual mores among young people. It’s true that sexual promiscuity poses meaningful risks to youths’ well-being, especially among women. But there are also risks that manifest at a broader level when there is a lack of sexual activity in young people. And this risk can be summed up in three words—angry young men. Anthropologists are well aware that societies without strong norms of monogamous pairing produce a host of negative outcomes. In such populations, crime and child abuse increase while savings and GDP decline. Those are just some of the problems that come from men’s directing their energies toward competing with one another for mates instead of providing for families. In monogamous societies, male-to-male competition is tempered by the demands of family life and planning for children’s futures.
These trends identified by Twenge—increased anxiety and depression, huge amounts of time spent on the Internet, and less time spent dating and socializing—do not bode well for the future of Western societies. It should come as no surprise that young people who struggle to connect with one another and young men who can’t find girlfriends will express their anxieties as political resentments. Twenge’s book reveals just how extensive those anxieties are.
Like the rats that forgo food to binge on electric shocks, teenagers are forgoing formative life experiences and human connection in order to satiate their desire for electronic rewards. But the problem is not necessarily insurmountable. Twenge identifies possible protective factors such as playing sports, real-life socializing, adequate sleep, sunlight, and good food. Indeed, phone apps designed to encourage good habits are becoming popular, as are those that lock people out of their social-media accounts for predetermined periods of time. Twenge also argues that iGen has several positive indicators. They are less narcissistic and are more industrious than the generation before them, and they are also more realistic about the demands of work and careers. But harnessing those qualities will require an effort that seems at once piddling and gargantuan. iGen’s future well-being, and ours, depends on whether or not they can just put down their phones.
Playwrights and politics
No similar incidents have been reported, but not for lack of opportunity. In the past year, references to Trump have been shoehorned into any number of theatrical productions in New York and elsewhere. One Trump-related play by a noted author, Robert Schenkkan’s Building the Wall, has already been produced off Broadway and across America, and various other Trump-themed plays are in the pipeline, including Tracy Letts’s The Minutes and Beau Willimon’s The Parisian Woman, both of which will open on Broadway later this season.
The first thing to be said about this avalanche of theatrical activity is that these plays and productions, so far as is known, all show Trump in a negative light. That was to be expected. Save for David Mamet, I am not aware of any prominent present-day American playwright, stage actor, director, or technician who has ever publicly expressed anything other than liberal or progressive views on any political subject whatsoever. However, it appears one can simultaneously oppose Trump and still be skeptical about the artistic effects of such lockstep unanimity, for many left-of-center drama critics have had unfavorable things to say about the works of art inspired to date by the Trump presidency.
So even a political monoculture like that of the American theater can criticize the fruits of its own one-sidedness. But can such a culture produce any other kind of art? Or might the Theater of Trump be inherently flawed in a way that prevents it from transcending its limitations?

From Aristophanes to Angels in America, politics has always been a normal part of the subject matter of theater. Not until the end of the 19th century, though, did a major playwright emerge whose primary interest in writing plays was political rather than aesthetic. George Bernard Shaw saw himself less as an artist than as a propagandist for the causes to which he subscribed, which included socialism, vegetarianism, pacifism, and (late in his life) Stalinism. But Shaw took care to sugar the political pill by embedding his preoccupations in entertaining comedies of ideas, and he was just as careful to make his villains as attractive—and persuasive-sounding—as his heroes.
In those far-off days, the English-speaking theater world was more politically diverse than it is today both on and off stage. It was only in the late ’40s that the balance started to shift, at first slowly, then with steadily increasing speed. In England, this ultimately led to a theater in which it is now common to find explicit political statements embedded not merely in plays but also in such commercial musicals as Billy Elliot, a show about the British miners’ strike of 1984 in which a chorus of children sings a holiday carol whose refrain runs as follows: “Merry Christmas, Maggie Thatcher / We all celebrate today / Cause it’s one day closer to your death.”
As this example suggests, postwar English political theater is consumed with indictments of the evils arising from the existence of a rigid class system. American playwrights, by contrast, are typically more inclined to follow in the footsteps of Arthur Miller and Tennessee Williams, both of whose plays portray (albeit for different reasons) the spiritual and emotional poverty of middle-class life. In both countries, most theater is neither explicitly nor implicitly political. Nevertheless, the theater communities of England and America have for the last half-century or so been all but unanimous in their offstage political convictions. This means that when an English-language play is political, the views that it embodies will almost certainly be left-liberal.
This unanimity of opinion is responsible for what I called, in a 2009 Commentary essay about Miller, the “theater of concurrence.”1 Its practitioners, presumably because all of their colleagues share their political views, take for granted that their audiences will also share them. Hence they write political plays in which no attempt is made to persuade dissenters to change their minds, it being assumed that no dissenters are present in the theater. In the theater of concurrence, disagreement with left-liberal orthodoxy is normally taken to be the result either of invincible ignorance or a deliberate embrace of evil. In the U.S. and England alike, it has become rare to see old-fashioned Shavian political plays like David Hare’s Skylight (1995) in which the devil (in this case, a Thatcherite businessman in love with an upper-middle-class do-gooder) is given his due. Instead, we get plays whose villains are demoniacal monsters (Tony Kushner’s fictionalized portrayal of Roy Cohn in Angels in America is an example) rather than flawed humans who, like Tom in Skylight, have reached the point of no moral return.
All this being the case, it makes perfect sense that Donald Trump’s election should have come as so disorienting a shock to the American theater community, which took for granted that he was unelectable. No sooner were the votes tallied than theater people took to social media to angrily declare their unalterable resistance to the Trump presidency. Many of them believe both Trump and his supporters to be, in Hillary Clinton’s oft-quoted phrase, members of “the basket of deplorables . . . racist, sexist, homophobic, xenophobic, Islamophobic, you name it.”
What kind of theater is emerging from this shared belief? Building the Wall, the first dramatic fruit of the Trump era, is a two-character play set in the visiting room of a Texas prison. It takes place in 2019, by which time President Trump has been impeached after having responded to the detonation of a nuclear weapon in Times Square by declaring nationwide martial law and locking up every foreigner in sight. The bomb, it turns out, was a “false flag” operation planted not by terrorists but by the president’s men. Rick, the play’s principal character, has been imprisoned for doing something so unspeakably awful that he and his interlocutor, a sanctimonious black journalist who is interviewing him for a book, are initially reluctant to talk about it. At the end of an hour or so of increasingly broad hints, we learn that Rick helped the White House set up a Nazi-style death camp for illegal immigrants.
Schenkkan has described Building the Wall as “not a crazy or extreme fantasy,” an inadvertently revealing remark. It is possible to spin involving drama out of raging paranoia, but that requires a certain amount of subtlety, not to mention intelligence—and there is nothing remotely subtle or intelligent about Building the Wall. Rick is a blue-collar cartoon, a regular-guy Texan who claims not to be a racist but voted for Trump because “all our jobs were going to Mexico and China and places like that and then the illegals here taking what jobs are left and nobody gave a damn.” Gloria, his interviewer, is a cartoon of a different kind, a leftsplaining virtue signal in human form who does nothing but emit smug speeches illustrating her own enlightened state: “I mean, at some point in the past we were all immigrants, right, except for Native Americans. And those of us who didn’t have a choice in the matter.” The New York production of Building the Wall closed a month ahead of schedule, having received universally bad reviews (the New York Times described it as “slick and dispiriting”).
The Public Theater’s Julius Caesar, by contrast, received mixed but broadly positive reviews. But it, too, was problematic, albeit on an infinitely higher level of dramatic accomplishment. Here, the fundamental problem was that Eustis had superimposed a gratuitous directorial gloss on Shakespeare’s play. There have been many other high-concept productions of Julius Caesar, starting with Orson Welles’s 1937 modern-dress Broadway staging, which similarly transformed Shakespeare’s play into an it-can-happen-here parable of modern-day fascism. But Eustis’s over-specific decision to turn Caesar into a broad-brush caricature of Trump hijacked the text instead of illuminating it. Rather than allowing the audience to draw its own parallels to the present situation, he pandered to its prejudices. The result was a quintessential example of the theater of concurrence, a staging that undercut its not-inconsiderable virtues by reducing the complexities of the Trump phenomenon to little more than boob-baiting by a populist vulgarian.
Darko Tresnjak committed a venial version of the same sin in his Hartford Stage revival of Shaw’s Heartbreak House (1919), which opened around the same time as Building the Wall and Julius Caesar. Written in the wake of World War I, Heartbreak House is a tragicomedy about a group of liberal bohemians who lack the willpower to reconstruct their doomed society along Shaw’s preferred socialist lines. Tresnjak’s lively but essentially traditional staging hewed to Shaw’s text in every way but one: He put a yellow Trump-style wig on Boss Mangan, the bloated, parasitical businessman who is the play’s villain. The effect was not unlike dressing a character in a play in a T-shirt with a four-letter word printed across the chest. The wig triggered a loud laugh on Mangan’s first entrance, but you were forced to keep on looking at it for the next two hours, by which time the joke had long since grown numbingly stale. It was a piece of cheap point-making unworthy of a production that was otherwise distinguished.

How might contemporary theater artists engage with the Trump phenomenon in a way that is both politically and artistically serious?
For playwrights, the obvious answer is to follow Shaw’s own example by allowing Trump (or a Trump-like character) to speak for himself in a way that is persuasive, even seductive. Shaw himself did so in Major Barbara (1905), whose central character is an arms manufacturer so engagingly urbane that he persuades his pacifist daughter to give up her position with the Salvation Army and embrace the gospel of high explosives. But the trouble with this approach is that it is hard to imagine a playwright willing to admit that Trump could be persuasive to anyone but the hated booboisie.
Then there is Lynn Nottage’s Sweat, which transferred to Broadway last March after successful runs at the Oregon Shakespeare Festival and the Public Theater. First performed in the summer of 2015, around the time that Trump announced his presidential candidacy, Sweat is an ensemble drama about a racially diverse group of unemployed steel workers in Reading, the Pennsylvania city that has become synonymous with deindustrialization. Trump is never mentioned in the play, which takes place between 2000 and 2008 and is not “political” in the ordinary sense of the word, since Nottage did not write it to persuade anyone to do anything in particular. Her purpose was simply to show how the people of Reading feel, and to explain why they feel that way. Tightly structured and free of sermonizing, Sweat is a wholly personal drama whose broader political implications are left unsaid. Instead of putting Trump in the pillory, it takes a searching look at the lives of the people who voted for him, and it portrays them sympathetically, making a genuine good-faith attempt to understand why they chose to embrace Trumpian populism.
Sweat is a model for serious political art—artful political art, if you will. Are more such plays destined to be written about Donald Trump and his angry supporters? Perhaps, if their authors heed the wise words of Joseph Conrad: “My task which I am trying to achieve is, by the power of the written word, to make you hear, to make you feel—it is, before all, to make you see.” Only the very best artists can make political art with that kind of revelatory power. Shaw and Bertolt Brecht did it, and so has Lynn Nottage. Will Tracy Letts and Beau Willimon follow suit, or will they settle for the pandering crudities of Building the Wall? The answer to that question will tell us much about the future of political theater in the Age of Trump.
1 “Concurring with Arthur Miller” (Commentary, June 2009)