The United States is once again locked in a struggle with a deadly global enemy. Herewith a critical, comprehensive guide…
A Note to the Reader
This past spring, when it seemed that everything that could go wrong in Iraq was going wrong, a plague of amnesia began sweeping through the country. Caught up in the particulars with which we were being assaulted 24 hours a day, we seemed to have lost sight of the context in which such details could be measured and understood and related to one another. Small things became large, large things became invisible, and hysteria filled the air.
Since then, of course, and especially after the handover of authority on June 30 to an interim Iraqi government, matters have become more complicated. But the relentless pressure of events, and the continuing onslaught both of details and of their often tendentious or partisan interpretation, have hardly let up at all. It is for this reason that, in what follows, I have tried to step back from the daily barrage and to piece together the story of what this nation has been fighting to accomplish since September 11, 2001.
In doing this, I have drawn freely from my own past writings on the subject, and especially from three articles that appeared in these pages two or more years ago.1 In some instances, I have woven sections of these articles into a new setting; other passages I have adapted and updated.
Telling the story properly has required more than a straight narrative leading from 9/11 to the time of writing. For one thing, I have had to interrupt the narrative repeatedly in order to confront and clear away the many misconceptions, distortions, and outright falsifications that have been perpetrated. In addition, I have had to broaden the perspective so as to make it possible to see why the great struggle into which the United States was plunged by 9/11 can only be understood if we think of it as World War IV.
My hope is that telling the story from this perspective and in these ways will demonstrate that the road we have taken since 9/11 is the only safe course for us to follow. As we proceed along this course, questions will inevitably arise as to whether this or that move was necessary or right; and such questions will breed hesitations and even demands that we withdraw from the field. Some of this happened even in World War II, perhaps the most popular war the United States has ever fought, and much more of it in World War III (that is, the cold war); and now it is happening again, notably with respect to Iraq.
But as I will attempt to show, we are only in the very early stages of what promises to be a very long war, and Iraq is only the second front to have been opened in that war: the second scene, so to speak, of the first act of a five-act play. In World War II and then in World War III, we persisted in spite of impatience, discouragement, and opposition for as long as it took to win, and this is exactly what we have been called upon to do today in World War IV.
For today, no less than in those titanic conflicts, we are up against a truly malignant force in radical Islamism and in the states breeding, sheltering, or financing its terrorist armory. This new enemy has already attacked us on our own soil—a feat neither Nazi Germany nor Soviet Russia ever managed to pull off—and openly announces his intention to hit us again, only this time with weapons of infinitely greater and deadlier power than those used on 9/11. His objective is not merely to murder as many of us as possible and to conquer our land. Like the Nazis and Communists before him, he is dedicated to the destruction of everything good for which America stands. It is this, then, that (to paraphrase George W. Bush and a long string of his predecessors, Republican and Democratic alike) we in our turn, no less than the “greatest generation” of the 1940's and its spiritual progeny of the 1950's and after, have a responsibility to uphold and are privileged to defend.
Out of the Blue
The attack came, both literally and metaphorically, like a bolt out of the blue. Literally, in that the hijacked planes that crashed into the twin towers of the World Trade Center on the morning of September 11, 2001 had been flying in a cloudless sky so blue that it seemed unreal. I happened to be on jury duty that day, in a courthouse only a half-mile from what would soon be known as Ground Zero. Some time after the planes reached their targets, we all poured into the street—just as the second tower collapsed. And this sight, as if it were not incredible enough in itself, was made all the more so by the perfection of the sky stretching so beautifully over it. I felt as though I had been deposited into a scene in one of those disaster movies being filmed (as they used to say) in glorious Technicolor.
But the attack came out of the blue in a metaphorical sense as well. About a year later, in November 2002, a commission would be set up to investigate how and why such a huge event could have taken us by surprise and whether it might have been prevented. Because the commission's public hearings were not held until the middle of this year's exceptionally poisonous presidential election campaign, they quickly degenerated into an attempt by the Democrats on the panel to demonstrate that the administration of George W. Bush had been given adequate warnings but had failed to act on them.
Reinforcing this attempt was the testimony of Richard A. Clarke, who had been in charge of the counterterrorist operation in the National Security Council under Bill Clinton and then under Bush before resigning in the aftermath of 9/11. What Clarke for all practical purposes did—both at the hearings and in his hot-off-the-press book, Against All Enemies—was to blame Bush, who had been in office for a mere eight months when the attack occurred, while exonerating Clinton, who had spent eight long years doing little of any significance in response to the series of terrorist assaults on American targets in various parts of the world that were launched on his watch.
The point I wish to stress is not that Clarke was exaggerating or lying.2 It is that the attack on 9/11 did indeed come out of the blue in the sense that no one ever took such a possibility seriously enough to figure out what to do about it. Even Clarke, who did stake a dubious claim to prescience, had to admit under questioning by one of the 9/11 commissioners that if all his recommendations had been acted upon, the attack still could not have been prevented. And in its final report, released on July 22 of this year, the commission, while digging up no fewer than ten episodes that with hindsight could be seen as missed “operational opportunities,” thought that these opportunities could not have been acted on effectively enough to frustrate the attack. Indeed not—not, that is, in the real America as it existed at the time: an America in which hobbling constraints had been placed on both the CIA and the FBI; in which a “wall of separation” had been erected to obstruct communication or cooperation between law-enforcement and national-security agents; and in which politicians and the general public alike were still unable and/or unwilling to believe that terrorism might actually represent a genuine threat.
Slightly contradicting itself, the commission said that “the 9/11 attacks were a shock, but they should not have come as a surprise.” Maybe so; and yet there was no one, either in government or out, to whom they did not come as a surprise, either in general or in the particular form they took. The commission also spoke of a “failure of imagination.” Maybe so again; and yet the word “failure” seems inappropriate, implying as it does that success was possible. Surely a failure so widespread deserves to be considered inevitable.
To the New York Times, however, the failure was not at all inevitable. In a front-page editorial disguised as a “report,” the Times credited the commission's final report with finding that “an attack described as unimaginable had in fact been imagined, repeatedly.” But not a shred of the documentary evidence cited by the Times for this categorical statement actually predicted that al Qaeda would hijack commercial airliners and crash them into buildings in New York and Washington. Moreover, all of the evidence, such as it was, came from the 1990's. Nevertheless, the Times “report” contrived to convey the impression that in the fall of 2000 the Bush administration—then not yet in office—had received fair warning of an imminent attack. To bolster this impression, the Times went on to quote from a briefing given to Bush a month before 9/11. But the document in question was vague about details, and in any case was only one of many intelligence briefings with no special claim to credibility over conflicting assessments.
Thus the Bush administration, which had just been excoriated in hearings held by the Senate Intelligence Committee for having invaded Iraq on the basis of faulty intelligence, was now excoriated by some of the 9/11 commissioners for not having acted on the basis of even sketchier intelligence to head off 9/11 itself. This contradiction elicited a mordant comment from Charles Hill, a former government official who had been a regular “consumer” of intelligence:
Intelligence collection and analysis is a very imperfect business. Refusal to face this reality has produced the almost laughable contradiction of the Senate Intelligence Committee criticizing the Bush administration for acting on third-rate intelligence, even as the 9/11 commission criticizes it for not acting on third-rate intelligence.3
However, the point I most wish to stress is that there was something unwholesome, not to say unholy, about the recriminations on this issue that befouled the commission's public hearings and some of the interim reports by the staff. It therefore came, so to speak, both as a shock and as a surprise that this same unholy spirit was almost entirely exorcised from the final report. In the end the commission agreed that no American President and no American policy could be held responsible in any degree for the aggression against the United States unleashed on 9/11.
Amen to that. For the plain truth is that the sole and entire responsibility rests with al Qaeda, along with the regimes that provided it with protection and support. Furthermore, to the extent that American passivity and inaction opened the door to 9/11, neither Democrats nor Republicans, and neither liberals nor conservatives, are in a position to derive any partisan or ideological advantage. The reason, quite simply, is that much the same methods for dealing with terrorism were employed by the administrations of both parties, stretching as far back as Richard Nixon in 1970 and proceeding through Gerald Ford, Jimmy Carter, Ronald Reagan (yes, Ronald Reagan), George H.W. Bush, Bill Clinton, and right up to the pre-9/11 George W. Bush.
A “Paper Tiger”
The record speaks dismally for itself. From 1970 to 1975, during the administrations of Nixon and Ford, several American diplomats were murdered in Sudan and Lebanon while others were kidnapped. The perpetrators were all agents of one or another faction of the Palestine Liberation Organization (PLO). In Israel, too, many American citizens were killed by the PLO, though, except for the rockets fired at our embassy and other American facilities in Beirut by the Popular Front for the Liberation of Palestine (PFLP), these attacks were not directly aimed at the United States. In any case, there were no American military reprisals.
Our diplomats, then, were for some years already being murdered with impunity by Muslim terrorists when, in 1979, with Carter now in the White House, Iranian students—with the advance or subsequent blessing of the country's clerical ruler, Ayatollah Khomeini—broke into the American embassy in Tehran and seized 52 Americans as hostages. For a full five months, Carter dithered. At last, steeling himself, he authorized a military rescue operation which had to be aborted after a series of mishaps that would have fit well into a Marx Brothers movie like Duck Soup if they had not been more humiliating than comic. After 444 days, and just hours after Reagan's inauguration in January 1981, the hostages were finally released by the Iranians, evidently because they feared that the hawkish new President might actually launch a military strike against them.
Yet if they could have foreseen what was coming under Reagan, they would not have been so fearful. In April 1983, Hizbullah—an Islamic terrorist organization nourished by Iran and Syria—sent a suicide bomber to explode his truck in front of the American embassy in Beirut, Lebanon. Sixty-three employees, among them the Middle East CIA director, were killed and another 120 wounded. But Reagan sat still.
Six months later, in October 1983, another Hizbullah suicide bomber blew up an American barracks in the Beirut airport, killing 241 U.S. Marines in their sleep and wounding another 81. This time Reagan signed off on plans for a retaliatory blow, but he then allowed his Secretary of Defense, Caspar Weinberger, to cancel it (because it might damage our relations with the Arab world, of which Weinberger was always tenderly solicitous). Shortly thereafter, the President pulled the Marines out of Lebanon.
Having cut and run in Lebanon in October, Reagan again remained passive in December, when the American embassy in Kuwait was bombed. Nor did he hit back when, hard upon the withdrawal of the American Marines from Beirut, the CIA station chief there, William Buckley, was kidnapped by Hizbullah and then murdered. Buckley was the fourth American to be kidnapped in Beirut, and many more suffered the same fate between 1982 and 1992 (though not all died or were killed in captivity).
These kidnappings were apparently what led Reagan, who had sworn that he would never negotiate with terrorists, to make an unacknowledged deal with Iran, involving the trading of arms for hostages. But whereas the Iranians were paid off handsomely in the coin of nearly 1,500 antitank missiles (some of them sent at our request through Israel), all we got in exchange were three American hostages—not to mention the disruptive and damaging Iran-contra scandal.
In September 1984, six months after the murder of Buckley, the U.S. embassy annex near Beirut was hit by yet another truck bomb (also traced to Hizbullah). Again Reagan sat still. Or rather, after giving the green light to covert proxy retaliations by Lebanese intelligence agents, he put a stop to them when one such operation, directed against the cleric thought to be the head of Hizbullah, failed to get its main target while unintentionally killing 80 other people.
It took only another two months for Hizbullah to strike once more. In December 1984, a Kuwaiti airliner was hijacked and two American passengers employed by the U.S. Agency for International Development were murdered. The Iranians, who had stormed the plane after it landed in Tehran, promised to try the hijackers themselves, but instead allowed them to leave the country. At this point, all the Reagan administration could come up with was the offer of a $250,000 reward for information that might lead to the arrest of the hijackers. There were no takers.
The following June, Hizbullah operatives hijacked still another airliner, an American one (TWA flight 847), and then forced it to fly to Beirut, where it was held for more than two weeks. During those weeks, an American naval officer aboard the plane was shot, and his body was ignominiously hurled onto the tarmac. For this the hijackers were rewarded with the freeing of hundreds of terrorists held by Israel in exchange for the release of the other passengers. Both the United States and Israel denied that they were violating their own policy of never bargaining with terrorists, but as with the arms-for-hostages deal, and with equally good reason, no one believed them, and it was almost universally assumed that Israel had acted under pressure from Washington. Later, four of the hijackers were caught but only one wound up being tried and jailed (by Germany, not the United States).
The sickening beat went on. In October 1985, the Achille Lauro, an Italian cruise ship, was hijacked by a group under the leadership of the PLO's Abu Abbas, working with the support of Libya. One of the hijackers threw an elderly wheelchair-bound American passenger, Leon Klinghoffer, overboard. When the hijackers attempted to escape in a plane, the United States sent Navy fighters to intercept it and force it down. Klinghoffer's murderer was eventually apprehended and sent to prison in Italy, but the Italian authorities let Abu Abbas himself go. Washington—evidently having exhausted its repertoire of military reprisals—now confined itself to protesting the release of Abu Abbas. To no avail.
Libya's involvement in the Achille Lauro hijacking was, though, the last free pass that country's dictator, Muammar Qaddafi, was destined to get from the United States under Reagan. In December 1985, five Americans were among the 20 people killed when the Rome and Vienna airports were bombed, and then in April 1986 another bomb exploded in a discotheque in West Berlin that was a hangout for American servicemen. U.S. intelligence tied Libya to both of these bombings, and the eventual outcome was an American air attack in which one of the residences of Qaddafi was hit.
In retaliation, the Palestinian terrorist Abu Nidal executed three U.S. citizens who worked at the American University in Beirut. But Qaddafi himself—no doubt surprised and shaken by the American reprisal—went into a brief period of retirement as a sponsor of terrorism. So far as we know, it took nearly three years (until December 1988) before he could pull himself together to the point of undertaking another operation: the bombing of Pan Am flight 103 over Lockerbie, Scotland, in which a total of 270 people lost their lives. Of the two Libyan intelligence agents who were tried for planting the bomb, one was convicted (though not until the year 2001) and the other acquitted. Qaddafi himself suffered no further punishment from American warplanes.
In January 1989, Reagan was succeeded by the elder George Bush, who, in handling the fallout from the destruction of Pan Am 103, was content to adopt the approach to terrorism taken by all his predecessors. During the elder Bush's four-year period in the White House, there were several attacks on Americans in Turkey by Islamic terrorist organizations, and there were others in Egypt, Saudi Arabia, and Lebanon. None of these was as bloody as previous incidents, and none provoked any military response from the United States.
In January 1993, Bill Clinton became President. Over the span of his two terms in office, American citizens continued to be injured or killed in Israel and other countries by terrorists who were not aiming specifically at the United States. But several spectacular terrorist operations occurred on Clinton's watch of which the U.S. was most emphatically the target.
The first, on February 26, 1993, only 38 days after his inauguration, was the explosion of a truck bomb in the parking garage of the World Trade Center in New York. As compared with what would happen on September 11, 2001, this was a minor incident in which “only” six people were killed and over 1,000 injured. The six Muslim terrorists responsible were caught, tried, convicted, and sent to prison for long terms.
But in following the by-now traditional pattern of treating such attacks as common crimes, or the work of rogue groups acting on their own, the Clinton administration willfully turned a deaf ear to outside experts like Steven Emerson and even the director of the CIA, R. James Woolsey, who strongly suspected that behind the individual culprits was a terrorist Islamic network with (at that time) its headquarters in Sudan. This network, then scarcely known to the general public, was called al Qaeda, and its leader was a former Saudi national who had fought on our side against the Soviets in Afghanistan but had since turned against us as fiercely as he had been against the Russians. His name was Osama bin Laden.
The next major episode followed hard on the bombing of the World Trade Center. In April 1993, less than two months after that attack, former President Bush visited Kuwait, where an attempt was made to assassinate him by—as our own investigators were able to determine—Iraqi intelligence agents. The Clinton administration spent two more months seeking approval from the UN and the “international community” to retaliate for this egregious assault on the United States. In the end, a few cruise missiles were fired into the Iraqi capital of Baghdad, where they fell harmlessly onto empty buildings in the middle of the night.
In the years immediately ahead, there were many Islamic terrorist operations (in Turkey, Pakistan, Saudi Arabia, Lebanon, Yemen, and Israel) that were not specifically aimed at the United States but in which Americans were nevertheless murdered or kidnapped. In March 1995, however, a van belonging to the U.S. consulate in Karachi, Pakistan, was hit by gunfire, killing two American diplomats and injuring a third. In November of the same year, five Americans died when a car bomb exploded in Riyadh, Saudi Arabia, near a building in which a U.S. military advisory group lived.
All this was trumped in June 1996 when another building in which American military personnel lived—the Khobar Towers in Dhahran, Saudi Arabia—was blasted by a truck bomb. Nineteen of our airmen were killed, and 240 other Americans on the premises were wounded.
In 1993, Clinton had been so intent on treating the World Trade Center bombing as a common crime that for some time afterward he refused even to meet with his own CIA director. Perhaps he anticipated that he would be told things by Woolsey—about terrorist networks and the states sponsoring them—that he did not wish to hear, because he had no intention of embarking on the military action that such knowledge might force upon him. Now, in the wake of the bombing of the Khobar Towers, Clinton again handed the matter over to the police; but the man in charge, his FBI director, Louis Freeh, who had intimations of an Iranian connection, could no more get through to him than Woolsey before him. There were a few arrests, and the action then moved into the courts.
In June 1998, grenades were unsuccessfully hurled at the U.S. embassy in Beirut. A little later, our embassies in the capitals of Kenya (Nairobi) and Tanzania (Dar es Salaam) were not so lucky. On a single day—August 7, 1998—car bombs went off in both places, leaving more than 200 people dead, of whom twelve were Americans. Credit for this coordinated operation was claimed by al Qaeda. In what, whether fairly or not, was widely interpreted, especially abroad, as a move to distract attention from his legal troubles over the Monica Lewinsky affair, Clinton fired cruise missiles at an al Qaeda training camp in Afghanistan, where bin Laden was supposed to be at that moment, and at a building in Sudan, where al Qaeda also had a base. But bin Laden escaped harm, while it remained uncertain whether the targeted factory in Sudan was actually manufacturing chemical weapons or was just a normal pharmaceutical plant.
This fiasco—so we have learned from former members of his administration—discouraged any further such action by Clinton against bin Laden, though we have also learned from various sources that he did authorize a number of covert counterterrorist operations and diplomatic initiatives leading to arrests in foreign countries. But according to Dick Morris, who was then Clinton's political adviser:
The weekly strategy meetings at the White House throughout 1995 and 1996 featured an escalating drumbeat of advice to President Clinton to take decisive steps to crack down on terrorism. The polls gave these ideas a green light. But Clinton hesitated and failed to act, always finding a reason why some other concern was more important.
In the period after Morris left, more began going on behind the scenes, but most of it remained in the realm of talk or planning that went nowhere. In contrast to the flattering picture of Clinton that Richard Clarke would subsequently draw, Woolsey (who after a brief tenure resigned from the CIA out of sheer frustration) would offer a devastating retrospective summary of the President's overall approach:
Do something to show you're concerned. Launch a few missiles in the desert, bop them on the head, arrest a few people. But just keep kicking the ball down field.
Bin Laden, picking up that ball on October 12, 2000, when the destroyer USS Cole had docked for refueling in Yemen, dispatched a team of suicide bombers. The bombers did not succeed in sinking the ship, but they inflicted severe damage upon it, while managing to kill seventeen American sailors and wounding another 39.
Clarke, along with a few intelligence analysts, had no doubt that the culprit was al Qaeda. But neither the head of the CIA nor the head of the FBI thought the case conclusive. Hence the United States did not so much as lift a military finger against bin Laden or the Taliban regime in Afghanistan, where he was now ensconced and being protected. As for Clinton, so obsessively was he then wrapped up in a futile attempt to broker a deal between the Israelis and the Palestinians that all he could see in this attack on an American warship was an effort “to deter us from our mission of promoting peace and security in the Middle East.” The terrorists, he resoundingly vowed, would “fail utterly” in this objective.
Never mind that not the slightest indication existed that bin Laden was in the least concerned over Clinton's negotiations with the Israelis and the Palestinians at Camp David, or even that the Palestinian issue was of primary importance to him as compared with other grievances. In any event, it was Clinton who failed, not bin Laden. The Palestinians under Yasir Arafat, spurning an unprecedentedly generous offer that had been made by the Israeli prime minister Ehud Barak with Clinton's enthusiastic endorsement, unleashed a new round of terrorism. And bin Laden would soon succeed all too well in his actual intention of striking another brazen blow at the United States.
The sheer audacity of what bin Laden went on to do on September 11 was unquestionably a product of his contempt for American power. Our persistent refusal for so long to use that power against him and his terrorist brethren—or to do so effectively whenever we tried—reinforced his conviction that we were a nation on the way down, destined to be defeated by the resurgence of the same Islamic militancy that had once conquered and converted large parts of the world by the sword.
As bin Laden saw it, thousands or even millions of his followers and sympathizers all over the Muslim world were willing, and even eager, to die a martyr's death in the jihad, the holy war, against the “Great Satan,” as the Ayatollah Khomeini had called us. But, in bin Laden's view, we in the West, and especially in America, were all so afraid to die that we lacked the will even to stand up for ourselves and defend our degenerate way of life.
Bin Laden was never reticent or coy in laying out this assessment of the United States. In an interview on CNN in 1997, he declared that “the myth of the superpower was destroyed not only in my mind but also in the minds of all Muslims” when the Soviet Union was defeated in Afghanistan. That the Muslim fighters in Afghanistan would almost certainly have failed if not for the arms supplied to them by the United States did not seem to enter into the lesson he drew from the Soviet defeat. In fact, in an interview a year earlier he had belittled the United States as compared with the Soviet Union. “The Russian soldier is more courageous and patient than the U.S. soldier,” he said then. Hence, “Our battle with the United States is easy compared with the battles in which we engaged in Afghanistan.”
Becoming still more explicit, bin Laden wrote off the Americans as cowards. Had Reagan not taken to his heels in Lebanon after the bombing of the Marine barracks in 1983? And had not Clinton done the same a decade later when only a few American Rangers were killed in Somalia, where they had been sent to participate in a “peacekeeping” mission? Bin Laden did not boast of this as one of his victories, but a State Department dossier charged that al Qaeda had trained the terrorists who ambushed the American servicemen. (The ugly story of what happened to us in Somalia was told in the film version of Mark Bowden's Black Hawk Down, which reportedly became Saddam Hussein's favorite movie.)
Bin Laden summed it all up in a third interview he gave in 1998:
After leaving Afghanistan the Muslim fighters headed for Somalia and prepared for a long battle thinking that the Americans were like the Russians. The youth were surprised at the low morale of the American soldiers and realized, more than before, that the American soldier was a paper tiger and after a few blows ran in defeat.
Bin Laden was not the first enemy of a democratic regime to have been emboldened by such impressions. In the 1930's, Adolf Hitler was convinced by the failure of the British to arm themselves against the threat he posed, as well as by the policy of appeasement they adopted toward him, that they were decadent and would never fight no matter how many countries he invaded.
Similarly with Joseph Stalin in the immediate aftermath of World War II. Encouraged by the rapid demobilization of the United States, which to him meant that we were unprepared and unwilling to resist him with military force, Stalin broke the pledges he had made at Yalta to hold free elections in the countries of Eastern Europe he had occupied at the end of the war. Instead, he consolidated his hold over those countries, and made menacing gestures toward Greece and Turkey.
After Stalin's death, his successors repeatedly played the same game whenever they sensed a weakening of the American resolve to hold them back. Sometimes this took the form of maneuvers aimed at establishing a balance of military power in their favor. Sometimes it took the form of using local Communist parties or other proxies as their instrument. But thanks to the decline of American power following our withdrawal from Vietnam—a decline reflected in the spread during the late 1970's of isolationist and pacifist sentiment, which was in turn reflected in severely reduced military spending—Leonid Brezhnev felt safe in sending his own troops into Afghanistan in 1979.
It was the same decline of American power, so uncannily personified by Jimmy Carter, that, less than two months before the Soviet invasion of Afghanistan, had emboldened the Ayatollah Khomeini to seize and hold American hostages. To be sure, there were those who denied that this daring action had anything to do with Khomeini's belief that the United States under Carter had become impotent. But this denial was impossible to sustain in the face of the contrast between the attack on our embassy in Tehran and the protection the Khomeini regime extended to the Soviet embassy there when a group of protesters tried to storm it after the invasion of Afghanistan. The radical Muslim fundamentalists ruling Iran hated Communism and the Soviet Union at least as much as they hated us—especially now that the Soviets had invaded a Muslim country. Therefore the difference in Khomeini's treatment of the two embassies could not be explained by ideological or political factors. What could and did explain it was his fear of Soviet retaliation as against his expectation that the United States, having lost its nerve, would go to any lengths to avoid the use of force.
And so it was with Saddam Hussein. In 1990, with the first George Bush sitting in the White House, Saddam Hussein invaded Kuwait in what was widely, and accurately, seen as a first step in a bid to seize control of the oil fields of the Middle East. The elder Bush, fortified by the determination of Margaret Thatcher, who was then prime minister of Great Britain, declared that the invasion would not stand, and he put together a coalition that sent a great military force into the region. This alone might well have frightened Saddam Hussein into pulling out of Kuwait if not for the wave of hysteria in the United States about the tens of thousands of “body bags” that it was predicted would be flown home if we actually went to war with Iraq. Not unreasonably, Saddam concluded that, if he held firm, it was we who would blink and back down.
The fact that Saddam miscalculated, and that in the end we made good on our threat, did not overly impress Osama bin Laden. After all—dreading the casualties we would suffer if we went into Baghdad after liberating Kuwait and defeating the Iraqi army on the battlefield—we had allowed Saddam to remain in power. To bin Laden, this could only have looked like further evidence of the weakness we had shown in the ineffectual policy toward terrorism adopted by a long string of American Presidents. No wonder he was persuaded that he could strike us massively on our own soil and get away with it.
Yet just as Saddam had miscalculated in 1990-91, and would again in 2002, bin Laden misread how the Americans would react to being hit where, literally, they lived. In all likelihood he expected a collapse into despair and demoralization; what he elicited instead was an outpouring of rage and an upsurge of patriotic sentiment such as younger Americans had never witnessed except in the movies, and had most assuredly never experienced in their own hearts and souls, or, for those who enlisted in the military, on their own flesh.
In that sense, bin Laden did for this country what the Ayatollah Khomeini had done before him. In seizing the American hostages in 1979, and escaping retaliation, Khomeini inflicted a great humiliation on the United States. But at the same time, he also exposed the foolishness of Jimmy Carter's view of the world. The foolishness did not lie in Carter's recognition that American power—military, economic, political, and moral—had been on a steep decline at least since Vietnam. This was all too true. What was foolish was the conclusion Carter drew from it. Rather than proposing policies aimed at halting and then reversing the decline, he took the position that the cause was the play of historical forces we could do nothing to stop or even slow down. As he saw it, instead of complaining or flailing about in a vain and dangerous effort to recapture our lost place in the sun, we needed first to acknowledge, accept, and adjust to this inexorable historical development, and then to act upon it with “mature restraint.”
In one fell swoop, the Ayatollah Khomeini made nonsense of Carter's delusionary philosophy in the eyes of very large numbers of Americans, including many who had previously entertained it. Correlatively, new heart was given to those who, rejecting the idea that American decline was inevitable, had argued that the cause was bad policies and that the decline could be turned around by returning to the better policies that had made us so powerful in the first place.
The entire episode thereby became one of the forces behind an already burgeoning determination to rebuild American power that culminated in the election of Ronald Reagan, who had campaigned on the promise to do just that. For all the shortcomings of his own handling of terrorism, Reagan did in fact keep his promise to rebuild American power. And it was this that set the stage for victory in the multifaceted cold war we had been waging since 1947, when the United States under President Harry Truman (aroused by Stalin's miscalculation) decided to resist any further advance of the Soviet empire.
Few, if any, of Truman's contemporaries would have dreamed that this product of a Kansas City political machine, who as a reputedly run-of-the-mill U.S. Senator had spent most of his time on taxes and railroads, would rise so resolutely and so brilliantly to the threat represented by Soviet imperialism. Just so, 54 years later in 2001, another politician with a small reputation and little previous interest in foreign affairs would be confronted with a challenge perhaps even greater than the one faced by Truman; and he too astonished his own contemporaries by the way he rose to it.
Enter the Bush Doctrine
In “The Sources of Soviet Conduct” (1947), the theoretical defense he constructed of the strategy Truman adopted for fighting the war ahead, George F. Kennan (then the director of the State Department's policy planning staff, and writing under the pseudonym “X”) described that strategy as
a long-term, patient but firm and vigilant containment of Russian expansive tendencies . . . by the adroit and vigilant application of counterforce at a series of constantly shifting geographical and political points.
In other words (though Kennan himself did not use those words), we were faced with the prospect of nothing less than another world war; and (though in later years, against the plain sense of the words that he himself did use, he tried to claim that the “counterforce” he had in mind was not military) it would not be an entirely “cold” one, either. Before it was over, more than 100,000 Americans would die on the far-off battlefields of Korea and Vietnam, and the blood of many others allied with us in the political and ideological struggle against the Soviet Union would be spilled on those same battlefields, and in many other places as well.
For these reasons, I agree with one of our leading contemporary students of military strategy, Eliot A. Cohen, who thinks that what is generally called the “cold war” (a term, incidentally, coined by Soviet propagandists) should be given a new name. “The cold war,” Cohen writes, was actually “World War III, which reminds us that not all global conflicts entail the movement of multimillion-man armies, or conventional front lines on a map.” I also agree that the nature of the conflict in which we are now engaged can only be fully appreciated if we look upon it as World War IV. To justify giving it this name—rather than, say, the “war on terrorism”—Cohen lists “some key features” that it shares with World War III:
that it is, in fact, global; that it will involve a mixture of violent and nonviolent efforts; that it will require mobilization of skill, expertise, and resources, if not of vast numbers of soldiers; that it may go on for a long time; and that it has ideological roots.
There is one more feature that World War IV shares with World War III and that Cohen does not mention: both were declared through the enunciation of a presidential doctrine.
The Truman Doctrine of 1947 was born with the announcement that “it must be the policy of the United States to support free peoples who are resisting attempted subjugation by armed minorities or by outside pressure.” Beginning with a special program of aid to Greece and Turkey, which were then threatened by Communist takeovers, the strategy was broadened within a few months by the launching of a much larger and more significant program of economic aid that came to be called the Marshall Plan. The purpose of the Marshall Plan was to hasten the reconstruction of the war-torn economies of Western Europe: not only because this was a good thing in itself, and not only because it would serve American interests, but also because it could help eliminate the grievances on which Communism fed. But then came a Communist coup in Czechoslovakia. Following as it had upon the installation by the Soviet Union of puppet regimes in the occupied countries of Eastern Europe, the Czech coup demonstrated that economic measures would not be enough by themselves to ward off a comparable danger posed to Italy and France by huge local Communist parties entirely subservient to Moscow. Out of this realization—and out of a parallel worry about an actual Soviet invasion of Western Europe—there emerged the North Atlantic Treaty Organization (NATO).
Containment, then, was a three-sided strategy made up of economic, political, and military components. All three would be deployed in a shifting relative balance over the four decades it took to win World War III.4
If the Truman Doctrine unfolded gradually, revealing its entire meaning only in stages, the Bush Doctrine was pretty fully enunciated in a single speech, delivered to a joint session of Congress on September 20, 2001. It was then clarified and elaborated in three subsequent statements: Bush's first State of the Union address on January 29, 2002; his speech to the graduating class of the U.S. Military Academy at West Point on June 1, 2002; and the remarks on the Middle East he delivered three weeks later, on June 24. This difference aside, his contemporaries were at least as startled as Truman's had been, both by the substance of the new doctrine and by the transformation it bespoke in its author. For here was George W. Bush, who in foreign affairs had been a more or less passive disciple of his father, talking for all the world like a fiery follower of Ronald Reagan.
In sharp contrast to Reagan, generally considered a dangerous ideologue, the first President Bush—who had been Reagan's Vice President and had then succeeded him in the White House—was often accused of being deficient in what he himself inelegantly dismissed as “the vision thing.” The charge was fair in that the elder Bush had no guiding sense of what role the United States might play in reshaping the post-cold-war world. A strong adherent of the “realist” perspective on world affairs, he believed that the maintenance of stability was the proper purpose of American foreign policy, and the only wise and prudential course to follow. Therefore, when Saddam Hussein upset the balance of power in the Middle East by invading Kuwait in 1990, the elder Bush went to war not to create a new configuration in the region but to restore the status quo ante. And it was precisely out of the same overriding concern for stability that, having achieved this objective by driving Saddam out of Kuwait, Bush then allowed him to remain in power.
As for the second President Bush, before 9/11 he was, to all appearances, as deficient in the “vision thing” as his father before him. If he entertained any doubts about the soundness of the “realist” approach, he showed no sign of it. Nothing he said or did gave any indication that he might be dissatisfied with the idea that his main job in foreign affairs was to keep things on an even keel. Nor was there any visible indication that he might be drawn to Ronald Reagan's more “idealistic” ambition to change the world, especially with the “Wilsonian” aim of making it “safe for democracy” by encouraging the spread to as many other countries as possible of the liberties we Americans enjoyed.
Which is why Bush's address of September 20, 2001 came as so great a surprise. Delivered only nine days after the attacks on the World Trade Center and the Pentagon, and officially declaring that the United States was now at war, the September 20 speech put this nation, and all others, on notice that whether or not George W. Bush had been a strictly conventional realist in the mold of his father, he was now politically born again as a passionate democratic idealist of the Reaganite stamp.
It was also this speech that marked the emergence of the Bush Doctrine, and that pointed just as clearly to World War IV as the Truman Doctrine had to World War III. Bush did not explicitly give the name World War IV to the struggle ahead, but he did characterize it as a direct successor to the two world wars that had immediately preceded it. Thus, of the “global terrorist network” that had attacked us on our own soil, he said:
We have seen their kind before. They're the heirs of all the murderous ideologies of the 20th century. By sacrificing human life to serve their radical visions, by abandoning every value except the will to power, they follow in the path of fascism, Nazism, and totalitarianism. And they will follow that path all the way to where it ends in history's unmarked grave of discarded lies.
As this passage, coming toward the beginning of the speech, linked the Bush Doctrine to the Truman Doctrine and to the great struggle led by Franklin D. Roosevelt before it, the wind-up section demonstrated that if the second President Bush had previously lacked “the vision thing,” his eyes were blazing with it now. “Great harm has been done to us,” he intoned toward the end. “We have suffered great loss. And in our grief and anger we have found our mission and our moment.” Then he went on to spell out the substance of that mission and that moment:
The advance of human freedom, the great achievement of our time and the great hope of every time, now depends on us. Our nation, this generation, will lift the dark threat of violence from our people and our future. We will rally the world to this cause by our efforts, by our courage. We will not tire, we will not falter, and we will not fail.
Finally, in his peroration, drawing on some of the same language he had been applying to the nation as a whole, Bush shifted into the first person, pledging his own commitment to the great mission we were all charged with accomplishing:
I will not forget the wound to our country and those who inflicted it. I will not yield, I will not rest, I will not relent in waging this struggle for freedom and security for the American people. The course of this conflict is not known, yet its outcome is certain. Freedom and fear, justice and cruelty, have always been at war, and we know that God is not neutral between them.
Not even Ronald Reagan, the “Great Communicator” himself, had ever been so eloquent in expressing the “idealistic” impetus behind his conception of the American role in the world.5
This was not the last time Bush would sound these themes. Two-and-a-half years later, at a moment when things seemed to be going badly in the war, it was with the same ideas he had originally put forward on September 20, 2001 that he sought to reassure the nation. The occasion would be a commencement address at the Air Force Academy on June 2, 2004, where he would repeatedly place the “war against terrorism” in direct succession to World War II and World War III. He would also be unusually undiplomatic in making no bones about his rejection of realism:
For decades, free nations tolerated oppression in the Middle East for the sake of stability. In practice, this approach brought little stability and much oppression, so I have changed this policy.
And again, even less diplomatically:
Some who call themselves realists question whether the spread of democracy in the Middle East should be any concern of ours. But the realists in this case have lost contact with a fundamental reality: America has always been less secure when freedom is in retreat; America is always more secure when freedom is on the march.
To top it all off, he would go out of his way to assert that his own policy, which he properly justified in the first place as a better way to protect American interests than the alternative favored by the realists, also bore the stamp of the Reaganite version of Wilsonian idealism:
This conflict will take many turns, with setbacks on the course to victory. Through it all, our confidence comes from one unshakable belief: We believe in Ronald Reagan's words that “the future belongs to the free.”
The first pillar of the Bush Doctrine, then, was built on a repudiation of moral relativism and an entirely unapologetic assertion of the need for and the possibility of moral judgment in the realm of world affairs. And just to make sure that the point he had first made on September 20, 2001 had hit home, Bush returned to it even more outspokenly and in greater detail in the State of the Union address of January 29, 2002.
Bush had won enthusiastic plaudits from many for the “moral clarity” of his September 20 speech, but he had also provoked even greater dismay and disgust among “advanced” thinkers and “sophisticated” commentators and diplomats both at home and abroad. Now he intensified and exacerbated their outrage by becoming more specific. Having spoken in September only in general terms about the enemy in World War IV, Bush proceeded in his second major wartime pronouncement to single out three such nations—Iraq, Iran, and North Korea—which he described as forming an “axis of evil.”
Here again he was following in the footsteps of Ronald Reagan, who had denounced the Soviet Union, our principal enemy in World War III, as an “evil empire,” and who had been answered with a veritably hysterical outcry from chancelleries and campuses and editorial pages all over the world. Evil? What place did a word like that have in the lexicon of international affairs, assuming it would ever occur to an enlightened person to exhume it from the grave of obsolete concepts in any connection whatsoever? But in the eyes of the “experts,” Reagan was not an enlightened person. Instead, he was a “cowboy,” a B-movie actor, who had by some freak of democratic perversity landed in the White House. In denouncing the Soviet empire, he was accused either of signaling an intention to trigger a nuclear war or of being too stupid to understand that his wildly provocative rhetoric might do so inadvertently.
The reaction to Bush was perhaps less hysterical and more scornful than the outcry against Reagan, since this time there was no carrying-on about a nuclear war. But the air was just as thick with the old sneers and jeers. Who but an ignoramus and a simpleton—or a fanatical religious fundamentalist, of the very type on whom Bush was declaring war—would resort to archaic moral absolutes like “good” and “evil”? On the one hand, it was egregiously simple-minded to brand a whole nation as evil, and on the other, only a fool could bring himself to believe, as Bush (once more like Reagan) had evidently done in complete and ingenuous sincerity, that the United States, of all countries, represented the good. Surely only a know-nothing illiterate could be oblivious of the innumerable crimes committed by America both at home and abroad—crimes that the country's own leading intellectuals had so richly documented in the by-now standard academic view of its history.
Here is how Gore Vidal, one of those intellectuals, stated the case:
I mean, to watch Bush doing his little war dance in Congress . . . about “evildoers” and this “axis of evil” . . . I thought, he doesn't even know what the word axis means. Somebody just gave it to him. . . . This is about as mindless a statement as you could make. Then he comes up with about a dozen other countries that have “evil” people in them, who might commit “terrorist acts.” What is a terrorist act? Whatever he thinks is a terrorist act. And we are going to go after them. Because we are good and they are evil. And we're “gonna git 'em.”
This was rougher and cruder than the language issuing from editorial pages and think tanks and foreign ministries and even most other intellectuals, but it was no different from what nearly all of them thought and how many of them talked in private.6
As soon became clear, however, Bush was not deterred. In subsequent statements he continued to uphold the first pillar of his new doctrine and to affirm the universality of the moral purposes animating this new war:
Some worry that it is somehow undiplomatic or impolite to speak the language of right and wrong. I disagree. Different circumstances require different methods, but not different moralities. Moral truth is the same in every culture, in every time, and in every place. . . . We are in a conflict between good and evil, and America will call evil by its name.
Then, in a fascinating leap into the great theoretical debate of the post-cold-war era (though without identifying the main participants), Bush came down squarely on the side of Francis Fukuyama's much-misunderstood view of “the end of history,” according to which the demise of Communism had eliminated the only serious competitor to our own political system7:
The 20th century ended with a single surviving model of human progress, based on non-negotiable demands of human dignity, the rule of law, limits on the power of the state, respect for women and private property and free speech and equal justice and religious tolerance.
Having endorsed Fukuyama, Bush now brushed off the political scientist Samuel Huntington, whose rival theory postulated a “clash of civilizations” arising from the supposedly incompatible values prevailing in different parts of the world:
When it comes to the common rights and needs of men and women, there is no clash of civilizations. The requirements of freedom apply fully to Africa and Latin America and the entire Islamic world. The peoples of the Islamic nations want and deserve the same freedoms and opportunities as people in every nation. And their governments should listen to their hopes.
The Second Pillar
If the first of the four pillars on which the Bush Doctrine stood was a new moral attitude, the second was an equally dramatic shift in the conception of terrorism as it had come to be defined in standard academic and intellectual discourse.
Under this new understanding—confirmed over and over again by the fact that most of the terrorists about whom we were learning came from prosperous families—terrorism was no longer considered a product of economic factors. The “swamps” in which this murderous plague bred were swamps not of poverty and hunger but of political oppression. It was only by “draining” them, through a strategy of “regime change,” that we would be making ourselves safe from the threat of terrorism and simultaneously giving the peoples of “the entire Islamic world” the freedoms “they want and deserve.”
In the new understanding, furthermore, terrorists, with rare exceptions, were not individual psychotics acting on their own but agents of organizations that depended on the sponsorship of various governments. Our aim, therefore, could not be merely to capture or kill Osama bin Laden and wipe out the al Qaeda terrorists under his direct leadership. Bush vowed that we would also uproot and destroy the entire network of interconnected terrorist organizations and cells “with global reach” that existed in as many as 50 or 60 countries. No longer would we treat the members of these groups as criminals to be arrested by the police, read their Miranda rights, and brought to trial. From now on, they were to be regarded as the irregular troops of a military alliance at war with the United States, and indeed the civilized world as a whole.
Not that this analysis of terrorism had exactly been a secret. The State Department itself had a list of seven state sponsors of terrorism (all but two of which, Cuba and North Korea, were predominantly Muslim), and it regularly issued reports on terrorist incidents throughout the world. But aside from such things as the lobbing of a cruise missile or two, diplomatic and/or economic sanctions that were inconsistently and even perfunctorily enforced, and a number of covert operations, the law-enforcement approach still prevailed.
September 11 changed much—if not yet all—of that; still in use were atavistic phrases like “bringing the terrorists to justice.” But no one could any longer dream that the American answer to what had been done to us in New York and Washington would begin with an FBI investigation and end with a series of ordinary criminal trials. War had been declared on the United States, and to war we were going to go.
But against whom? Since it was certain that Osama bin Laden had masterminded September 11, and since he and the top leadership of al Qaeda were holed up in Afghanistan, the first target, and thus the first testing ground of this second pillar of the Bush Doctrine, chose itself.
Before resorting to military force, however, Bush issued an ultimatum to the extreme Islamic radicals of the Taliban who were then ruling Afghanistan. The ultimatum demanded that they turn Osama bin Laden and his people over to us and that they shut down all terrorist training camps there. By rejecting this ultimatum, the Taliban not only asked for an invasion but, under the Bush Doctrine, also asked to be overthrown. And so, on October 7, 2001, the United States—joined by Great Britain and about a dozen other countries—launched a military campaign against both al Qaeda and the regime that was providing it with “aid and safe haven.”
As compared with what would come later, there was relatively little opposition either at home or abroad to the opening of this first front of World War IV. The reason was that the Afghan campaign could easily be justified as a retaliatory strike against the terrorists who had attacked us. And while there was a good deal of murmuring about the dangers of pursuing a policy of “regime change,” there was very little sympathy in practice (outside the Muslim world, that is) for the Taliban.
Whatever opposition was mounted to the battle of Afghanistan mainly took the form of skepticism over the chances of winning it. True, such skepticism was in some quarters a mask for outright opposition to American military power in general. But once the Afghan campaign got under way, the main focus shifted to everything that seemed to be going awry on the battlefield.
For example, only a couple of weeks into the campaign, when there were missteps involving the use of the Afghan fighters of the Northern Alliance, observers like R.W. Apple of the New York Times immediately rushed to conjure up the ghost of Vietnam. This restless spirit, having been called forth from the vasty deep, henceforth refused to be exorcised, and would go on to elbow its way into every detail of the debates over all the early battles of World War IV. On this occasion, its message was that we were falling victim to the illusion that we could rely on an incompetent local force to do the fighting on the ground while we supplied advice and air support. This strategy would inevitably fail, and would suck us into the same “quagmire” into which we had been dragged in Vietnam. After all, as Apple and others argued, the Soviet Union had suffered its own “Vietnam” in Afghanistan—and unlike us, it had not been hampered by the logistical problems of projecting power over a great distance. How could we expect to do better?
When, however, the B-52's and the 15,000-pound “Daisy Cutter” bombs were unleashed, they temporarily banished the ghost of Vietnam and undercut the fears of some and the hopes of others that we were heading into a quagmire. Far from being good for nothing but “pounding the rubble,” as the critics had sarcastically charged, the Daisy Cutters exerted, as even a New York Times report was forced to concede, “a terrifying psychological impact as they exploded just above ground, wiping out everything for hundreds of yards.”
But the Daisy Cutters were only the half of it. As we were all to discover, our “smart-bomb” technology had advanced far beyond the stage it had reached when first introduced in 1991. In Afghanistan in 2001, such bombs—guided by “spotters” on the ground equipped with radios, laptops, and lasers, and often riding on horseback, and also aided by unmanned drones, satellites, and other systems in the air—were both incredibly precise in avoiding civilian casualties and absolutely lethal in destroying the enemy. It was this “new kind of American power,” added the New York Times report, that “enabled a ragtag opposition” (i.e., the same Northern Alliance supposedly dragging us into a quagmire) to rout the “battle-hardened troops” of the Taliban regime in less than three months, and with the loss of very few American troops.
In the event, Osama bin Laden was not captured and al Qaeda was not totally destroyed. But it was certainly damaged by the campaign in Afghanistan. As for the Taliban regime, it was overthrown and replaced by a government that would no longer give aid and comfort to terrorists. Moreover, while Afghanistan under the new government may not have been exactly democratic, it was infinitely less oppressive than its totalitarian predecessor. And thanks to the clearing of political ground that had been covered over by the radical Islamic extremism of the Taliban, the seeds of free institutions were being sown and given a fighting chance to sprout and grow.
The campaign in Afghanistan demonstrated in the most unmistakable terms what followed from the new understanding of terrorism that formed the second pillar of the Bush Doctrine: countries that gave safe haven to terrorists and refused to clean them out were asking the United States to do it for them, and the regimes ruling these countries were also asking to be overthrown in favor of new leaders with democratic aspirations. Of course, as circumstances permitted and prudence dictated, other instruments of power, whether economic or diplomatic, would be deployed. But Afghanistan showed that the military option was open, available for use, and lethally effective.
The Third Pillar
The third pillar on which the Bush Doctrine rested was the assertion of our right to preempt. Bush had already pretty clearly indicated on September 20, 2001 that he had no intention of waiting around to be attacked again (“We will pursue nations that provide aid or safe haven to terrorism”). But in the State of the Union speech in January 2002, he became much more explicit on this point too:
We'll be deliberate, yet time is not on our side. I will not wait on events, while dangers gather. I will not stand by, as peril draws closer and closer. The United States of America will not permit the world's most dangerous regimes to threaten us with the world's most destructive weapons.
To those with ears to hear, the January speech should have made it abundantly clear that Bush was now proposing to go beyond the fundamentally retaliatory strike against Afghanistan and to take preemptive action. Yet at first it went largely unnoticed that this right to strike, not in retaliation for but in anticipation of an attack, was a logical extension of the general outline Bush had provided on September 20. Nor did the new position attract much attention even when it was reiterated in the plainest of words on January 29. It was not until the third in the series of major speeches elaborating the Bush Doctrine—the one delivered on June 1, 2002 at West Point to the graduating class of newly commissioned officers of the United States Army—that the message got through at last.
Perhaps the reason the preemption pillar finally became clearly visible at West Point was that, for the first time, Bush placed his new ideas in historical context:
For much of the last century, America's defense relied on the cold-war doctrines of deterrence and containment. In some cases, those strategies still apply. But new threats also require new thinking. Deterrence—the promise of massive retaliation against nations—means nothing against shadowy terrorist networks with no nation or citizens to defend.
This covered al Qaeda and similar groups. But Bush then proceeded to explain, in addition, why the old doctrines could not work with a regime like Saddam Hussein's in Iraq:
Containment is not possible when unbalanced dictators with weapons of mass destruction can deliver those weapons or missiles or secretly provide them to terrorist allies.
Refusing to flinch from the implications of this analysis, Bush repudiated the previously sacred dogmas of arms control and treaties against the proliferation of weapons of mass destruction as a means of dealing with the dangers now facing us from Iraq and other members of the axis of evil:
We cannot defend America and our friends by hoping for the best. We cannot put our faith in the word of tyrants, who solemnly sign nonproliferation treaties, and then systematically break them.
Hence, Bush inexorably continued,
If we wait for threats to fully materialize, we will have waited too long. . . . [T]he war on terror will not be won on the defensive. We must take the battle to the enemy, disrupt his plans, and confront the worst threats before they emerge. In the world we have entered, the only path to safety is the path of action. And this nation will act.
At this early stage, the Bush administration was still denying that it had reached any definite decision about Saddam Hussein; but everyone knew that, in promising to act, Bush was talking about him. The immediate purpose was to topple the Iraqi dictator before he had a chance to supply weapons of mass destruction to the terrorists. But this was by no means the only or—surprising though it would seem in retrospect—even the decisive consideration either for Bush or his supporters (or, for that matter, his opponents).8 And in any case, the long-range strategic rationale went beyond the proximate causes of the invasion. Bush's idea was to extend the enterprise of “draining the swamps” begun in Afghanistan and then to set the entire region on a course toward democratization. For if Afghanistan under the Taliban represented the religious face of Middle Eastern terrorism, Iraq under Saddam Hussein was its most powerful secular partner. It was to deal with this two-headed beast that a two-pronged strategy was designed.
Unlike the plan to go after Afghanistan, however, the idea of invading Iraq and overthrowing Saddam Hussein provoked a firestorm hardly less intense than the one that was still raging over Bush's insistence on using the words “good” and “evil.”
Even before the debate on Iraq in particular, there had been strong objection to the whole idea of preemptive action by the United States. Some maintained that such action would be a violation of international law, while others contended that it would set a dangerous precedent under which, say, Pakistan might attack India or vice versa. But once the discussion shifted from the Bush Doctrine in general to the question of Iraq, the objections became more specific.
Most of these were brought together in early August 2002 (only about two months after Bush's speech at West Point) in a piece entitled “Don't Attack Iraq.” The author was Brent Scowcroft, who had been National Security Adviser to the elder President Bush. Scowcroft asserted, first, that there was
scant evidence to tie Saddam to terrorist organizations, and even less to the September 11 attacks. Indeed, Saddam's goals have little in common with the terrorists who threaten us, and there is little incentive for him to make common cause with them.
That being the case, Scowcroft continued, “An attack on Iraq at this time would seriously jeopardize, if not destroy, the global counterterrorist campaign we have undertaken,” the campaign that must remain “our preeminent security priority.”
But this was not the only “priority” that to Scowcroft was “preeminent”:
Possibly the most dire consequences [of attacking Saddam] would be the effect in the region. The shared view in the region is that Iraq is principally an obsession of the U.S. The obsession of the region, however, is the Israeli-Palestinian conflict.
Showing little regard for the American “obsession,” Scowcroft was very solicitous of the regional one:
If we were seen to be turning our backs on that bitter [Israeli-Palestinian] conflict . . . in order to go after Iraq, there would be an explosion of outrage against us. We would be seen as ignoring a key interest of the Muslim world in order to satisfy what is seen to be a narrow American interest.
This, added Scowcroft, “could well destabilize Arab regimes in the region,” than which, to a quintessential realist like him, nothing could be worse.
In coming out publicly, and in these terms, against the second President Bush's policy, Scowcroft underscored the extent to which the son had diverged from the father's perspective. In addition, by lending greater credence to the already credible rumor that the elder Bush opposed invading Iraq, Scowcroft's article belied what would soon become one of the favorite theories of the hard Left—namely, that the son had gone to war in order to avenge the attempted assassination of his father.
On the other hand, by implicitly assenting to the notion that toppling Saddam was merely “a narrow American interest,” Scowcroft gave a certain measure of aid and comfort to the hard Left and its fellow travelers within the liberal community. For from these circles the cry had been going out that it was the corporations, especially Halliburton (which Vice President Dick Cheney had formerly headed) and the oil companies that were dragging us into an unnecessary war.
So, too, with Scowcroft's emphasis on resolving “the Israeli-Palestinian conflict”—a standard euphemism for putting pressure on Israel, whose “intransigence” was taken to be the major obstacle to peace. By strongly insinuating that the Israeli prime minister Ariel Sharon was a greater threat to us than Saddam Hussein, Scowcroft provided a respectable rationale for the hostility toward Israel that had come shamelessly out of the closet within hours of the attacks of 9/11 and that had been growing more and more overt, more and more virulent, and more and more widespread ever since. To the “paleoconservative” Right, where the charge first surfaced, it was less the oil companies than Israel that was mainly dragging us into invading Iraq. Before long, the Left would add the same accusation to its own indictment, and in due course it would be imprinted more and more openly on large swaths of mainstream opinion.
A cognate count in this indictment held that the invasion of Iraq had been secretly engineered by a cabal of Jewish officials acting not in the interest of their own country but in the service of Israel, and more particularly of Ariel Sharon. At first the framers and early spreaders of this defamatory charge considered it the better part of prudence to identify the conspirators not as Jews but as “neoconservatives.” It was a clever tactic, in that Jews did in fact constitute a large proportion of the repentant liberals and leftists who, having some two or three decades earlier broken ranks with the Left and moved rightward, came to be identified as neoconservatives. Everyone in the know knew this, and for those to whom it was news, the point could easily be gotten across by singling out only those neoconservatives who had Jewish-sounding names and ignoring the many other leading members of the group whose clearly non-Jewish names might confuse the picture.
This tactic had been given a trial run by Patrick J. Buchanan in opposing the first Gulf war of 1991. Buchanan had then already denounced the Johnny-come-lately neoconservatives for having hijacked and corrupted the conservative movement, but now he descended deeper into the fever swamps by insisting that there were “only two groups beating the drums . . . for war in the Middle East—the Israeli Defense Ministry and its amen corner in the United States.” Among those standing in the “amen corner” he subsequently singled out four prominent hawks with Jewish-sounding names, counterposing them to “kids with names like McAllister, Murphy, Gonzales, and Leroy Brown” who would actually do the fighting if these Jews had their way.
Ten years later, in 2001, in the writings of Buchanan and other paleoconservatives within the journalistic fraternity (notably Robert Novak, Arnaud de Borchgrave, and Paul Craig Roberts), one of the four hawks of 1991, Richard Perle, made a return appearance. But Perle was now joined in starring roles by Paul Wolfowitz and Douglas Feith, both occupying high positions in the Pentagon, and a large supporting cast of identifiably Jewish intellectuals and commentators outside the government (among them Charles Krauthammer, William Kristol, and Robert Kagan). Like their predecessors in 1991, the members of the new ensemble were portrayed as agents of their bellicose counterparts in the Israeli government. But there was also a difference: the new group had managed to infiltrate the upper reaches of the American government. Having pulled this off, they had conspired to manipulate their non-Jewish bosses—Vice President Cheney, Secretary of Defense Donald Rumsfeld, National Security Adviser Condoleezza Rice, and George W. Bush himself—into invading Iraq.
Before long, this theory was picked up and circulated by just about everyone in the whole world who was intent on discrediting the Bush Doctrine. And understandably so: for what could suit their purposes better than to “expose” the invasion of Iraq—and by extension the whole of World War IV—as a war started by Jews and being waged solely in the interest of Israel?
To protect themselves against the taint of anti-Semitism, purveyors of this theory sometimes disingenuously continued to pretend that when they said “neoconservative” they did not mean “Jew.” Yet the theory inescapably rested on all-too-familiar anti-Semitic canards—principally that Jews were never reliably loyal to the country in which they lived, and that they were always conspiring behind the scenes, often successfully, to manipulate the world for their own nefarious purposes.9
Quite apart from its pernicious moral and political implications, the theory was ridiculous in its own right. To begin with, it asked one to believe the unbelievable: that strong-minded people like Bush, Rumsfeld, Cheney, and Rice could be fooled by a bunch of cunning subordinates, whether Jewish or not, into doing anything at all against their better judgment, let alone something so momentous as waging a war, let alone a war in which they could detect no clear relation to American interests.
In the second place, there was the evidence uncovered by the purveyors of this theory themselves. That evidence, to which they triumphantly pointed, consisted of published articles and statements in which the alleged conspirators openly and unambiguously advocated the very policies they now stood accused of having secretly foisted upon an unwary Bush administration. Nor had these allegedly secret conspirators ever concealed their belief that toppling Saddam Hussein and adopting a policy aimed at the democratization of the entire Middle East would be good not only for the United States and for the people of the region but also for Israel. (And what, an uncharacteristically puzzled Richard Perle asked a hostile interviewer, was wrong with that?)
Which brings us to the fourth pillar on which the Bush Doctrine was erected.
The Fourth Pillar
Listening to the laments of Scowcroft and many others, one would think that George W. Bush had been ignoring “the Israeli-Palestinian conflict” altogether in his misplaced “obsession” with Iraq. In fact, however, even before 9/11 it had been widely and authoritatively reported that Bush was planning to come out publicly in favor of establishing a Palestinian state as the only path to a peaceful resolution of the conflict; and in October, after a short delay caused by 9/11, he became the first American President actually to do so. Yet at some point in the evolution of his thinking over the months that followed, Bush seems to have realized that there was something bizarre about supporting the establishment of a Palestinian state that would be run by a terrorist like Yasir Arafat and his henchmen. Why should the United States acquiesce, let alone help, in adding yet another state to those harboring and sponsoring terrorism precisely at a time when we were at war to rid the world of just such regimes?
Presumably it was under the prodding of this question that Bush came up with an idea even more novel in its way than the new conception of terrorism he had developed after 9/11. This idea was broached only three weeks after his speech at West Point, on June 24, 2002, when he issued a statement adding conditions to his endorsement of a Palestinian state:
Today, Palestinian authorities are encouraging, not opposing terrorism. This is unacceptable. And the United States will not support the establishment of a Palestinian state until its leaders engage in a sustained fight against the terrorists and dismantle their infrastructure.
But engaging in such a fight, he added, required the election of “new leaders, leaders not compromised by terror,” who would embark on building “entirely new political and economic institutions based on democracy, market economics, and action against terrorism.”
It was with these words that Bush brought his “vision” (as he kept calling it) of a Palestinian state living peacefully alongside Israel into line with his overall perspective on the evil of terrorism. And having traveled that far, he went the distance by repositioning the Palestinian issue into the larger context from which Arab propaganda had ripped it. Since this move passed almost unnoticed, it is worth dwelling on why it was so important.
Even before Israel was born in 1948, the Muslim countries of the Middle East had been fighting against the establishment of a sovereign Jewish state—any Jewish state—on land they believed Allah had reserved for those faithful to his prophet Muhammad. Hence the Arab-Israeli conflict had pitted hundreds of millions of Arabs and other Muslims, in control of more than two dozen countries and vast stretches of territory, against a handful of Jews who then numbered well under three-quarters of a million and who lived on a tiny sliver of land the size of New Jersey. But then came the Six-Day war of 1967. Launched in an effort to wipe Israel off the map, it ended instead with Israel in control of the West Bank (formerly occupied by Jordan) and Gaza (which had been controlled by Egypt). This humiliating defeat, however, was eventually turned into a rhetorical and political victory by Arab propagandists, who redefined the ongoing war of the whole Muslim world against the Jewish state as, instead, a struggle merely between the Palestinians and the Israelis. Thus was Israel's image transformed from a David to a Goliath, a move that succeeded in alienating much of the old sympathy previously enjoyed by the outnumbered and besieged Jewish state.
Bush now reversed this reversal. Not only did he reconstruct a truthful framework by telling the Palestinian people that they had been treated for decades “as pawns in the Middle East conflict.” He also insisted on being open and forthright about the nations that belonged in this larger picture and about what they had been up to:
I've said in the past that nations are either with us or against us in the war on terror. To be counted on the side of peace, nations must act. Every leader actually committed to peace will end incitement to violence in official media and publicly denounce homicide bombings. Every nation actually committed to peace will stop the flow of money, equipment, and recruits to terrorist groups seeking the destruction of Israel, including Hamas, Islamic Jihad, and Hizbullah. Every nation committed to peace must block the shipment of Iranian supplies to these groups and oppose regimes that promote terror, like Iraq. And Syria must choose the right side in the war on terror by closing terrorist camps and expelling terrorist organizations.
Here, then, Bush rebuilt the context in which to understand the Middle East conflict. In the months ahead, pressured by his main European ally, the British prime minister Tony Blair, and by his own Secretary of State, Colin Powell, Bush would sometimes seem to backslide into the old way of thinking. But he would invariably recover. Nor would he ever lose sight of the “vision” by which he was guided on this issue, and through which he had simultaneously made a strong start in fitting not the Palestinian Authority alone but the entire Muslim world, “friends” no less than enemies, into his conception of the war against terrorism.
With the inconsistency thus removed and the resultant shakiness repaired by the addition of this fourth pillar to undergird it, the Bush Doctrine was now firm, coherent, and complete.
Saluting the Flag Again
Both as a theoretical construct and as a guide to policy, the new Bush Doctrine could not have been further from the “Vietnam syndrome”—that loss of self-confidence and concomitant spread of neoisolationist and pacifist sentiment throughout the American body politic, and most prominently in the elite institutions of American culture, which began during the last years of the Vietnam war. I have already pointed to a likeness between the Truman Doctrine's declaration that World War III had started and the Bush Doctrine's equally portentous declaration that 9/11 had plunged us into World War IV. But fully to measure the distance traveled by the Bush Doctrine, I want to look now at yet another presidential doctrine—the one developed by Richard Nixon in the late 1960's precisely in response to the Vietnam syndrome.
Contrary to legend, our military intervention in Vietnam under John F. Kennedy in the early 1960's had been backed by every sector of mainstream opinion, with the elite media and the professoriate leading the cheers. At the beginning, indeed, the only criticism from the mainstream concerned tactical issues. Toward the middle, however, and with Lyndon B. Johnson having succeeded Kennedy in the White House, doubts began to arise concerning the political wisdom of the intervention, and by the time Nixon had replaced Johnson, the moral character of the United States was being indicted and besmirched. Large numbers of Americans, including even many of the people who had led the intervention in the Kennedy years, were now joining the tiny minority on the Left who had denounced them at the time for stupidity and immorality, and were saying that going into Vietnam had progressed from a folly into a crime.
To this new political reality the Nixon Doctrine was a reluctant accommodation. Just as getting into Vietnam under Kennedy and Johnson had undermined support for the old strategy of containment, Nixon—along with his chief adviser in foreign affairs, Henry Kissinger—thought that our way of getting out of Vietnam could conversely help to create the new strategy that had become necessary.
First, American forces would be withdrawn from Vietnam gradually, while the South Vietnamese built up enough power to assume responsibility for the defense of their own country. The American role would then be limited to providing arms and equipment. The same policy, suitably modified according to local circumstances, would be applied to the rest of the world as well. In every major region, the United States would now depend on local surrogates rather than on its own military to deter or contain any Soviet-sponsored aggression, or any other potentially destabilizing occurrence. We would supply arms and other forms of assistance, but henceforth the deterring and the fighting would be left to others.
On every point, the new Bush Doctrine contrasted sharply with the old Nixon Doctrine. Instead of withdrawal and fallback, Bush proposed a highly ambitious forward strategy of intervention. Instead of relying on local surrogates, Bush proposed an active deployment of our own military power. Instead of deterrence and containment, Bush proposed preemption and “taking the fight to the enemy.” And instead of worrying about the stability of the region in question, Bush proposed to destabilize it through “regime change.”
The Nixon Doctrine had obviously harmonized with the Vietnam syndrome. What about the Bush Doctrine? Was the political and military strategy it put forward comparably in tune with the post-9/11 public mood?
Certainly this is how it seemed in the immediate aftermath of the attacks: so much so that a group of younger commentators were quick to proclaim the birth of an entirely new era in American history. What December 7, 1941 had done to the old isolationism, they announced, September 11, 2001 had done to the Vietnam syndrome. It was politically dead, and the cultural fallout of that war—all the damaging changes wrought by the 1960's and the 1970's—would now follow it into the grave.
The most obvious sign of the new era was that once again we were saluting our now ubiquitously displayed flag. This was the very flag that, not so long ago, leftist radicals had thought fit only for burning. Yet now, even on the old flag-burning Left, a few prominent personalities were painfully wrenching their unaccustomed arms into something vaguely resembling a salute.
It was a scene reminiscent of the response of some Communists to the suppression by the new Soviet regime of the sailors' revolt that erupted in Kronstadt in the early 1920's. Far more murderous horrors would pour out of the malignant recesses of Stalinist rule, but as the first in that long series of atrocities leading to disillusionment with the Soviet Union, Kronstadt became the portent of them all. In its way, 9/11 served as an inverse Kronstadt for a number of radical leftists of today. What it did was raise questions about what one of them was now honest enough to describe as their inveterately “negative faith in America the ugly.”
September 11 also brought to mind a poem by W.H. Auden written upon the outbreak of World War II and entitled “September 1, 1939.” Although it contained hostile sentiments about America, remnants of Auden's own Communist period, the opening lines seemed so evocative of September 11, 2001 that they were often quoted in the early days of this new war:
I sit in one of the dives
On Fifty-second Street
Uncertain and afraid
As the clever hopes expire
Of a low dishonest decade.
Auden's low dishonest decade was the 1930's, and its clever hopes centered on the construction of a workers' paradise in the Soviet Union. Our counterpart was the 1960's, and its less clever hopes centered not on construction, however illusory, but on destruction—the destruction of the institutions that made up the American way of life. For America was conceived in that period as the great obstacle to any improvement in the lot of the wretched of the earth, not least those within its own borders.
As a “founding father” of neoconservatism who had broken ranks with the Left precisely because I was repelled by its “negative faith in America the ugly,” I naturally welcomed this new patriotic mood with open arms. In the years since making that break, I had been growing more and more impressed with the virtues of American society. I now saw that America was a country in which more liberty and more prosperity abounded than human beings had ever enjoyed in any other country or any other time. I now recognized that these blessings were also more widely shared than even the most visionary utopians had ever imagined possible. And I now understood that this was an immense achievement, entitling the United States of America to an honored place on the roster of the greatest civilizations the world had ever known.
The new patriotic mood therefore seemed to me a sign of greater intellectual sanity and moral health, and I fervently hoped that it would last. But I could not fully share the confidence of some of my younger political friends that the change was permanent—that, as they exulted, nothing in American politics and American culture would ever be the same again. As a veteran of the political and cultural wars of the 1960's, I knew from my own scars how ephemeral such a mood might well turn out to be, and how vulnerable it was to seemingly insignificant forces.
In this connection, I was haunted by one memory in particular. It was of an evening in the year 1960, when I went to address a meeting of left-wing radicals on a subject that had then barely begun to show the whites of its eyes: the possibility of American military involvement in a faraway place called Vietnam. Accompanying me that evening was the late Marion Magid, a member of my staff at COMMENTARY, of which I had recently become the editor. As we entered the drafty old hall on Union Square in Manhattan, Marion surveyed the 50 or so people in the audience, and whispered to me: “Do you realize that every young person in this room is a tragedy to some family or other?”
The memory of this quip brought back to life some sense of how unpromising the future had then appeared to be for that bedraggled-looking assemblage. No one would have dreamed that these young people, and the generation about to descend from them politically and culturally, would within the blink of a historical eye come to be hailed as “the best informed, the most intelligent, and the most idealistic this country has ever known.” Those words, even more incredibly, would emanate from what the new movement regarded as the very belly of the beast: from, to be specific, Archibald Cox, a professor at the Harvard Law School and later Solicitor General of the United States. Similar encomia would flow unctuously from the mouths of parents, teachers, clergymen, artists, and journalists.
More incredible yet, the ideas and attitudes of the new movement, cleaned up but essentially unchanged, would within a mere ten years turn one of our two major parties upside down and inside out. In 1961, President John F. Kennedy had famously declared that we would “pay any price, bear any burden, . . . to assure the survival and the success of liberty.” By 1972, George McGovern, nominated for President by Kennedy's own party, was campaigning on the slogan, “Come Home, America.” It was a slogan that to an uncanny degree reflected the ethos of the embryonic movement I had addressed in Union Square only about a decade before.
The New “Jackal Bins”
In going over this familiar ground, I am trying to make two points. One is that the nascent radical movement of the late 1950's and early 1960's was up against an adversary, namely, the “Establishment,” that looked unassailable. Even so—and this is my second point—to the bewilderment of almost everyone, not least the radicals themselves, they blew and they blew and they blew the house down.
Here we had a major development that slipped in under the radar of virtually all the pundits and the trend-spotters. How well I remember John Roche, a political scientist then working in the Johnson White House, being quoted by the columnist Jimmy Breslin as having derisively labeled the radicals a bunch of “Upper West Side jackal bins.” As further investigation disclosed, Roche had actually said “Jacobins,” a word so unfamiliar to his interviewer that “jackal bins” was the best Breslin could do in transcribing his notes.
Much ink has been spilled, gallons of it by me, in the struggle to explain how and why a great “Establishment” representing so wide a national consensus could have been toppled so easily and so quickly by so small and marginal a group as these “jackal bins.” In the domain of foreign affairs, of course, the usual answer is Vietnam. In this view, it was by deciding to fight an unpopular war that the Establishment rendered itself vulnerable.
The problem with this explanation, to say it again, is that at least until 1965 Vietnam was a popular war. All the major media—from the New York Times to the Washington Post, from Time to Newsweek, from CBS to ABC—supported our intervention. So did most of the professoriate. And so did the public. Even when all but one or two of the people who had either directly led us into Vietnam, or had applauded our intervention, commenced falling all over themselves to join the antiwar parade, public opinion continued supporting the war.
But it did not matter. Public opinion had ceased to count. Indeed, as the Tet offensive of 1968 revealed, reality itself had ceased to count. As all would later come to agree and some vainly struggled to insist at the time, Tet was a crushing defeat not for us but for the North Vietnamese. But Walter Cronkite had only to declare it a defeat for us from the anchor desk of the CBS Evening News, and a defeat it became.
Admittedly, in electoral politics, where numbers are decisive, public opinion remained potent. Consequently, none of the doves contending for the presidency in 1968 or 1972 could beat Richard Nixon. Yet even Nixon felt it necessary to campaign on the claim that he had a “plan” not for winning but for getting us out of Vietnam.
All of which is to say that, on Vietnam, elite opinion trumped popular opinion. Nor were the effects restricted to foreign policy. They extended into the newly antagonistic attitude toward everything America was and represented.
It hardly needs stressing that this attitude found a home in the world of the arts, the universities, and the major media of news and entertainment, where intellectuals shaped by the 1960's, and their acolytes in the publishing houses of New York and in the studios of Hollywood, held sway. But it would be a serious mistake to suppose that the trickle-down effect of the professoriate's attitude was confined to literature, journalism, and show business.
John Maynard Keynes once said that “Practical men who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist.” Keynes was referring specifically to businessmen. But practical functionaries like bureaucrats and administrators are subject to the same rule, though they tend to be the slaves not of economists but of historians and sociologists and philosophers and novelists who are very much alive even when their ideas have, or should have, become defunct. Nor is it necessary for the “practical men” to have studied the works in question, or even ever to have heard of their authors. All they need do is read the New York Times, or switch on their television sets, or go to the movies—and, drip by drip, a more easily assimilable form of the original material is absorbed into their heads and their nervous systems.
These, in sum, were some of the factors that made me wonder whether the terrorist attacks of September 11, 2001 would turn out to mark a genuine turning point comparable to the bombing of Pearl Harbor on December 7, 1941. I was well aware that, before Pearl Harbor, several groups ranging across the political spectrum had fought against our joining the British, who had been at war with Nazi Germany since 1939. There were the isolationists, both liberal and conservative, who detected no American interest in this distant conflict; there were the right-wing radicals who thought that if we were going to go to war, it ought to be on the side of Nazi Germany against Communist Russia, not the other way around; and there were the left-wing radicals who saw the war as a struggle between two equally malign imperialistic systems in which they had no stake. Under the influence of these groups, a large majority of Americans had opposed our entry into the war right up to the moment of the Japanese attack on Pearl Harbor. But from that moment on, the opposition faded away. The antiwar groups either lost most of their members or lapsed into a morose silence, and public opinion did a 180-degree turn.
At first, September 11 did seem to resemble Pearl Harbor in its galvanizing effect, while by all indications the first battle of World War IV—the battle of Afghanistan—was supported by a perhaps even larger percentage of the public than Vietnam had been at the beginning. Nevertheless, even though the opposition in 2001 was still numerically insignificant, it was much stronger than it had been in the early days of Vietnam. The reason was that it now maintained a tight grip over the institutions that, in the later stages of that war, had been surrendered bit by bit to the anti-American Left.
There was, for openers, the literary community, which could stand in for the world of the arts in general. No sooner had the Twin Towers been toppled and the Pentagon smashed than a fierce competition began for the gold in the anti-American Olympics. Susan Sontag, one of my old ex-friends on the Left, seized an early lead in this contest with a piece in which she asserted that 9/11 was an attack “undertaken as a consequence of specific American alliances and actions.” Not content with suggesting that we had brought this aggression on ourselves, she went on to compare the backing in Congress for our “robotic President” to “the unanimously applauded, self-congratulatory bromides of a Soviet Party Congress.”
Another of my old ex-friends, Norman Mailer, surprisingly slow out of the starting gate, soon came up strong on the inside by comparing the Twin Towers to “two huge buck teeth,” and pronouncing the ruins at Ground Zero “more beautiful than the buildings were.” Still playing the enfant terrible even as he was closing in on his eightieth year, Mailer denounced us as “cultural oppressors and aesthetic oppressors” of the Third World. In what did this oppression consist? It consisted, he expatiated, in our establishing “enclaves of our food out there, like McDonald's” and in putting “our high-rise buildings” around the airports of even “the meanest, scummiest, capital[s] in the world.” For these horrendous crimes we had, on 9/11, received a measure—and only a small measure at that—of our just deserts.
Then there were the universities. A report issued shortly after 9/11 by the American Council of Trustees and Alumni (ACTA) cited about a hundred malodorous statements wafting out of campuses all over the country that resembled Sontag and Mailer in blaming the attacks not on the terrorists but on America. Among these were three especially choice specimens. From a professor at the University of New Mexico: “Anyone who can blow up the Pentagon gets my vote.” From a professor at Rutgers: “[We] should be aware that the ultimate cause [of 9/11] is the fascism of U.S. foreign policy over the past many decades.” And from a professor at the University of Massachusetts: “[The American flag] is a symbol of terrorism and death and fear and destruction and oppression.”
When the ACTA report was issued, protesting wails of “McCarthyism” were heard throughout the land, especially from the professors cited. Like them, Susan Sontag claimed that her freedom of speech was being placed in jeopardy. In this peculiar reading of the First Amendment, much favored by leftists in general, they were free to say anything they liked, but the right to free speech ended where criticism of what they had said began.
Actually, however, with rare exceptions, attempts to stifle dissent on the campus were largely directed at the many students and the few faculty members who supported the 9/11 war. All these attempts could be encapsulated into a single phenomenon: on a number of campuses, students or professors who displayed American flags or patriotic posters were forced to take them down. As for Susan Sontag's freedom of speech, hardly had the ink dried on her post-9/11 piece before she became the subject of countless fawning reports and interviews in periodicals and on television programs around the world.
Speaking of television, it was soon drowning us with material presenting Islam in glowing terms. Mainly, these programs took their cue from the President and other political leaders. Out of the best of motives, and for prudential reasons as well, elected officials were striving mightily to deny that the war against terrorism was a war against Islam. Hence they never ceased heaping praises on the beauties of that religion, about which few of them knew anything.
But it was from the universities, not from the politicians, that the substantive content of these broadcasts derived, in interviews with academics, many of them Muslims themselves, whose accounts of Islam were selectively roseate. Sometimes they were even downright untruthful, especially in sanitizing the doctrine of jihad or holy war, or in misrepresenting the extent to which leading Muslim clerics all over the world had been celebrating suicide bombers—not excluding those who had crashed into the World Trade Center and the Pentagon—as heroes and martyrs.
I do not bring this up in order to enter into a theological dispute. My purpose, rather, is to offer another case study in the continued workings of the trickle-down effect I have already described. Thus, hard on the heels of 9/11, the universities began adding innumerable courses on Islam to their curricula. On the campus, “understanding Islam” inevitably translated into apologetics for it, and most of the media dutifully followed suit. The media also adopted the stance of neutrality between the terrorists and ourselves that prevailed among the relatively moderate professoriate, as when the major television networks ordered their anchors to avoid exhibiting partisanship.
Here the great exception was the Fox News Channel. The New York Times, in an article deploring the fact that Fox was covering the war from a frankly pro-American perspective, expressed relief that no other network had so cavalierly discarded the sacred conventions dictating that journalists, in the words of the president of ABC News, must “maintain their neutrality in times of war.”
Although the vast majority of those who blamed America for having been attacked were on the Left, a few voices on the Right joined this perverted chorus. Speaking on Pat Robertson's TV program, the Reverend Jerry Falwell delivered himself of the view that God was punishing the United States for the moral decay exemplified by a variety of liberal groups among us. Both Falwell and Robertson later apologized for singling out these groups, but each continued to insist that God was withdrawing His protection from America because all of us had become great sinners. And in the amen corner that quickly formed on the secular Right, commentators like Robert Novak and Pat Buchanan added that we had called the attack down on our heads not so much by our willful disobedience to divine law as by our manipulated obedience to Israel.
Oddly enough, however, within the Arab world itself, there was much less emphasis on Israel as the root cause of the attacks than was placed on it by most, if not all, of Buchanan's fellow paleoconservatives on the Right. Even to Osama bin Laden himself, support of Israel ranked only third on a list of our “crimes” against Islam.
Not, to be sure, that Arabs everywhere—together with most non-Arab Middle Eastern Muslims like the Iranians—had given up their dream of wiping Israel off the map. To anyone who thought otherwise, Fouad Ajami of Johns Hopkins, an American who grew up as a Muslim in Lebanon, had this to say about the Arab world's “great refusal” to accept Israel under any conditions whatsoever:
The great refusal persists in that “Arab street” of ordinary men and women, among the intellectuals and the writers, and in the professional syndicates. . . . The force of this refusal can be seen in the press of the governments and of the oppositionists, among the secularists and the Islamists alike, in countries that have concluded diplomatic agreements with Israel and those that haven't.
Ajami emphasized that the great refusal remained “fiercest in Egypt,” notwithstanding the peace treaty it had signed with Israel in 1979. It might have been expected, then, that the Egyptians would be eager to blame the widespread animus against the U.S. in their own country on American policy toward Israel, especially since Egypt, being second only to the Jewish state as a recipient of American aid, had a powerful incentive to explain away so ungrateful a response to the benevolent treatment it was receiving at our hands. But no. Only about two weeks before 9/11, 'Abd Al-Mun'im Murad, a columnist in Al-Akhbar, a daily newspaper sponsored by the Egyptian government, wrote:
The conflict that we call the Arab-Israeli conflict is, in truth, an Arab conflict with Western, and particularly American, colonialism. The U.S. treats [the Arabs] as it treated the slaves inside the American continent. To this end, [the U.S.] is helped by the smaller enemy, and I mean Israel.
In another piece, the same writer expanded on this unusually candid acknowledgment:
The issue no longer concerns the Israeli-Arab conflict. The real issue is the Arab-American conflict—Arabs must understand that the U.S. is not “the American friend”—and its task, past, present, and future, is [to impose] hegemony on the world, primarily on the Middle East and the Arab world.
Then, in a third piece, also published in late August, Murad gave us an inkling of the reciprocal “task” he had in mind to be performed on America:
The Statue of Liberty, in New York Harbor, must be destroyed because of . . . the idiotic American policy that goes from disgrace to disgrace in the swamp of bias and blind fanaticism. . . . The age of the American collapse has begun.
If this was the kind of thing we were getting from an Arab country that everyone regarded as “moderate,” in radical states like Iraq and Iran nothing less would suffice than identifying America as the “Great Satan.” As for the Palestinians, their contempt for America was hardly exceeded by their loathing of Israel. For example, the mufti—or chief cleric—appointed by the Palestinian Authority under Yasir Arafat had prayed that God would “destroy America,” while the editor of a leading Palestinian journal proclaimed:
History does not remember the United States, but it remembers Iraq, the cradle of civilization. . . . History remembers every piece of Arab land, because it is the bosom of human civilization. On the other hand, the [American] murderers of humanity, the creators of the barbaric culture and the bloodsuckers of nations, are doomed to death and destined to shrink to a microscopic size, like Micronesia.
The absence of even a word here about Israel showed that if the Jewish state had never come into existence, the United States would still have stood as an embodiment of everything that most of these Arabs considered evil. Indeed, the hatred of Israel was in large part a surrogate for anti-Americanism, rather than the reverse. Israel was seen as the spearhead of the American drive for domination over the Middle East. As such, the Jewish state was a translation of America into, as it were, Hebrew—the “little enemy,” the “little Satan.” To rid the region of it would thus be tantamount to cleansing an area belonging to Islam (dar al-Islam) of the blasphemous political, social, and cultural influences emanating from a barbaric and murderous force. But the force, so to speak, was with America, of which Israel was merely an instrument.
Although Buchanan and Novak were earlier and more outspoken in blaming 9/11 on American friendliness toward Israel, this idea was not confined to the Right or to the marginal precincts of paleoconservatism. On the contrary: while it popped up on the Right, it thoroughly pervaded the radical Left and much of the soft Left, and was even espoused by a number of liberal centrists like Mickey Kaus. For the moment, indeed, the blame-Israel-firsters were concentrated most heavily on the Left.
It was also on the Left, and above all in the universities, that their fraternal twins, the blame-America-firsters, were located. Yet Eric Foner, a professor of history at my own alma mater, Columbia, risibly claimed that the ACTA report was misleading since the polls proved that there was “firm support” for the war among college students. “If our aim is to indoctrinate students with unpatriotic beliefs,” Foner smirked, “we're obviously doing a very poor job of it.”
True enough. But what Foner, as a historian, must have known but neglected to mention was that even at the height of the radical fevers on the campus in the 1960's, only a minority of students sided with the antiwar radicals. Still, even though they were in the majority, the non-radical students were unable to make themselves heard above the antiwar din, and whenever they tried, they were shouted down. This is how it was, too, on the campus after 9/11. There were, here and there, brave defiers of the academic orthodoxies. But mostly, the silent majority remained silent, for fear of incurring the disapproval of their teachers, or even of being punished for the crime of “insensitivity.”
Such, then, was the assault that began to be mounted within hours of 9/11 by the guerrillas-with-tenure in the universities, along with their spiritual and political disciples scattered throughout other quarters of our culture. Could this “tiny handful of aging Rip van Winkles,” as they were breezily brushed off by one commentator, grow into a force as powerful as the “jackal bins” of yesteryear? Was the upsurge of confidence in America, and American virtue, that spontaneously materialized on 9/11 strong enough to withstand them this time around?
Some who shared my apprehensions believed that if things went well on the military front, all would be well on the home front, too. And that is how it appeared from the effect wrought by the spectacular success of the Afghanistan campaign, which disposed of the “quagmire” theory and also dampened antiwar activity on at least a number of campuses. Nevertheless, the mopping-up operation in Afghanistan created an opportunity for more subtle forms of opposition to gain traction. There were complaints that the terrorists captured in Afghanistan and then sent to a special facility in Guantanamo were not being treated as regular prisoners of war. And there were also allegations of the threat to civil liberties posed in America itself by measures like the Patriot Act, which had been designed to ward off any further terrorist attacks at home. Although these concerns were mostly based on misreadings of the Geneva Convention and of the Patriot Act itself, some people no doubt raised them in good faith. But there is also no doubt that such issues could—and did—serve as a respectable cover for wholesale opposition to the entire war.
Another respectable cover was the charge that Bush was following a policy of “unilateralism.” The alarm over this supposedly unheard-of outrage was first sounded by the chancelleries and chattering classes of Western Europe when Bush stated that, in taking the fight to the terrorists and their sponsors, we would prefer to do so with allies and with the blessing of the UN, but if necessary we would go it alone and without an imprimatur from the Security Council.
This was too much for the Europeans. Having duly offered us their condolences over 9/11, they could barely let a decent interval pass before going back into the ancient family business of showing how vastly superior in wisdom and finesse they were to the Americans, whose primitive character was once again on display in the “simplistic” ideas and crude moralizing of George W. Bush. Now they urged that our military operations end with Afghanistan, and that we leave the rest to diplomacy in deferential consultation with the great masters of that recondite art in Paris and Brussels.
Taking their cue from these masters, the New York Times, along with many other publications ranging from the Center to the hard Left—and soon to be seconded by all the Democratic candidates in the presidential primaries, except for Senator Joseph Lieberman—began hitting Bush for recklessness and overreaching. What we saw developing here was a broader coalition than the antiwar movement spawned by Vietnam had managed to put together, especially in its first few years. The antiwar movement then had been made up almost entirely of leftists and liberals, whereas this new movement was bringing together the whole of the hard Left, elements of the soft Left, and sectors of the American Right.
Treading the path previously marked out by his colleague Mickey Kaus on the issue of Israel, Michael Kinsley of the soft Left allied himself with Pat Buchanan in bringing forth yet another respectable cover. This was to indict the President for evading the Constitution by proposing to fight undeclared wars. Meanwhile, the same charge was moving into the political mainstream through Democratic Senators like Robert Byrd, Edward M. Kennedy, and Tom Daschle, though they also continued carrying on about quagmires and slippery slopes and “unilateralism.”
I for one was certain that, as the military facet of World War IV widened—with Iraq clearly being the next most likely front—opposition would not only grow but would acquire enough assurance to dispense with any respectable covers. Which was to say that it would be taken over by extremists and radicalized. About this I turned out to be correct, while those who scoffed at the “jackal bins” and the “aging Rip van Winkles” as a politically insignificant bunch turned out to be wrong. But I never imagined that the new antiwar movement would so rapidly arrive at the stage of virulence it had taken years for its ancestors of the Vietnam era to reach.
Varieties of Anti-Americanism
A possible explanation of the great velocity achieved by the new antiwar movement was that, like the respectable critique immediately preceding it, the radical opposition was following the lead of European opinion. In this instance, encouragement and reinforcement came from the almost incredible degree of hostility to America that erupted in the wake of 9/11 all over the European continent, and most blatantly in France and Germany, and that gathered even more steam in the run-up to the battle of Iraq. If demonstrations and public-opinion polls could be believed, huge numbers of Europeans loathed the United States so deeply that they were unwilling to side with it even against one of the most tyrannical and murderous despots on earth.
That this was the feeling in the Muslim world did not come as a surprise. Unlike in Europe, where the attacks of 9/11 did elicit a passing moment of sympathy for the United States (“We Are All Americans Now,” proclaimed a headline the next day in the leading leftist daily in Paris), in the realm of Islam the news of 9/11 brought dancing in the streets and screams of jubilation. Almost to a man, Muslim clerics in their sermons assured the faithful that in striking a blow against the “Great Satan,” Osama bin Laden had acted as a jihadist, or holy warrior, in strict accordance with the will of God.
This could have been predicted from a debate on the topic “Bin Laden—The Arab Despair and American Fear” that was televised on the Arabic-language network Al-Jazeera about two months before 9/11. Using “American Fear” in the title was a bit premature, since this was a time when very few Americans were frightened by Islamic terrorism, for the simple reason that scarcely any had ever heard of bin Laden or al Qaeda. Be that as it may, at the conclusion of the program, the host said to the lone guest who had been denouncing bin Laden as a terrorist: “I am looking at the viewers' reactions for one that would support your positions—but . . . I can't find any.” He then cited “an opinion poll in a Kuwaiti paper which showed that 69 percent of Kuwaitis, Egyptians, Syrians, Lebanese, and Palestinians think bin Laden is an Arab hero and an Islamic jihad warrior.” And on the basis of the station's own poll, he also estimated that among all Arabs “from the Gulf to the Ocean,” the proportion sharing this view of bin Laden was “maybe even 99 percent.”
Surely, then, the chairman of the Syrian Arab Writers Association was speaking for hordes of his “brothers” in declaring shortly after 9/11 that
When the twin towers collapsed . . . I felt deep within me like someone delivered from the grave; I [felt] that I was being carried in the air above the corpse of the mythological symbol of arrogant American imperialist power. . . . My lungs filled with air, and I breathed in relief, as I had never breathed before.
If this was how the Arab/Muslim world largely felt about 9/11, what could have been expected from that world when the United States picked itself up off the ground—Ground Zero, to be exact—and began fighting back? What could have been expected is precisely what happened: another furious outburst of anti-Americanism. Only this time the outburst was infused not with jubilation but with the desperate hope that the United States would somehow be humiliated. This hope was soon extinguished by the quick defeat of the Taliban regime in Afghanistan, but it was immediately rekindled by the way Saddam Hussein was standing up against America. Saddam had killed hundreds of thousands of Muslims in Iran, and countless Arabs in his own country and Kuwait. Obviously, however, to his Arab and Muslim “brothers” this was completely canceled out by his defiance of the United States.
Was there, perhaps, an element of the same twisted sentiment in the willingness of millions upon millions of Europeans to lend de-facto aid and comfort to this monster? Of course, the claim was that most such people were neither pro-Saddam nor anti-American: all they wanted was to “give peace a chance.” But this claim was belied by the slogans, the body language, the speeches, and the manifestos of the “peace” party. Though hatred of America may not have been universal among opponents of American military action, it was obviously very widespread and very deep. And though other considerations (pacifist sentiment, concern about civilian casualties, contempt for George Bush, faith in the UN, etc.) were at work, these factors had no trouble coexisting harmoniously with extreme hostility to the United States.
Thus, within two months of 9/11, a survey of influential people in 23 countries was undertaken by the Pew Research Center, the Princeton Survey Research Associates, and the International Herald Tribune. Here is how a British newspaper summarized the findings:
Did America somehow ask for the terrorist outrages in New York and Washington? . . . [M]ost people of influence in the rest of the world . . . believe that, to a certain extent, the U.S. was asking for it. . . . From its closest allies, in Europe, to the Middle East, Russia, and Asia, a uniform 70 percent said people considered it good that after September 11 Americans had realized what it was to be vulnerable.
It would therefore seem that the Italian playwright Dario Fo, winner of the Nobel Prize for Literature in 1997, was more representative of European opinion than he may at first have appeared when spewing out the following sentiment:
The great speculators wallow in an economy that every year kills tens of millions of people with poverty—so what is 20,000 [sic] dead in New York? Regardless of who carried out the massacre, this violence is the legitimate daughter of the culture of violence, hunger, and inhumane exploitation.
In France, a leading philosopher and social theorist, Jean Baudrillard, produced a somewhat different type of apologia for the terrorists of 9/11 and their ilk. This was so laden with postmodern jargon and so convoluted that it bordered on parody (“The collapse of the towers of the World Trade Center is unimaginable, but this does not suffice to make it a real event”). But Baudrillard's piece did at least contain a revealing confession:
That we have dreamed of this event, that everyone without exception has dreamed of it, . . . is unacceptable for the Western moral conscience, but it is still a fact. . . . Ultimately, they [al Qaeda] did it, but we willed it.
Much the same idea, in even more straightforward terms, was espoused across the Channel by Mary Beard, a teacher of classics at my other alma mater, Cambridge University, who wrote: “[H]owever tactfully you dress it up, the United States had it coming. . . . World bullies . . . will in the end pay the price.” With this the highly regarded novelist Martin Amis agreed. But Beard's old-fashioned English plainness evidently being a little too plain for him, Amis resorted to a bit of fancy continental footwork in formulating his own endorsement of the idea that America had been asking for it:
Terrorism is political communication by other means. The message of September 11 ran as follows: America, it is time you learned how implacably you are hated. . . . Various national characteristics—self-reliance, a fiercer patriotism than any in Western Europe, an assiduous geographical incuriosity—have created a deficit of empathy for the sufferings of people far away.
What on earth was going on here? After 9/11, most Americans had gradually come to recognize that we were hated by the terrorists who had attacked us and their Muslim cheerleaders not for our failings and sins but precisely for our virtues as a free and prosperous country. But why should we be hated by hordes of people living in other free and prosperous countries? In their case, presumably, it must be for our sins. And yet most of us knew for certain that, whatever sins we might have committed, they were not the ones of which the Europeans kept accusing us.
To wit: far from being a nation of overbearing bullies, we were humbly begging for the support of tiny countries we could easily have pushed around. Far from being “unilateralists,” we were busy soliciting the gratuitous permission and the dubious blessing of the Security Council before taking military action against Saddam Hussein. Far from “rushing into war,” we were spending months dancing a diplomatic gavotte in the vain hope of enlisting the help of France, Germany, and Russia. And so on, and so on, down to the last detail in the catalogue.
What, then, was going on? An answer to this puzzling question that would eventually gain perhaps the widest circulation came from Robert Kagan of the Carnegie Endowment. In a catchy formulation that soon became famous, Kagan proposed that Americans were from Mars and Europeans were from Venus. Expanding on this formulation, he wrote:
On the all-important question of power—the efficacy of power, the morality of power, the desirability of power—American and European perspectives are diverging. Europe is turning away from power, or to put it a little differently, it is moving beyond power into a self-contained world of laws and rules and transnational negotiation and cooperation. It is entering a post-historical paradise of peace and relative prosperity, the realization of Kant's “Perpetual Peace.” The United States, meanwhile, remains mired in history, exercising power in the anarchic Hobbesian world where international laws and rules are unreliable and where true security and the defense and promotion of a liberal order still depend on the possession and use of military might.
In developing his theory, Kagan got many things right and cast a salubrious light into many dark corners. But it also seemed to me that he was putting the shoes of his theory on the wrong feet. Although I fully accepted Kagan's description of the divergent attitudes toward military power, I did not agree that the Europeans were already living in the future while the United States remained “mired” in the past. In my judgment, the opposite was closer to the truth.
The “post-historical paradise” into which the Europeans were supposedly moving struck me as nothing more than the web of international institutions that had been created at the end of World War II under the leadership of the United States in the hope that they would foster peace and prosperity. These included the United Nations, the World Bank, the World Court, and others. Then after 1947, and again under the leadership of the United States, adaptations were made to the already existing institutions and new ones like NATO were added to fit the needs of World War III. With the victorious conclusion of World War III in 1989-90, the old international order became obsolete, and new arrangements tailored to a new era would have to be forged. But more than a decade elapsed before 9/11 finally made the contours of the “post-cold-war era” clear enough for these new arrangements to begin being developed.
Looked at from this angle, the Bush Doctrine revealed itself as an extremely bold effort to break out of the institutional framework and the strategy constructed to fight the last war. But it was more: it also drew up a blueprint for a new structure and a new strategy to fight a different breed of enemy in a war that was just starting and that showed signs of stretching out into the future as far as the eye could see. Facing the realities of what now confronted us, Bush had come to the conclusion that few if any of the old instrumentalities were capable of defeating this new breed of enemy, and that the strategies of the past were equally helpless before this enemy's way of waging war. To move into the future meant to substitute preemption for deterrence, and to rely on American military might rather than the “soft power” represented by the UN and the other relics of World War III. Indeed, not even the hard power of NATO—which had specifically been restricted by design to the European continent and whose deployment in other places could, and would, be obstructed by the French—was of much use in the world of the future.
Examined from this same angle, the European justifications for resisting the Bush Doctrine—the complaints about “unilateralism,” trigger-happiness, and the rest—were unveiled as mere rationalizations. Here I went along with Kagan in tracing these rationalizations to a decline in the power of the Europeans. He put it very well:
World War II all but destroyed European nations as global powers. . . . For a half-century after World War II, however, this weakness was masked by the unique geopolitical circumstances of the cold war. Dwarfed by the two superpowers on its flanks, a weakened Europe nevertheless served as the central strategic theater of the worldwide struggle between Communism and democratic capitalism. . . . Although shorn of most traditional measures of great-power status, Europe remained the geopolitical pivot, and this, along with lingering habits of world leadership, allowed Europeans to retain international influence well beyond what their sheer military capabilities might have afforded. Europe lost this strategic centrality after the cold war ended, but it took a few more years for the lingering mirage of European global power to fade.
So far, so good. Where I parted company with Kagan's analysis was over his acquiescence in the claim that the Europeans had in fact made the leap into the post-national, or postmodern, “Kantian paradise” of the future. To me it seemed clear that it was they, and not we Americans, who were “mired” in the past. They were fighting tooth and nail against the American effort to move into the future precisely because holding onto the ideas, the strategic habits, and the international institutions of the cold war would allow them to go on exerting “international influence well beyond what their sheer military capabilities might have afforded.” It was George W. Bush—that “simplistic” moralizer and trigger-happy “cowboy,” that flouter of international law and reckless unilateralist—who had possessed the wit to see the future and had summoned up the courage to cross over into it.
But Bush was also a politician, and as such he felt it necessary to make some accommodation to the pressures coming at him both at home and from abroad. What this required was an occasional return visit to the past. On such visits, as when he would seek endorsements from the UN Security Council, he showed a polite measure of deference to those, again both at home and abroad, who insisted on reading the Bush Doctrine not as a blueprint for the future but as a reckless repudiation of the approach favored by the allegedly more sophisticated Europeans and their American counterparts. In Kagan's apt description of how the Europeans saw themselves:
Europeans insist they approach problems with greater nuance and sophistication. They try to influence others through subtlety and indirection. . . . They generally favor peaceful responses to problems, preferring negotiation, diplomacy, and persuasion to coercion. They are quicker to appeal to international law, international conventions, and international opinion to adjudicate disputes. They try to use commercial and economic ties to bind nations together. They often emphasize process over result, believing that ultimately process can become substance.
None of this was new: the Europeans had made almost exactly the same claim of superior sophistication during the Reagan years. At that time—in 1983—it had elicited a definitive comment from Owen Harries (the former head of policy planning in the Australian Department of Foreign Affairs and himself a member of the realist school):
When one is exposed to this claim of superior realism and sophistication, one's first inclination is to ask where exactly is the evidence for it. If one considers some of the salient episodes in the history of Europe in this century—the events leading up to 1914, the Versailles peace conference, Munich, the extent of the effort Europe has been prepared to make to secure its own defense since 1948, and the current attitude toward the defense of its vital interests in the Persian Gulf—one is not irresistibly led to concede European superiority.
Two decades later, Harries as a realist would have his own grave reservations about the Bush Doctrine. But I had no hesitation in adding the “sophisticated” European opposition to it as the latest episode in the long string of disastrously mistaken judgments he had enumerated back in 1983.
The astonishing success of the campaigns in Afghanistan and Iraq made a hash of the skepticism of the many pundits who had been so sure that we had too few troops or were following the wrong battle plan. Instead of getting bogged down, as they had predicted, our forces raced through these two campaigns in record time; and instead of tens of thousands of body bags being flown home, the casualties were numbered in the hundreds. As the military historian Victor Davis Hanson summarized what had transpired in Iraq:
In a span of about three weeks, the United States military overran a country the size of California. It utterly obliterated Saddam Hussein's military hardware . . . and tore apart his armies. Of the approximately 110 American deaths in the course of the hostilities, fully a fourth occurred as a result of accidents, friendly fire, or peacekeeping mishaps rather than at the hands of enemy soldiers. The extraordinarily low ratio of total American casualties per number of U.S. soldiers deployed . . . is almost unmatched in modern military history.
True, the aftermath of major military operations, especially in Iraq, turned out to be rougher than the Pentagon seems to have expected. Thanks to the guerrilla insurgency mounted by a coalition of intransigent Saddam loyalists, radical Shiite militias, and terrorists imported from Iran and Syria, American soldiers continued to be killed. Nevertheless, by any historical standard—the more than 6,500 who died on D-Day alone in World War II, to cite only one example—our total losses remained amazingly low.
But it was not military matters that aroused the equally sour skepticism of the realists. Their doubts centered, rather, on the issue of whether the Bush Doctrine was politically viable. Most of all, they questioned the idea that democratization represented the best and perhaps even the only way to defeat militant Islam and the terrorism it was using as its main weapon against us. Bush had placed his bet on a belief in the universality of the desire for freedom and the prosperity that freedom brought with it. But what if he was wrong? What if the Middle East was incapable of democratization? What if the peoples of that region did not wish to be as free and as prosperous as we were? And what if Islam as a religion was by its very nature incompatible with democracy?
These were hard questions about which reasonable men could and did differ. But those of us who backed Bush's bet had our own set of doubts about the doubts of the realists. They seemed to forget that the Middle East of today had not been created by Allah in the 7th century, and that the miserable despotisms there had not evolved through some inexorable historical process powered entirely by internal cultural forces. Instead, the states in question had all been conjured into existence less than a hundred years ago out of the ruins of the defeated Ottoman empire in World War I. Their boundaries were drawn by the victorious British and French with the stroke of an often arbitrary pen, and their hapless peoples were handed over in due course to one tyrant after another.
Mindful of this history, we backers of the Bush Doctrine wondered why it should have been taken as axiomatic that these states would and/or should last forever in their present forms, and why the political configuration of the Middle East should be eternally immune from the democratizing forces that had been sweeping the rest of the world.
And we wondered, too, whether it could really be true that Muslims were so different from most of their fellow human beings that they liked being pushed around and repressed and beaten and killed by thugs—even if the thugs wore clerical garb or went around quoting from the Quran. We wondered whether Muslims really preferred being poor and hungry and ill-housed to enjoying the comforts and conveniences that we in the West took so totally for granted that we no longer remembered to be grateful for them. And we wondered why, if all this were the case, there had been so great an outburst of relief and happiness among the people of Kabul after we drove out their Taliban oppressors.
Yes, came the response, but what about the people of Iraq? Most supporters of the invasion—myself included—had predicted that we would be greeted there with flowers and cheers; yet our troops encountered car bombs and hatred. Nevertheless, and contrary to the impression created by the media, survey after survey demonstrated that the vast majority of Iraqis did welcome us, and were happy to be liberated from the murderous tyranny of Saddam Hussein under which they had lived for so long. The hatred and the car bombs came from the same breed of jihadists who had attacked us on 9/11, and who, unlike the skeptics in our own country, were afraid that we were actually succeeding in democratizing Iraq. Indeed, this was the very warning sent by the terrorist leader Abu Musab al Zarqawi to the remnants of al Qaeda still hunkered down in the caves of Afghanistan: “Democracy is coming, and there will be no excuse thereafter [for terrorism in Iraq].”
Speaking for many of his fellow realists, Fareed Zakaria of Newsweek disagreed with al Zarqawi that democracy was coming to Iraq and contended that it was premature to try establishing it there or anywhere else in the Middle East:
We do not seek democracy in the Middle East—at least not yet. We seek first what might be called the preconditions for democracy . . . the rule of law, individual rights, private property, independent courts, the separation of church and state. . . . We should not assume that what took hundreds of years in the West can happen overnight in the Middle East.
Now, those of us who believed in the Bush Doctrine saw nothing wrong with pursuing Zakaria's agenda. But we rejected the charge—often made not only by realists like Zakaria but also by paleoconservatives like Buchanan—that our position was too “ideological” or naively “idealistic” or even “utopian.” We agreed entirely with what the President had long since contended: that the realist alternative of settling for autocratic and despotic regimes in the Middle East had neither brought the regional stability it promised nor—as 9/11 horribly demonstrated—made us safe at home. Bush had also long since given his answer to the question posed by “some who call themselves realists” as to whether “the spread of democracy in the Middle East should be any concern of ours.” It was, he affirmed in the strongest terms, a concern of ours precisely because democratization would make us more secure, and he accused the realists of having “lost contact with a fundamental reality” on this point. In this respect, I would argue, Bush was adopting a course akin to the one taken by the Marshall Plan, which had simultaneously served American interests and benefited others. Like the Marshall Plan, his new policy was a synthesis of realism and idealism: a case of doing well by doing good.
Those of us who supported the new policy also took issue with the view that democracy and capitalism could grow only in a soil that had been cultivated for centuries. We reminded the realists that in the aftermath of World War II, the United States managed within a single decade to transform both Nazi Germany and imperial Japan into capitalist democracies. And in the aftermath of the defeat of Communism in World War III, a similar process got under way under its own steam in Central and Eastern Europe, and even in the old heartland of the evil empire itself. Why not the Islamic world? The realist answer was that things were different there. To which our answer was that things were different everywhere, and a thousand reasons to expect the failure of any enterprise could always be conjured up to discourage making an ambitious effort.
To this, in turn, the counter frequently was that the Bush administration had wildly underestimated the special difficulties of democratizing Iraq and had correlatively misjudged the time so great a transformation would take, even assuming it to be possible at all. Yet talk about a “cakewalk” and the like mainly came from outside the administration; and in any event it had been applied to the future military campaign (which definitely did turn out to be a cakewalk), not to the ensuing reconstruction of Iraq. As to the latter, the administration kept repeating that we would stay in Iraq “for as long as it takes and not a day longer.” How long would that be? For those who opposed the Bush Doctrine, a year (or even a month?) after the end of major combat operations was already much too much; for those of us who supported it, “as long as it takes and not a day longer” still seemed, given the stakes, the only satisfactory formula.
As with democratization, so with the reform and modernization of Islam. In considering this even more difficult question, we found ourselves asking whether Islam could really go on for all eternity resisting the kind of reformation and modernization that had begun within Christianity and Judaism in the early modern period. Not that we were so naive as to imagine that Islam could be reformed overnight, or from the outside. In its heyday, Islam was able to impose itself on large parts of the world by the sword; there was no chance today of an inverse instant transformation of Islam by the force of American arms.
There was, however, a very good chance that a clearing of the ground, and a sowing of the seeds out of which new political, economic, and social conditions could grow, would gradually give rise to correlative religious pressures from within. Such pressures would take the form of an ultimately irresistible demand on theologians and clerics to find warrants in the Quran and the sharia under which it would be possible to remain a good Muslim while enjoying the blessings of decent government, and even of political and economic liberty. In this way a course might finally be set toward the reform and modernization of the Islamic religion itself.
The Democrats of 2004
What I have been trying to say is that the obstacles to a benevolent transformation of the Middle East—whether military, political, or religious—are not insuperable. In the long run they can be overcome, and there can be no question that we possess the power and the means and the resources to work toward their overcoming. But do we have the skills and the stomach to do what will be required? Can we in our present condition play even so limited and so benign an imperial role as we did in occupying Germany and Japan after World War II?
Some of our critics on the European Right sneer at us not, as the Left does, for being imperialists but for being such clumsy ones—for lacking the political dexterity to oversee the emergence of successor governments more amenable to reform and modernization than the despotisms now in place. I confess that I am prey to anxieties about our capabilities, and to others stemming from our character as a nation. And in thinking about our long record of inattention and passivity toward terrorism before 9/11, I fear a relapse into appeasement, diplomatic evasion, and ineffectual damage control.
Anxieties and fears like these were given a great boost by the attacks on the Bush Doctrine that became so poisonous in the 2004 presidential primary campaigns of the Democratic party. I have already told of my early apprehensions about the potential spread of the antiwar movement from the margins to the center, and my subsequent amazement in watching it go so far so fast. Whereas it took twelve years for the radicals I addressed in that drafty union hall in 1960 to capture the Democratic party behind George McGovern, their political and spiritual heirs of 2001 seemed to be pulling off the same trick in less than two. This time their leader of choice was the raucously antiwar Howard Dean. Though he eventually failed to win the nomination, his early successes frightened most of the relatively moderate candidates into a sharp leftward turn on Iraq, and drove out the few who supported the campaign there. As for John Kerry, in order to win the nomination, he had to disavow the vote he had cast authorizing the President to use force against Saddam Hussein.
To make matters worse, the campaign to discredit the action in Iraq moved from the hustings into the halls of Congress, where it wore the camouflage of a series of allegedly nonpartisan hearings. In these hearings, the most prominent of which was held by the Senate Intelligence Committee, high officials of the Bush administration were hectored by Democratic legislators (and even a few Republicans) in terms that often came close to sounding like the many articles and books in circulation that were accusing the President of having lied to us in going after Saddam Hussein. This was no slow process of trickle-down; this was an instantaneous inundation of the whole political landscape.
Among the lies through which Bush supposedly misled John Kerry and everyone else was that there might have been some connection between Saddam and al Qaeda. Now, even those of us who believed in such a connection were willing to admit that the evidence was not (yet) definitive; but this was a far cry from denying that there was any basis for it at all.10 So far a cry that, according to the reports that would be issued both by the Senate Intelligence Committee and the 9/11 Commission in the summer of 2004 (and contrary to how their conclusions would be interpreted in the media), al Qaeda did in fact have a cooperative, if informal, relationship with Iraqi agents working under Saddam.11
It was the same with another of the lies Bush allegedly told to justify the invasion of Iraq. In his State of the Union address of 2003, he said that “The British government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa.” Then an obscure retired diplomat named Joseph C. Wilson IV, who had earlier been sent to Niger by the CIA to check out this claim, earned his 15 minutes of fame—not to mention a best-selling book—by loudly denouncing this assertion as a lie. But it would in due course be established that every one of the notorious “sixteen words” Bush had uttered was true. This was the consensus of the Senate Intelligence Committee report, two separate British investigations, and a variety of European intelligence agencies, including even the French.12 Not only that, but it turned out that Wilson's own report to the CIA had tended to confirm the suspicion that Saddam had been shopping for uranium in Africa, and not, as he went around declaring, to debunk it.13 The liar here, then, was not Bush but Wilson.
But of course the biggest lie Bush was charged with telling was that Saddam possessed weapons of mass destruction. On this issue, too, those of us who still suspected that the WMD remained hidden, or that they had been shipped to Syria, or both, were willing to admit that we might well be wrong. But how could Bush have been lying when every intelligence agency in every country in the world was convinced that Saddam maintained an arsenal of such weapons? And how could Bush have “hyped” or exaggerated the reports he was given by our own intelligence agencies when the director of the CIA himself described the case as a “slam dunk”?
To be sure, again according to the Senate Intelligence Committee report, the case, far from being a “slam dunk,” actually rested on weak or faulty evidence. Yet the committee itself “did not find any evidence that administration officials attempted to coerce, influence, or pressure analysts to change their judgments related to Iraq's weapons of mass destruction capabilities.” The CIA, that is, did not tell the President what it thought he wanted to hear. It told him what it thought it knew; and what it told him, he had every reason to believe.14
In the wake of the WMD issue, several others emerged that did even more to shake the confidence of some who had been enthusiastic supporters of the operation in Iraq. On top of the mounting number of American soldiers being killed as they were trying to bring security to Iraq, and on the heels of the horrendous episodes of the murder and desecration of the bodies of four American contractors in Falluja, came the revelation that Iraqi prisoners in Abu Ghraib had been subjected to ugly mistreatment by their American captors.
Among supporters of the Bush Doctrine, these setbacks set off a great wave of defeatist gloom that was deepened by the nervous tactical shifts they produced in our military planners (such as the decision to hold back from cleaning out the terrorist militias hiding in and behind holy places in Falluja and Najaf). Even the formerly unshakable Fouad Ajami was shaken. In a piece entitled “Iraq May Survive, But the Dream is Dead,” he wrote: “Let's face it: Iraq is not going to be America's showcase in the Arab-Muslim world.”
That the antiwar party would batten on all this—and would continue ignoring the enormous progress we had made in the reconstruction of Iraqi society—was only to be expected. It was also only natural for the Democrats to take as much political advantage of the setbacks as they could. But it was not necessarily to be expected that the Democrats would seize just as eagerly as the radicals upon every piece of bad news as another weapon in the war against the war. Nor was it necessarily to be expected that mainstream Democratic politicians would go so far off the intellectual and moral rails as to compare the harassment and humiliation of the prisoners in Abu Ghraib—none of whom, so far as anyone then knew, was even maimed, let alone killed—to the horrendous torturing and murdering that had gone on in that same prison under Saddam Hussein or, even more outlandishly, to the Soviet gulag in which many millions of prisoners died.
Yet this was what Edward M. Kennedy did on the floor of the Senate, where he declared that the torture chamber of Saddam Hussein had been reopened “under new management—U.S. management,” and this was what Al Gore did when he accused Bush of “establishing an American gulag.” Joining with the politicians was the main financial backer of the Democratic party's presidential campaign, George Soros, who actually said that Abu Ghraib was even worse than the attack of 9/11. On the platform with Soros when he made this morally disgusting statement was Senator Hillary Rodham Clinton, who let it go by without a peep of protest.
Equally ignominious was the response of mainstream Democrats to the most effective piece of demagogy produced by the antiwar radicals, Michael Moore's film Fahrenheit 9/11. Shortly after 9/11—that is, long before the appearance of this movie but with many of its charges against Bush already on vivid display in Moore's public statements about Afghanistan—one liberal commentator had described him as a “well-known crank, regarded with considerable distaste even on the Left.” The same commentator (shades of how the “jackal bins” of yore were regarded) had also dismissed as “preposterous” the idea that Moore's views “represent a significant body of antiwar opinion.” Lending a measure of plausibility to this assessment was the fact that Moore elicited a few boos when, in accepting an Academy Award for Bowling for Columbine in 2003, he declared:
We live in the time where we have fictitious election results that elect a fictitious president. We live in a time where we have a man sending us to war for fictitious reasons. . . . [W]e are against this war, Mr. Bush. Shame on you, Mr. Bush, shame on you.
By 2004, however, when Fahrenheit 9/11 came out, things had changed. True, this movie—a compendium of every scurrility ever hurled at George W. Bush, and a few new ones besides, all gleefully stitched together in the best conspiratorial traditions of the “paranoid style in American politics”—did manage to embarrass even several liberal commentators. One of them described the film as a product of the “loony Left,” and feared that its extremism might discredit the “legitimate” case against Bush and the war. Yet in an amazing reversal of the normal pattern in the distribution of prudence, such fears of extremism were more pronounced among liberal pundits than among mainstream Democratic politicians.
Thus, so many leading Democrats flocked to a screening of Fahrenheit 9/11 in Washington that (as the columnist Mark Steyn quipped) the business of Congress had to be put on hold; and when the screening was over, nary a dissonant boo disturbed the harmony of the ensuing ovation. The chairman of the Democratic National Committee, Terry McAuliffe, pronounced the film “very powerful, much more powerful than I thought it would be.” Then, when asked by CNN whether he thought “the movie was essentially fair and factually based,” McAuliffe answered, “I do. . . . Clearly the movie makes it clear that George Bush is not fit to be President of this country.” Senator Tom Harkin of Iowa seconded McAuliffe and urged all Americans to see the film: “It's important for the American people to understand what has gone on before, what led us to this point, and to see it sort of in this unvarnished presentation by Michael Moore.”
Possibly some of the other important Democrats who attended the screening—including Senators Tom Daschle, Max Baucus, Barbara Boxer, and Bill Nelson; Congressmen Charles Rangel, Henry Waxman, and Jim McDermott; and elders of the party like Arthur Schlesinger, Jr. and Theodore Sorensen—disagreed with Harkin and McAuliffe. But if so, they remained remarkably quiet about it.
As for John Kerry himself, he did not take time out to see Fahrenheit 9/11, explaining that there was no need since he had “lived it.”
2004 and 1952
Returning now to the gloom that afflicted supporters of the Bush Doctrine in the spring of 2004: one of the reasons Fouad Ajami gave for it was that “our enemies have taken our measure; they have taken stock of our national discord over the war.” Emboldened by our restraint in Falluja and elsewhere within Iraq, as well as by our concomitant willingness to bring the UN back into the political picture, our enemies had begun to breathe easier—and not only in Iraq:
Once the administration talked of a “Greater Middle East” where the “deficits” of freedom, knowledge, and women's empowerment would be tackled, where our power would be used to erode the entrenched despotisms in the Arab-Muslim world.
But now, Ajami lamented, it had become clear that “we shall not chase the Syrian dictator to a spider hole, nor will we sack the Iranian theocracy.” There were even indications that, abandoning the dream of democracy altogether, we might settle for the rule of a “strong man” in Iraq.
But how accurate was the measure our enemies had taken of us? Was it possible that their gauge was being thrown off by the overheated atmosphere of a more than usually bitter presidential campaign, and by the caution George Bush felt it necessary to adopt in seeking reelection?
This seemed to me then, and it still seems to me now, the most decisive question of all. I therefore want to conclude by examining it, and I want to do so by returning to the analogy I drew earlier between the start of World War III in 1947 and the start of World War IV in 2001.
When the Truman Doctrine was enunciated in 1947, it was attacked from several different directions. On the Right, there were the isolationists who—after being sidelined by World War II—had made something of a comeback in the Republican party under the leadership of Senator Robert Taft. Their complaint was that Truman had committed the United States to endless interventions that had no clear bearing on our national interest. But there was also another faction on the Right that denounced containment not as recklessly ambitious but as too timid. This group was still small, but within the next few years it would find spokesmen in Republican political figures like Richard Nixon and John Foster Dulles and conservative intellectuals like William F. Buckley, Jr. and James Burnham.
At the other end of the political spectrum, there were the Communists and their “liberal” fellow travelers who—strengthened by our alliance with the Soviet Union in World War II—had emerged as a relatively sizable group and would soon form a new political party behind Henry Wallace. In their view, the Soviets had more cause to defend themselves against us than we had to defend ourselves against them, and it was Truman, not Stalin, who posed the greater danger to “free peoples everywhere.” But criticism also came from the political center, as represented by Walter Lippmann, the most influential and most prestigious commentator of the period. Lippmann argued that Truman had sounded “the tocsin of an ideological crusade” that was nothing less than messianic in its scope.
In the election of 1948, Truman had the seemingly impossible task of confronting all three of these challenges (and a few others as well). When, against what every poll had predicted, he succeeded in warding them off, he could reasonably claim a mandate for his foreign policy. And so it came about that, under the aegis of the Truman Doctrine, American troops were sent off in 1950 to fight in Korea. “What a nation can do or must do,” Truman would later write, “begins with the willingness and the ability of its people to shoulder the burden,” and Truman was rightly confident that the American people were willing to shoulder the burden of Korea.
Even so, enough bitter opposition remained within and around the Republican party to leave it uncertain as to whether containment was an American policy or only the policy of the Democrats. This uncertainty was exacerbated by the presidential election of 1952, when the Republicans behind Dwight D. Eisenhower ran against Truman's hand-picked successor Adlai Stevenson in a campaign featuring strident attacks on the Truman Doctrine by Eisenhower's running mate Richard Nixon and his future Secretary of State John Foster Dulles. Nixon, for example, mocked Stevenson as a graduate of the “College of Cowardly Communist Containment” run by Truman's Secretary of State Dean Acheson, while Dulles repeatedly called for ditching containment in favor of a policy of “rollback” and “liberation.” And both Nixon and Dulles strongly signaled their endorsement of General Douglas MacArthur's insistence that Truman was wrong to settle for holding the line in Korea instead of going all the way—or, as MacArthur had famously put it, “There is no substitute for victory.”
Yet when Eisenhower came into office, he hardly touched a hair on the head of the Truman Doctrine. Far from adopting a bolder and more aggressive strategy, the new President ended the Korean war on the basis of the status quo ante—in other words, precisely on the terms of containment. Even more telling was Eisenhower's refusal three years later to intervene when the Hungarians, apparently encouraged by the rhetoric of liberation still being employed in the broadcasts of Radio Free Europe, rose up in revolt against their Soviet masters. For better or worse, this finally dispelled any lingering doubt as to whether containment was the policy just of the Democratic party. With full bipartisan support behind it, the Truman Doctrine had become the official policy of the United States of America.
The analogy is obviously not perfect, but the resemblances between the political battles of 1952 and those of 2004 are striking enough to help us in thinking about what a few moments ago I called the most decisive of all the questions now facing the United States. To frame the question in slightly different terms from the ones I originally used: what will happen if the Democrats behind John Kerry defeat George W. Bush in November? Will they follow through on their violent denunciations of Bush's policy, or will they, like the Republicans of 1952 with respect to Korea, quietly forget their campaign promises of reliance on the UN and the Europeans, and continue on much the same course as Bush has followed in Iraq? And looking beyond Iraq itself, will they do unto the Bush Doctrine as the Republicans of 1952 did unto the Truman Doctrine? Will they treat Iraq as only one battle in the larger war—World War IV—into which 9/11 plunged us? Will they resolve to go on fighting that war with the strategy adumbrated by the Bush Doctrine, and for as long as it may take to win it?
From the way the Democrats have been acting and speaking, I fear that the answer is no. Nor was I reassured by the flamboyant display of hawkishness they put on at their national convention in July. Yet as a passionate supporter of the Bush Doctrine I pray that I am wrong about this. If John Kerry should become our next President, and he may, it would be a great calamity if he were to abandon the Bush Doctrine in favor of the law-enforcement approach through which we dealt so ineffectually with terrorism before 9/11, while leaving the rest to those weakest of reeds, the UN and the Europeans. No matter how he might dress up such a shift, it would—rightly—be interpreted by our enemies as a craven retreat, and dire consequences would ensue. Once again the despotisms of the Middle East would feel free to offer sanctuary and launching pads to Islamic terrorists; once again these terrorists would have the confidence to attack us—and this time on an infinitely greater scale than before.
If, however, the victorious Democrats were quietly to recognize that our salvation will come neither from the Europeans nor from the UN, and if they were to accept that the Bush Doctrine represents the only adequate response to the great threat that was literally brought home to us on 9/11, then our enemies would no longer be emboldened—certainly not to the extent they have recently been—by “our national discord over the war.”
In World War III, despite the bipartisan consensus that became apparent after 1952 (and contrary to the roseate reminiscences of how it was then), plenty of “discord” remained, and there were plenty of missteps—most notably involving Vietnam—along the way to victory. There were also moments when it looked as though we were losing, and when our enemies seemed so strong that the best we could do was in effect to sue for a negotiated peace.
Now, with World War IV barely begun, a similar dynamic is already at work. In World War III, we as a nation persisted in spite of the inevitable setbacks and mistakes and the defeatism they generated, until, in the end, we won. To us the reward of victory was the elimination of a military, political, and ideological threat. To the people living both within the Soviet Union itself and in its East European empire, it brought liberation from a totalitarian tyranny. Admittedly, liberation did not mean that everything immediately came up roses, but it would be foolish to contend that nothing changed for the better when Communism landed on the very ash heap of history that Marx had predicted would be the final resting place of capitalism.
Suppose that we hang in long enough to carry World War IV to a comparably successful conclusion. What will victory mean this time around? Well, to us it will mean the elimination of another, and in some respects greater, threat to our safety and security. But because that threat cannot be eliminated without “draining the swamps” in which it breeds, victory will also entail the liberation of another group of countries from another species of totalitarian tyranny. As we can already see from Afghanistan and Iraq, liberation will no more result in the overnight establishment of ideal conditions in the Middle East than it has done in East Europe. But as we can also see from Afghanistan and Iraq, better things will immediately happen, and a genuine opportunity will be opened up for even better things to come.
The memory of how it was toward the end of World War III suggests another intriguing parallel with how it is now in the early days of World War IV. We have learned from the testimony of former officials of the Soviet Union that, unlike the elites here, who heaped scorn on Ronald Reagan's idea that a viable system of missile defense could be built, the Russians (including their best scientists) had no doubt that the United States could and would succeed in creating such a system and that this would do them in. Today the same kind of scorn is heaped by the same kind of people on George W. Bush's idea that the Middle East can be democratized, while our enemies in the region—like the Russians with respect to “Star Wars”—believe that we are actually succeeding.
One indication is the warning to this effect issued by al Zarqawi to al Qaeda, from which I have already quoted. But his letter is not the only sign that the secular despots and the Islamofascists in the Middle East are deeply worried over what the Bush Doctrine holds in store for them. There is Libya's Qaddafi, who has admitted that it was his anxiety about “being next” that induced him to give up his nuclear program. And there are the Syrians and the Iranians. Of course they keep making defiant noises and they keep trying to create as much trouble for us as possible, but with all due respect to the disappointed expectations of Fouad Ajami, I have to ask: why would they be sending jihadists and weapons into Iraq if not in a desperate last-ditch campaign to derail a process whose prospects are in their judgment only too fair and whose repercussions they fear are only too likely to send them flying?
This fear may, as Ajami says, have been tempered by our response to the troubles they themselves have been causing us. But it cannot have been altogether assuaged, since it is solidly grounded in the new geostrategic realities in their region that have been created under the aegis of the Bush Doctrine. Professor Haim Harari, a former president of the Weizmann Institute, describes these realities succinctly:
Now that Afghanistan, Iraq, and Libya are out, two-and-a-half terrorist states remain: Iran, Syria, and Lebanon, the latter being a Syrian colony. . . . As a result of the conquest of Afghanistan and Iraq, both Iran and Syria are now totally surrounded by territories unfriendly to them. Iran is encircled by Afghanistan, by the Gulf States, Iraq, and the Muslim republics of the former Soviet Union. Syria is surrounded by Turkey, Iraq, Jordan, and Israel. This is a significant strategic change and it applies strong pressure on the terrorist countries. It is not surprising that Iran is so active in trying to incite a Shiite uprising in Iraq. I do not know if the American plan was actually to encircle both Iran and Syria, but that is the resulting situation.
Finally, there is the effect the Bush Doctrine has had on the forces pushing for liberalization throughout the Middle East. When Ronald Reagan used the word “evil” in speaking of the Soviet Union, and even confidently predicted its demise, he gave new hope to democratic dissidents in and out of the gulag. Back then, very much like Ajami on Bush, some of us fell into near despair when Reagan failed to act in full accordance with his own convictions. When, for example, he responded tepidly to the great Polish crisis of 1981 that culminated in the imposition of martial law, the columnist George F. Will, one of his staunchest supporters, angrily declared that the administration headed by Reagan “loved commerce more than it loathed Communism,” and I wrote an article expressing “anguish” over his foreign policy. Yet even though (once more like Ajami today) our criticisms were mostly right in detail, we were proved wondrously wrong about the eventual outcome. It was different with the dissidents behind the Iron Curtain. They knew better than to get stuck on tactical details, and they never once lost heart.
So it has been with the Bush Doctrine. Bush has made reform and democratization the talk of the entire Middle East. Where before there was only silence, now there are countless articles and speeches and conferences, and even sermons, dedicated to the cause of political and religious liberalization and exploring ways to bring it about. Like the dissidents behind the Iron Curtain in the 1980's, the democratizers in the Middle East today evidently remain undiscouraged. Falluja and the rest notwithstanding, there has been, if anything, a steady increase in the volume and range of the reformist talk that was and continues to be inspired by the Bush Doctrine.15
I do not wish to exaggerate. Except in Iran, and perhaps also one or two other non-Arab Muslim states, the democratizers are still a relatively small group, and as yet their ranks seem to contain no one comparable in intellectual stature or moral and political influence to Sakharov or Solzhenitsyn or Sharansky. But the editor of the Middle East Review of International Affairs, Barry Rubin, who has generally been very skeptical about the chances for democratization in the region, offers a cautious assessment that seems reasonable to me:
Democracy and reform are on the Arab world's agenda. It will be a long, uphill fight to bring change to those countries, but at least a process has begun. Liberals remain few and weak; the dictatorships are strong and the Islamist threat will discourage openness or innovation. Still, at least there are more people trying to move things in the right direction.
To which I (though not Rubin) would add, thanks to George W. Bush.
Then there is Gaza, where at least some elements of the fabled Palestinian street have for the very first time exploded with denunciations not of Israel or the United States, but of Yasir Arafat's tyrannical and corrupt rule. For the first time, too, we find articles in the Arab press calling for Arafat's removal—in favor not of the Islamist alternative represented by Hamas but of a different kind of leadership.
Here, for example, is the Jordan Times:
The rapid deterioration of the domestic political order in Gaza mirrors similar dilemmas that plague most of the Arab world, revolving around the tendency of small power elites or single men to monopolize political and economic power in their hands via their direct, personal control of domestic security and police systems. Gaza is yet another warning about the failure of the modern Arab security state and the need for a better brand of statehood based on law-based citizen rights rather than gun-based regime protection and perpetual incumbency.
And here is the Arab Times of Kuwait:
Arafat should quit his position because he is the head of a corrupt authority. Arafat has destroyed Palestine. He has led it to terrorism, death, and a hopeless situation.
And there is this, from the Gulf News in Dubai:
Palestinians are saying their president for life—Arafat—is the problem along with his cronies who rule them, rob them, and impoverish them. Arabs have a responsibility here too. They can say “Israel” until they are all blue in the face, but it does not change the fact that a large part of the fault lies with the Palestinians and the Arabs.
According to a Palestinian legislator quoted by the Washington Post, “what is happening in the streets of Gaza has [nothing] to do with reform. It's a simple power struggle.” By contrast, the Iranian-born commentator Amir Taheri sees it as a new kind of “intifada aimed at bringing down yet another Arab tyranny.” Chances are that there is some truth in both of these opposing judgments, and in any event it is still too early to tell how the turmoil in Gaza will play itself out. But it is surely not too early to say that there would have been no uprising against Arafat, and much less talk about reform, if not for George W. Bush's policies combined with his courageous willingness to back those of Ariel Sharon.
In his first State of the Union address, President Bush affirmed that history had called America to action, and that it was both “our responsibility and our privilege to fight freedom's fight”—a fight he also characterized as “a unique opportunity for us to seize.” Only last May, he reminded us that “We did not seek this war on terror,” but, having been sought out by it, we responded, and now we were trying to meet the “great demands” that “history has placed on our country.”
In this language, and especially in the repeated references to history, we can hear an echo of the concluding paragraphs of George F. Kennan's “X” essay, written at the outbreak of World War III:
The issue of Soviet-American relations is in essence a test of the overall worth of the United States as a nation among nations. To avoid destruction the United States need only measure up to its own best traditions and prove itself worthy of preservation as a great nation.
Kennan then went on to his peroration:
In the light of these circumstances, the thoughtful observer of Russian-American relations will experience a certain gratitude for a Providence which, by providing the American people with this implacable challenge, has made their entire security as a nation dependent on their pulling themselves together and accepting the responsibilities of moral and political leadership that history plainly intended them to bear.
Substitute “Islamic terrorism” for “Russian-American relations,” and every other word of this magnificent statement applies to us as a nation today. In 1947, we accepted the responsibilities of moral and political leadership that history “plainly intended” us to bear, and for the next 42 years we acted on them. We may not always have acted on them wisely or well, and we often did so only after much kicking and screaming. But act on them we did. We thereby ensured our own “preservation as a great nation,” while also bringing a better life to millions upon millions of people in a major region of the world.
Now “our entire security as a nation”—including, to a greater extent than in 1947, our physical security—once more depends on whether we are ready and willing to accept and act upon the responsibilities of moral and political leadership that history has yet again so squarely placed upon our shoulders. Are we ready? Are we willing? I think we are, but the jury is still out, and will not return a final verdict until well after the election of 2004.
—August 2, 2004
1 “How to Win World War IV” (February 2002), “The Return of the Jackal Bins” (April 2002), and “In Praise of the Bush Doctrine” (September 2002). A fourth piece I used was “Israel Isn't the Issue” (Wall Street Journal, September 20, 2001).
2 He did, however, seem to have committed a sin of omission. Richard Lowry, the editor of National Review, reports that according to John Lehman, one of the Republican commissioners, “Clarke's original testimony included ‘a searing indictment of some Clinton officials and Clinton policies.’ That was the Clarke, even-handed in his criticisms of both the Bush and Clinton administrations, whom Lehman and other Republican commissioners expected to show up at the public hearings. It was a surprise ‘that he would come out against Bush that way.’ Republicans were taken aback: ‘It caught us flat-footed, but not the Democrats.’ ” In a different though related context, the commission quotes material written by Clarke while he was still in office that is inconsistent with his more recent, much-publicized denial of any relationship whatsoever between Iraq and al Qaeda.
3 Hill was referring here to the hearings of the 9/11 commission, not its final report, which did not single out the Bush administration for criticism on this score.
4 The analysis offered by Kennan in “The Sources of Soviet Conduct”—as against his own later revisionist interpretation of it—turned out to be right in almost every important detail, except for the timing. He thought it would take only fifteen years for the strategy to succeed in causing the “implosion” of the Soviet empire.
5 In expressing his determination to win the war, however, Bush was mainly reaching back to the language of Winston Churchill, who vowed as World War II was getting under way in 1940: “We shall not flag or fail. We shall go on to the end.”
6 It is worth noting that Churchill, who had been the target of many derogatory epithets in his long career but who was never regarded even by his worst enemies as “simple-minded,” had no hesitation in attaching a phrase like “monster of wickedness” to Hitler. Nor did the political philosopher Hannah Arendt, whose mind was, if anything, overcomplicated rather than too simple, have any problem in her masterpiece, The Origins of Totalitarianism, with calling both Nazism and Communism “absolute evil.”
7 Fukuyama did not return the compliment. While not exactly rejecting the Bush Doctrine, he would later criticize it and call for a “recalibration.” He would do this more in sorrow than in anger, but still in terms that were otherwise not always easy to distinguish from those of what I characterize below as the respectable opposition.
8 As John Podhoretz would later write: “Those who supported the war, in overwhelming numbers, believed there were multiple justifications for it. Those who opposed and oppose it, in equally overwhelming numbers, weren't swayed by the WMD arguments. Indeed, many of them had no difficulty opposing the war while believing that Saddam possessed vast quantities of such weapons. Take Sen. Edward Kennedy. ‘We have known for many years,’ he said in September 2002, ‘that Saddam Hussein is seeking and developing weapons of mass destruction.’ And yet only a few weeks later he was one of 23 senators who voted against authorizing the Iraq war. Take French President Jacques Chirac, who believed Saddam had WMD and still did everything in his power to block the war. So whether policymakers supported or opposed the war effort was not determined by their conviction about the presence of weapons of mass destruction.”
9 The classic expression of this fantasy was, of course, The Protocols of the Elders of Zion, a document that had been forged by the Czarist secret police in the late 19th century but that had more recently been resurrected and distributed by the millions throughout the Arab-Muslim world, and beyond. It would also form the basis of a dramatic television series produced in Egypt.
10 Stephen F. Hayes has done especially good work on this issue, both in a series of articles in the Weekly Standard and in his book The Connection: How al Qaeda's Collaboration with Saddam Hussein Has Endangered America.
11 Additional corroboration of “meetings . . . between senior Iraqi representatives and senior al Qaeda operatives” would come from a comparable British investigation conducted by Lord Butler, whose report would be released around the same time as that of the Senate Intelligence Committee.
12 From the Butler Report: “We conclude also that the statement in President Bush's State of the Union Address of 28 January 2003 that ‘The British Government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa’ was well-founded.”
13 From the Senate Intelligence Committee Report: “He [the CIA reports officer] said he judged that the most important fact in the report [by Wilson] was that Nigerien officials admitted that the Iraqi delegation had traveled there in 1999, and that the Nigerien prime minister believed the Iraqis were interested in purchasing uranium, because this provided some confirmation of foreign government service reporting.”
14 Going even further than the Senate Intelligence Committee, the Butler Report concluded: “We believe that it would be a rash person who asserted at this stage that evidence of Iraqi possession of stocks of biological or chemical agents, or even of banned missiles, does not exist or will never be found.”
15 A representative sample can be found on the website of the Middle East Media Research Institute (http://www.memri.org/reform.html).
World War IV: How It Started, What It Means, and Why We Have to Win
Must-Reads from Magazine
Last year, we asked experts to examine Candidate Trump’s policy proposals. This year, we’ve asked them to examine how he has executed these proposals in office.
On Trade By Scott Lincicome
Last year, economic, legal, and geopolitical calamity lurked in the shadows of almost every trade-policy promise made by presidential candidate Donald Trump. Eight months into the Trump presidency, those problems have—thankfully—not yet materialized. Instead, Trump trade policy has been a mixture of bluster, disappointment, relief, and uncertainty. This last category warrants close attention: In the coming months, Trump’s dangerous trade ambitions could remain in check, thus keeping a global trade system alive. Or politics, legal ambiguity, and Trump’s own emotional impulses could deal that system a fatal blow.
There is no doubt that President Trump has already done serious damage to the United States’ longstanding position as a world leader on trade policy, the American political consensus in favor of trade liberalization, and Republican views of trade and globalization. His constant vituperation has offended U.S. allies and trading partners, causing them to turn to Europe, Asia, or Latin America in search of alternatives to the once-welcoming and predictable U.S. market. He has accelerated (not started) the American retreat from the World Trade Organization, further wounding a multilateral trading system that was a U.S. invention—an invention that has, contrary to popular belief, served U.S. economic and foreign-policy interests well since the 1940s.
Trump’s day-one withdrawal from the Trans-Pacific Partnership—the flawed-yet-deserving Asia-Pacific trade agreement started by President Bush and ultimately signed by President Obama—has left vacuums in both Asia-Pacific trade and international economic law. TPP was far from perfect, but it was widely supported by U.S. trade and foreign-policy experts because of its economic and geopolitical benefits. The deal contained important new rules for 21st-century issues such as e-commerce, GMOs, and state-owned enterprises. Moreover, it would have provided small but significant benefits for U.S. workers and the economy, while cementing the United States’s influence in a region increasingly covered by China’s shadow. Now, TPP parties are working to complete a “TPP-11” deal that excludes the United States, while China is negotiating its own version of the TPP—the Regional Comprehensive Economic Partnership. And many of TPP’s novel provisions are being relitigated in contentious NAFTA renegotiations with Canada and Mexico (both TPP parties).
All of this is disappointing, but it’s probably survivable and hardly the fire and brimstone of the Trump campaign trail (hence, the relief). Trump has repeatedly threatened tariffs and other forms of dangerous unilateral protectionism, but economic, legal, and political realities have intervened. For example, when Trump promised new “national security” tariffs on steel and aluminum under Section 232 of the Trade Expansion Act of 1962, the opposition from Congress, business groups, strategic allies, NGOs, and even members of Trump’s administration was unrelenting. As a result, planned tariffs have quietly been shelved (for now). Other presidential threats have similarly come and gone without major action, giving market participants some heartburn but little long-term pain. Only in the opaque area of trade remedies—antidumping, countervailing duty, and safeguard measures—has there been a marked uptick in U.S. protectionism. But this is the result of long and technical administrative proceedings initiated by U.S. industries or unions that formally petitioned the government under relevant domestic law—hardly the wave-of-the-hand actions that Trump promised.
Some measure of relief is warranted, but we’re not out of the woods just yet. Indeed, in the last eight months, Trump has publicly threatened to
- block steel and aluminum imports for national-security reasons or bring new cases against semiconductors and ships, under the aforementioned Section 232;
- withdraw from the North American Free Trade Agreement and the U.S.-Korea FTA;
- slap tariffs on Chinese imports under Section 301 of the Trade Act of 1974 because of alleged Chinese intellectual-property-rights violations; and
- impose onerous new “Buy American” requirements on U.S. pipelines and government-funded infrastructure projects.
And those are just the public threats. Behind closed doors, Trump has reportedly considered enacting sweeping import restrictions under the International Emergency Economic Powers Act. The president reportedly yelled, “I want tariffs. Bring me some tariffs!” when told by his “globalist” advisers that legal and economic realities prevent him from imposing broad-based protectionism on a whim.
None of the threats on Trump’s wish list is officially off the table, and any one of them would have serious economic consequences: Steel tariffs alone would put more than 1.3 million American jobs at risk; NAFTA withdrawal could destroy 250,000 more; and several nations have promised immediate retaliation against American goods, services, or investment in response to Trumpian protectionism. Trump’s actions would also raise major legal issues. For example, the World Trade Organization’s broad, subjective “national security” exception wasn’t intended to be used as a get-out-of-jail free-card for steel tariffs, and a dispute over a member’s right to invoke it could imperil the multilateral trading system. Meanwhile, Trump’s withdrawal from a free-trade agreement without congressional consent would raise major constitutional questions as to whether the president had that authority and what would happen to the myriad U.S. tariffs and other commitments that were embedded in legislation and passed into law. Lawsuits over these and other issues surrounding presidential trade powers would throw billions of dollars of cross-border trade and investments into legal limbo.
The president’s unpredictability, political weakness, and clear affinity for protectionism, combined with ample (though ambiguous) legal authority to act unilaterally, mean that any one of his trade threats could still materialize in the coming months. The White House’s internationalists may have won the early battles, but the war will rage for as long as Trump is president. Continued vigilance and advocacy for the benefits of freer trade remain critical.
And congressional legislation clarifying and limiting the president’s trade powers might not be a bad idea either…just in case.
Click here to read what Scott Lincicome wrote about Candidate Trump and trade last year.
Scott Lincicome is an international trade attorney, adjunct scholar at the Cato Institute, and visiting lecturer at Duke University Law School. The views expressed herein are his own and do not necessarily reflect those of his employer.
On Taxes By James Pethokoukis
At some point in his first term, President Donald Trump will likely sign legislation that cuts taxes by some amount for somebody. This modest prediction is based less on reading the political tea leaves than understanding conservative politics. If any issue made the modern Republican Party, it was tax cuts. Not surprising, then, that candidate Trump promised big cuts for individuals and businesses. And with the GOP now holding the White House and Congress, failure to deliver is almost unimaginable.
Of course it’s almost equally unimaginable that the Trump tax cuts will at all resemble the ambitious plans devised by Trump advisers during the campaign. There were two of those blueprints. The first, rolled out September 2015, proposed lowering the top personal rate to 25 percent from the current 39.6 percent, and cutting the corporate rate to 15 percent from the current 35 percent. Along with other changes, including eliminating the alternative minimum tax and estate tax, this initial plan might have lowered annual government revenue by a whopping $1 trillion a year or more (even if one assumes much faster economic growth).
This was, in other words, more a fantasy proposal cooked up by Reagan-era supply-siders than a serious effort to reform the tax code without worsening our historically high federal debt. Indeed, Trump’s sole purpose in signing on to the plan may have been to win over that very same group, still influential among base voters. Trump himself talked little about the plan while on the hustings, especially compared with immigration, trade, and The Wall.
The Trump campaign’s second bite at the apple a year later was a scaled-back plan, but still a colossal one. Instead of losing a trillion bucks a year, maybe the government would be out just a half trillion or so. Again, since the plan was unaccompanied by spending cuts elsewhere in the budget, it was more a set of glorified campaign talking points than a serious proposal. And like the first, Trump didn’t talk much about it.
So after Trump’s shock election, there really was no realistic Trump tax plan. No worries, however, since there was a House Republican tax plan all ready to go, with an enthusiastic House Speaker Paul Ryan ready to push it hard through the lower chamber. It was an ambitious proposal but one within reality, especially with a bit of fiscal tweaking. That plan called for, among other things, lowering the top personal rate to 33 percent and the corporate rate to 20 percent, immediately expensing new capital investment, and expanding the child tax credit.
And more so than the Trump campaign plans, the House plan intended to reform the tax code, not just cut taxes. For example, it eliminated all personal itemized deductions other than mortgage interest and charitable contributions. The House plan also made a stronger attempt to pay for the tax through a border-adjustment tax and limiting business-interest deductibility. All in all, the plan cost a couple of trillion dollars over a decade, not assuming economic feedback. On such a dynamic basis, according to Tax Foundation modeling, the House plan would reduce 10-year revenues by just under $200 billion.
So if Republicans really wanted to make their plan revenue neutral, it was certainly doable through relatively minor changes, such as less dramatic corporate or personal rate cuts. Yet the plan would still be a massive improvement over the status quo, both in terms of encouraging more domestic investment and providing middle-class tax relief.
With a detailed plan at the ready and Republicans running Washington, it is easy to understand why many in the GOP thought it reasonable to predict that Trump would be signing a mega tax bill by August of this year, just as Ronald Reagan did in the first year of his first term. Reagan did it from his ranch in Santa Barbara, California. Maybe Trump would repeat the feat from his Trump Tower penthouse in Manhattan.
But that did not happen. Then again, very little of Trump’s ambitious domestic agenda has happened as planned. Repeal and replace was promised by Easter, leaving plenty of time to hash out the fine details of tax reform and move legislation through the House and Senate. But the GOP health reform was a long slog consuming valuable time, attention, and political capital. Also deserving blame was Trump’s inability to focus on pushing policy priorities rather than pounding political opponents on Twitter. As of now, it seems highly unlikely that significant tax reform will occur in 2017. And 2018 looks challenging as well.
Yes, Trump has provided more distraction than leadership on this issue. And trying to pass major legislation in a midterm year only adds to the political difficulties. But the biggest problem is that there is no tax-reform plan for Republicans to push.
What happened to the ready-to-serve House plan? It suffered from not being a fantasy. It acknowledged both political and policy constraints, something the populist president almost never does. For instance: the House plan tried to pay for the tax cuts—a political necessity to placate debt-hawk Republicans. That requires making somebody somewhere unhappy. Ryan knew that without such an effort, it would be extraordinarily difficult to reduce the corporate tax rate to anywhere close to 20 percent. But while exporters supported the border tax, importers hated it, complaining that it would raise costs. Nor was the Trump White House happy about axing business-interest deductibility.
Still, as problematic as those pay-fors were, the alternatives—limiting tax breaks for mortgages, 401(k)s, and state and local taxes—are equally if not more so. The state and local tax deduction is a case in point. Pushed hard by Republican leaders as the primary revenue generator to replace border adjustment, it seems unlikely to survive criticism from blue-state Republicans. Eventual legislation is likely to be a far smaller and less comprehensive bill than first envisioned—more cut than reform—with some temporary parts designed to satisfy congressional budget rules. Indeed, Senate budget writers cleared room for just a $1.5 trillion tax cut, and even that might be overly ambitious. Expect Trump and his people to call whatever passes a “down payment” on true tax reform. Pro-growth conservatives should call it a missed opportunity.
Click here to read what James Pethokoukis wrote about Candidate Trump and taxes last year.
James Pethokoukis is the DeWitt Wallace Fellow at the American Enterprise Institute. He is also an official CNBC contributor.
‘The Wall’ By Linda Chavez
“We’re going to build a wall. That wall will go up so fast, your head will spin.” Donald Trump made this promise on August 23, 2016, repeated it throughout his presidential campaign, and has reiterated it in tweets and at press conferences and rallies ever since. But the only spinning going on lately has been the president’s own efforts to assure his base that he will eventually build a wall, or a fence, or some barrier along the U.S. border with Mexico, except maybe for those areas that don’t need one or already have one. Oh, and someone will pay for it—preferably Mexico, as he promised—but if not, Congress, unless Democrats or even Republicans refuse to go along. A year after winning the presidency, Trump’s most ubiquitous pledge, The Great Wall separating the U.S. from Mexico, remains largely a figment of his imagination and evidence of his supporters’ gullibility.
No issue defined Trump’s campaign more viscerally than immigration, and on none was his position less ambiguous. Trump’s presidential record on immigration enforcement and policy, however, is decidedly more mixed. He continues to promise that construction of the wall is going to start soon: “Way ahead of schedule. Way ahead of schedule. Way, way, way ahead of schedule,” he said in February. But the cost, with estimates as high as $70 billion, and the sheer impracticality of erecting a solid barrier along 1,900 miles make little sense in light of recent trends in illegal immigration. Illegal immigration is at historically low levels today (roughly the same, in absolute numbers, as it was in the early 1970s) and has been falling more or less consistently since the peak in 2000, mostly because fewer people are crossing the border from Mexico. Apprehensions of Mexicans are at a 50-year low, as are all apprehensions along the southern border. Year-to-date in 2017, apprehensions at the Mexican border have dropped 24 percent compared with those in 2016, when a slight uptick occurred as more people tried to cross in advance of a feared Trump victory and border crackdown. The population of undocumented immigrants living in the U.S. is down as well and now stands at roughly 11 million, from a peak of 12.2 million in 2007; and two-thirds of these unauthorized immigrants have lived here a decade or longer. More Mexicans—whom Trump described as “bringing drugs. . . crime. They’re rapists”—are now leaving the U.S. than arriving. In 2013, for the first time since the 1960s, Mexico fell as the top source of immigrants to the U.S., behind both China and India.
Trump’s pledge to build a wall, of course, wasn’t his only promise on immigration, but he hasn’t lived up to his own hype in other areas either, which is a good thing. He said he’d end on day one the Obama administration’s Deferred Action for Childhood Arrivals (DACA), a program that provided temporary protection from removal for young people who arrived here illegally before age 16. Instead, Trump waited until September 5 to send his beleaguered Attorney General Jeff Sessions out to announce that DACA would end in six months unless Congress acted. Trump then almost immediately backtracked in a series of tweets and offhand statements. Polls show that large majorities of Americans, including some two-thirds of Trump voters, have no interest in deporting so-called Dreamers, half of whom came before they were seven years old and 90 percent of whom are employed and paying taxes. Trump’s own misgivings and the backlash over the policy’s announcement led him into a tentative deal with Democratic leaders Representative Nancy Pelosi and Senator Chuck Schumer in September to support legislation granting legal status for Dreamers who complete school, get jobs, or join the military. Trump’s most nativist supporters have already dubbed him “Amnesty Don” for even suggesting that Dreamers should be allowed to remain and gain temporary legal status, much less earn a path toward citizenship. But whether such legislation will make it through Congress is still uncertain. Similar bills have repeatedly passed one chamber and died in the other over the past 10 years, but the potential threat that the administration might begin deporting many of the 800,000 young adults who signed up for DACA should concentrate the minds of the Republican leadership to allow legislation to move forward. 
One of the complications in the House is the “Hastert Rule,” named after former Speaker Dennis Hastert, an informal practice that bars the speaker from bringing a bill to the floor unless a majority of the majority party supports it.
To be sure, Trump’s rhetoric and his appointment of hard-line immigration restrictionists to posts in his administration have led to fear among immigrants, as have the administration’s erratic, irrational enforcement policies. Previous administrations, including Barack Obama’s, gave priority to detaining and deporting aliens convicted of serious crimes, but in one of his first executive orders and Department of Homeland Security memoranda, Trump broadened the priorities for detention and removal to include anyone even suspected of committing a crime, with or without charges or conviction. As a result, arrests for immigration offenses have increased under Trump and have swept up hundreds of individuals who pose no threat to safety or security, some picked up outside their children’s schools or when seeking court orders against domestic abuse. Actual deportations, on the other hand, are down slightly in Trump’s first eight months compared with the same period in Obama’s last year. This is largely because the overloaded system isn’t equipped for mass deportation. Trump promised to rid the country of a greatly exaggerated 2 million criminal aliens and “a vast number of additional criminal illegal immigrants who have fled or evaded justice.” But his boasting that “their days on the run will soon be over” has always been aimed less at promoting sensible immigration policy than at stoking nativist anger in pursuit of his own brand of identity politics. Trump’s America will be a less welcoming place for immigrants—legal as well as illegal—if Trump gets his way on proposed legislation to reduce legal immigration by half over the next decade. But labor shortages and an aging population make it unlikely that Trump’s efforts will succeed. The simple fact is that we need more, not fewer, immigrants if the economy is to grow. 
Building walls and deporting workers is exactly the wrong way to go about needed immigration reform, whether Trump and his hard-core base can admit it or not.
Linda Chavez is the president of the Becoming American Institute and a frequent contributor to Commentary.
On Infrastructure By Philip Klein
A massive infrastructure bill was supposed to be one of the early triumphs of President Trump’s administration. Instead, Trump’s inability to advance the ball on one of his signature issues has highlighted the lack of focus, inattention to detail, and difficulties working with Congress that are emblematic of his presidency to date.
The idea of rebuilding the nation’s infrastructure, though overshadowed by daily controversies during the wild 2016 campaign, wove together several elements of the Trump phenomenon.
His experience in building projects such as luxury hotels, resorts, skyscrapers, and golf courses became central to his argument that he had the skills required to get things done in Washington. By touting the economic benefits of infrastructure during his campaign, Trump also signaled that he was an unorthodox Republican, breaking with decades of conservative critiques of Keynesian stimulus projects. Trump also spoke of infrastructure in nationalist terms, integrating it into riffs about how the United States was constantly losing to China. “They have trains that go 300 miles per hour,” he said during the campaign. “We have trains that go: Chug. Chug. Chug.”
When Trump pulled off his election-victory upset, Washington insiders quickly focused on infrastructure as one issue on which he could get a legislative win and box Democrats into a corner. After all, could Democrats really resist passing a major policy priority that had eluded them when one of their own was in the White House?
In his Inaugural Address, Trump threw a jab at Bush-era Republicanism, declaring that the U.S. “spent trillions of dollars overseas while America’s infrastructure has fallen into disrepair and decay.” Going forward, he said, “America will start winning again, winning like never before.” He promised: “We will build new roads, and highways, and bridges, and airports, and tunnels, and railways all across our wonderful nation.”
Now in the fall of the first year of his presidency, any effort to advance infrastructure legislation has been drowned out by daily controversies involving White House intrigue, the investigation into Russian influence in the 2016 election, and Trump’s raucous Twitter feed. Congress, meanwhile, spent much of the year focused on repealing and replacing Obamacare.
This isn’t to say that the Trump administration didn’t try, in fits and starts, to push infrastructure. In May, with the release of his first budget, Trump included $200 billion in funding for infrastructure as the first step in his $1 trillion infrastructure initiative. He also released a six-page fact sheet outlining his vision for infrastructure, which remains the most detailed resource on his infrastructure goals.
The document, broadly speaking, argues that current infrastructure money is spent inefficiently. It proposes greater selectivity in using federal dollars for infrastructure investments that are in the national interest and recommends giving state and local governments more leeway over their own projects. It also calls for more public-private partnerships.
Specifically, the proposal would create a nongovernment entity to manage the nation’s air-traffic-control system. It would also support private rest stops, give states the ability to work with private companies to manage their toll roads, and streamline the environmental-review process. The proposal received little attention, as it was rolled out during a week when Russia hearings took center stage in Congress and Trump was traveling in Europe and the Middle East.
Such inattention was supposed to end in early June, when White House officials announced “Infrastructure Week.” This was a carefully orchestrated campaign in which Trump was supposed to deliver speeches and lead staged events to highlight different aspects of his infrastructure initiative. But during this week, Washington was captivated by testimony of fired FBI Director James Comey, and Trump veered way off message in his speeches and on his favorite social-media platform.
He went on a Twitter tear. Trump attacked his own Justice Department for pursuing a “watered down” travel ban, took a shot at the mayor of London in the wake of a terrorist attack, unloaded on “fake news” outlets, and hit Comey as a liar. During a speech meant to make the case for both parties to get behind his infrastructure effort, Trump went off on a tangent, blasting Democrats as “obstructionists” on health care.
In truth, any hope of getting Democrats on board for the Trump infrastructure push had been fading even before this implosion. Liberals had already pressured lawmakers to pursue a policy of total resistance to Trump. But during Trump’s big policy push, Senate Minority Leader Chuck Schumer declared overtly that Democrats had no appetite for his infrastructure initiative due to its reliance on privatization.
Before long, the phrase “Infrastructure Week” had become a punch line—an ironic metaphor for a presidency gone off the rails.
Trump has made little progress on infrastructure since then, beyond issuing an executive order in August aimed at making the permitting process for building roads, bridges, and pipelines more efficient. But again, this announcement was overshadowed, as it came during the same news conference in which he blamed “both sides” for the violence in Charlottesville and complained about the slippery slope of removing the Robert E. Lee statue.
On the other hand, by striking a deal with Democratic leaders on the debt ceiling and negotiating with them on immigration, Trump has revived talk about the possibility that he could be ready to compromise with them to get infrastructure legislation passed as well. It is important to note, however, that in both cases—DACA and the debt ceiling—there was a ticking-time-bomb element that forced action. No such urgency exists when it comes to infrastructure.
From the perspective of a limited-government conservative, Trump’s inability thus far to negotiate a trillion-dollar federal infrastructure package with Democrats is nothing to shed tears about. But if we’re looking at the issue through the broader lens of whether or not Trump has been able to deliver on his ambitious campaign promises and make the transition from being a bombastic reality-television star to governing, it’s a case study in failure.
Philip Klein is managing editor of the Washington Examiner.
On NATO By Tod Lindberg
On the campaign trail, Donald Trump was unsparing in his disparagement of U.S. alliances. In a word, allies were freeloaders—complacent in their reliance on the United States to provide them security, contributing nothing like their “fair share” of the cost of their defense, and lavishing the dividend on their domestic needs. Maybe that was acceptable when they were flat on their backs after a war that left the United States on top, but now that they are prospering and the United States has pressing needs of its own, it’s time for the allies to pay up. He also mused about NATO being “obsolete.”
This was alarming (to put it mildly) to most American foreign-policy specialists—to say nothing of the reaction of U.S. allies. The postwar alliance structure in Europe has been the backbone of security on a continent where the United States fought two wars. The North Atlantic Treaty Organization underpinned the postwar revival of Western Europe and subsequently, after the collapse of the Warsaw Pact and the demise of the Soviet Union, of Central and Eastern Europe. The relevance of the alliance has gained renewed salience with Russia’s aggression against its neighbors, first in Georgia in 2008, then in Ukraine in 2014.
At the heart of the alliance is Article 5 of the Washington Treaty of 1949—the commitment of each member to regard an armed attack on any as an attack on all. In practical terms, the meaning of Article 5 is that American power provides a security guarantee for Europe, a commitment upheld and explicitly reiterated by U.S. presidents since Harry S. Truman. The treaty is binding, yet equally in practical terms, it is the American president whose commander-in-chief powers will dictate the response of the U.S. military to any attack—and by extension, the sincerity of his commitment determines the deterrent value of Article 5 against potential aggressors. Would a President Trump abrogate the U.S. commitment? Or hold it hostage to defense-spending increases by allies—perhaps even by demanding the payment of a much larger past-due bill, as the candidate suggested on at least one occasion?
In Asia, the biggest long-term challenge is the rise of China; the U.S. alliances with Japan, South Korea, Australia, and the Philippines (as well as the more complicated commitment enshrined in the Taiwan Relations Act) represent the underpinning of Pacific security. Would this, too, be up for grabs under Trump? Was “America First” shorthand for an isolationist retooling of U.S. relations with the rest of the world?
The short answer to these questions turns out to be no. Trump has no apparent intention to do away with U.S. alliance relationships, however cumbersome and expensive he perceives them to be, and he evinces no intention to try to replace the postwar security architecture with something new and different, whatever that might be.
So what happened? Were his many critics sounding the alarm therefore wrong about his intentions? Did he change his mind? Is the question of alliances now settled?
Since Trump has taken office, alliance policy seems to have operated on two tracks within the U.S. government. The first track is the president’s own. He has continued to warn allies that they need to pay up—though his demands have moderated considerably, coalescing around the 2 percent of GDP that allies have pledged to spend on defense (though very few do). And although he has reaffirmed the U.S. Article 5 commitment on some occasions, on others when it would have been appropriate for him to do so, he has declined, apparently intentionally. Still, he has never repudiated the commitment. There seem to be two possibilities here: either a deliberate exercise in ambiguity, or incompetence and confusion of the kind his critics have long diagnosed.
I think the evidence points distinctly toward the former. That evidence is the second track of policy within the government. Vice President Mike Pence, Secretary of State Rex Tillerson, and Secretary of Defense James Mattis—as well as officials junior to them—have been on something close to a nonstop reassurance tour of U.S. allies and partners since the beginning of the administration. National Security Adviser H.R. McMaster has joined the chorus since he stepped in to replace the ousted Michael Flynn. Their message has been unambiguous: The United States stands by its security and alliance commitments, and allies must contribute more to collective defense. True, some allies continue to harbor doubts centered on the persona of Trump. Yet—therefore?—many are moving to spend more on defense.
Now, the simple fact is that Trump could order his Cabinet members and senior staff to desist from repeating the first half of their message—the reassurance. Trump might have had some resignations to cope with, but it is well within his power to issue such an edict, and he hasn’t done so. The most likely reason he hasn’t is that he has concluded that too much is riding on these alliances. To continue in this speculative vein, what Trump knew to be true about U.S. allies during the campaign season was that they weren’t contributing enough; that’s a message that Washington has been sending with little effect for decades. What he didn’t know on the campaign trail and has since determined is how central these alliances are to U.S. national security. U.S. alliances aren’t quite so fragile as some feared. The case for them, competently made by the likes of Mattis, must be compelling, including to the skeptic in chief.
It’s here that we may be getting a little lesson in the cunning of history. From his skeptical premise, Trump sparked a very broad debate over alliances. Senior officials of his administration have probably devoted more time and energy to making the public case for NATO and our Pacific alliances during his first 10 months in office than their predecessors did in the previous 10 years. The latter had taken the utility of alliances to U.S. national security as a given.
All this attention has had an effect on public opinion. But the effect has not been, as many feared, a groundswell of support for isolationist or anti-alliance sentiment. Just the opposite. For the past three years, the Chicago Council Survey has asked, “How effective do you think [maintaining effective alliances is] to achieving the foreign-policy goals of the United States?” In 2015, 32 percent of all respondents responded “very effective.” In 2016, the figure was 40 percent. In 2017? Forty-nine percent. Specifically on NATO, 69 percent say the alliance is “essential” to U.S. security, a slight increase from 65 percent in 2016 and well above the 57 percent who said the same when the Chicago Council first asked the question in 2002.
For the first time in the history of the survey, a majority of Americans, 52 percent, say they would support “the use of U.S. troops…if Russia invades a NATO ally like Latvia, Lithuania, or Estonia.” The Trump administration has had little to say about the Russian threat to the Baltics but a great deal to say about the danger of North Korea’s nuclear weapons and missile program. A year ago, 47 percent said they would favor “the use of U.S. troops…if North Korea invaded South Korea.” That was the view of 26 percent of Americans in 1990. Today, it’s what 62 percent think.
Finally, on the question of allies paying up, the survey asked which comes closer to the respondent’s views: “The United States should encourage greater allied defense spending through persuasion and diplomatic means” or “The United States should withhold its commitment to defend NATO members” until they actually spend more. Overall, 59 percent said persuasion and diplomacy; 38 percent (including 51 percent of Republicans) would put Article 5 at risk. Maybe I’m hearing things, but that sounds to me more like a warning to our allies to take seriously American insistence that they spend more on defense starting now than it does an abrogation of the commitments at the center of U.S. national-security strategy for 70 years.
Tod Lindberg is a member of the Chicago Council Survey’s foreign policy advisory board.
On Asia By Michael Auslin
Despite continued Russian threats in Eastern Europe and the lurking danger of an Iranian race to a nuclear bomb, it is Asia that has vaulted to the top of the national-security agenda. Barack Obama had warned Donald Trump that North Korea would be the major national-security threat he would face, and North Korean dictator Kim Jong Un has proved him right. Kim is on the threshold of fielding a reliable intercontinental ballistic missile (ICBM) that can reach U.S. territory in the Pacific and even the American homeland. He is within striking distance of achieving his family’s long-held dream of possessing the ultimate weapon. Not since 1994, when Bill Clinton initially ordered and then called back an air strike on Pyongyang’s nascent nuclear facilities, has the region seemed so close to war.
Beyond the Korean peninsula, Asia has arguably been Trump’s central foreign preoccupation since his entry into politics. He talked during his campaign about a 45 percent tariff on Chinese goods. And despite his noninterventionist affect, he began his transition phase by getting tough on China for its increasingly assertive actions during the Obama years, including the successful building and militarization of islands in contested waters in the South China Sea.
Then Trump retreated from his tough stance toward Beijing, initiating a period of seesawing between cooperation and confrontation and mixing together trade and economic concerns with security and diplomatic issues. His explicit linkage of the two, carefully separated by previous presidents, has been particularly unnerving to Beijing. China’s regime has warned of the risks of a larger trade war if Trump continues to threaten economic retaliation for disagreement on security issues. Of equal concern to Beijing has been his recent willingness to permit more frequent freedom-of-navigation operations by the U.S. Navy in the disputed South China Sea waters off the Spratly and Paracel Islands.
Trump’s initial hard line, including an unprecedented transition-period phone call to Taiwan’s president, put Beijing on its back foot. But his subsequent inconstancy has led to a reassertion of Chinese activism on economic and diplomatic issues. His withdrawal from the Trans-Pacific Partnership and general anti-free-trade stance have allowed Chinese President Xi Jinping to claim the mantle of global economic leadership—promoting free-trade alternatives and grandiose policies such as the “Belt and Road Initiative,” in which Xi has promised more than $1 trillion of infrastructure investment to link the world in a trading network centered in China.
In contrast, Trump’s relations with America’s Asian allies, particularly Japan and South Korea, have been surprisingly smooth. Again backing down from campaign rhetoric, Trump early on reaffirmed the importance of both alliances, and buried talk of making the two pay more for hosting U.S. forces on their territory. His bond with Japanese Prime Minister Shinzo Abe has been particularly close, and his conversations with South Korea’s new left-leaning president, Moon Jae In, have gone better than some expected. Far from scaling back the alliances, Trump and his top officials, including Secretary of Defense James Mattis, have put them at the center of American strategy in the Pacific, especially with respect to North Korea.
It is North Korea, however, that remains the first great test of the Trump administration. Trump clearly inherited a failed policy, stretching over past Democratic and Republican administrations alike, and was doubly cursed in coming to office on the eve of Kim Jong Un’s nuclear and ICBM breakout.
Yet despite Trump’s heated rhetoric, he and his team have actually moved cautiously on North Korea. Like its predecessors, the administration has combined shows of force, such as flying B-1 bombers over the peninsula, with appeals to the United Nations for further sanctions on Pyongyang. Two new rounds of sanctions, in July and September, may indeed have been harder than those previously levied, but, just as in the past, the administration had to settle for less than it wanted.
More worrying, Trump appears to be adopting the long-held goal of presidents past: North Korean denuclearization. This is a strategic mistake that threatens to lock him into an unending series of negotiations that have served over the past quarter-century to buy time for Pyongyang to develop its nuclear and missile capabilities.
I believe it would be a far more realistic move for Trump to drop the chimera of denuclearization and instead tacitly acknowledge that North Korea is a nuclear-weapons-capable state. This would free up the administration to focus on the far more important job of deterring and containing a nuclear North Korea. Since Trump is almost certain to avoid a preventive war to remove Kim’s nuclear weapons, given the associated military and political risks, he will be forced in the end to accept them. That then mandates a credible and comprehensive policy to restrict North Korea’s actions abroad while making clear that any nuclear use will result in a devastating counterstrike.
Washington has been deterring North Korea ever since the end of the Korean War. This new approach explicitly makes deterrence the center of U.S. policy, dropping the unobtainable goal of denuclearization or the imprudent goal of normalizing relations with North Korea. To be successful, Trump will need to get the support of both Seoul and Tokyo, which is a tall order. The alternative, however, is another round of Kabuki negotiations and the diversion of U.S. attention from the far more necessary task of ensuring that Kim Jong Un is kept in his nuclear box.
Michael Auslin is the Williams-Griffis Fellow in Contemporary Asia at the Hoover Institution, Stanford University, and the author of The End of the Asian Century (Yale).
On Israel By Daniella J. Greenbaum
As a candidate, Donald Trump took positions on Israel that were a blend of incoherence and inconsistency. He was an isolationist, except he was also Israel’s biggest supporter; he would enforce the Iran deal, except he wanted to rip it up on day one; he was the most pro-Israel candidate on the stage, except that he wanted to be “the neutral guy”; he wouldn’t commit to a policy on Jerusalem, except he declared his plan to immediately move the American Embassy to Israel’s eternal and undivided capital.
Words—especially a president’s—matter, but until Trump took office, it was impossible to predict how his administration would treat the Jewish state. Some Israel advocates became convinced that Trump’s victory would lead to the fulfillment of their bucket list of Middle East dreams—in particular, resolution of the long-simmering issue involving the location of the U.S. Embassy in Israel. The Jerusalem Embassy Act, which became law in 1995, recognized that “each sovereign nation, under international law and custom, may designate its own capital” and that “since 1950, the city of Jerusalem has been the capital of the State of Israel.” It ordered that “the United States Embassy in Israel should be established in Jerusalem no later than May 31, 1999.”
And yet, despite all that, the American Embassy has remained in Tel Aviv. (Presidents were given the power to push the date back on national-security grounds.) Much like then-candidates Bill Clinton and George W. Bush, Trump pledged to move the embassy if elected president. In a March 2016 speech to the American Israel Public Affairs Committee’s Policy Conference, Trump said unequivocally: “We will move the American Embassy to the eternal capital of the Jewish people, Jerusalem.”
The American Embassy belongs in Jerusalem, and Trump’s evolution on the issue was, for the most part, encouraging. (Early on in his candidacy, he was booed at the Republican Jewish Coalition’s annual meeting after refusing to take a position on Jerusalem’s status.) But for Israelis, who face myriad threats on a daily basis—both physically, from their many hostile neighbors, and economically, through an international boycott, divestment, and sanctions campaign—the location of the embassy ranks low on the list of urgent political matters. Even the most ardent proponents of this policy shift acknowledge it has the potential to inflame tensions in the region. Like his predecessors, Trump signed the waiver and suspended the move.
Next on the bucket list: discarding Barack Obama’s cataclysmic Iran deal. When Trump was a candidate, his intentions for the Joint Comprehensive Plan of Action (JCPOA) were anything but clear. He told AIPAC, “My number-one priority is to dismantle the disastrous deal with Iran.” But he also said, “We will enforce it like you’ve never seen a contract enforced before folks, believe me.” It’s hard to know which part of his schizophrenic speech the audience—and the country—was supposed to believe. The schizophrenia has continued during his tenure, with Trump certifying the Iran deal twice before announcing in October his decision not to recertify a third time. Despite signaling his extreme displeasure with the deal, Trump has so far opted not to terminate it. But, by refusing to recertify, he has instead left to Congress the decision whether or not to reimpose sanctions.
Most important, perhaps, to pro-Israel forces was Trump’s choice of foreign-policy team. While Jared Kushner’s lack of political experience made him an odd choice for Middle East maven—Trump exclaimed at an inauguration event: “if [he] can’t produce peace in the Middle East, nobody can”—there is no denying that Kushner is a Zionist. Along with Jason Greenblatt, Trump’s envoy to the Israeli–Palestinian peace process, Kushner visited Israel this summer to determine whether restarting peace talks was a viable course of action. The duo have articulated their desire to refrain from repeating the mistakes of previous administrations: “It is no secret that our approach to these discussions departs from some of the usual orthodoxy. … Instead of working to impose a solution from the outside, we are giving the parties space to make their own decisions about the future,” Greenblatt explained. Maybe that’s why Benjamin Netanyahu seems so elated. Bibi’s friction with Obama was well documented, and the prime minister has expressed his jubilation at the changed nature of his relationship to Washington. During the United Nations General Assembly, he tweeted: “Under your leadership, @realDonaldTrump, the alliance between the United States and Israel has never been stronger.”
During the campaign, it was hard to imagine that might be the case. Trump’s repeated use of the phrase “America First,” a classic isolationist trope with anti-Semitic overtones, was deeply concerning to pro-Israel voters. He continually insisted that foreign governments were a drain on the American economy: “I want to help all of our allies, but we are losing billions and billions of dollars. We cannot be the policemen of the world. We cannot protect countries all over the world…where they’re not paying us what we need.” According to a 2016 report from the Congressional Research Service, “Israel is the largest cumulative recipient of U.S. foreign assistance since World War II.” The report calculates that the United States has, over the years, provided Israel with more than $127 billion in bilateral assistance. If words and campaign promises meant anything to Trump, the candidate who insisted that Israel could pay “big league” would have metamorphosed into the president who ensured that it did.
But Trump’s campaign promises seem to have had no bearing on his actions. In an appropriations bill, Congress pledged an extra $75 million in aid to Israel, on top of the annual $3.1 billion already promised for this year. As part of negotiations for the 2016 Memorandum of Understanding, the Israeli government promised to return any funds that surpassed the pre-negotiated aid package. In what was doubtlessly a major disappointment to Trump’s America-first base, the State Department confirmed it will not be asking the Israelis to return the additional funds.
His behavior toward Israel during his eight months in office has confirmed what was evident throughout the campaign: Donald Trump’s words and actions have, at best, a haphazard relationship to each other. So far Israel has benefited. That may not always be the case.
Daniella J. Greenbaum is assistant editor of Commentary.
Of Hobbes and Harvey Weinstein
In man’s natural state, with no social or religious order to impose limits upon his hungers and passions, “notions of right and wrong, justice and injustice have there no place. Where there is no common power, force and fraud are…the cardinal virtues.” Thus did Thomas Hobbes, in 1651, anticipate and describe the sordid story of the film producer Harvey Weinstein.
The reason Weinstein’s three decades of monstrous personal and professional conduct are so appalling and fascinating in equal measure is that he was clearly functioning outside the “social compact” Hobbes said was necessary to save men from a perpetual state of war they would wage against one another in the state of nature. For that is what Weinstein was doing, in his own way: waging Hobbesian war against the women he abused and finding orgasmic pleasure in his victories.
And Weinstein did so while cleverly pretending to leadership within the social compact and disingenuously advocating for its improvement both through political change and artistic accomplishment. Hobbes said the life of man in the state of nature was nasty, brutish, and short, but he did not say the warrior could not be strategic. Rochefoucauld’s immortal declaration that hypocrisy is the tribute vice pays to virtue is entirely wrong in this case. Weinstein paid off feminists and liberals to extend his zone of protection and seduction, not to help support the virtues he was subverting with his own vices.
Hobbes said that in the state of nature there was “no arts; no letters; no society.” But if the man in the state of nature, the nihilistic warrior, coexists with people who live within the social compact, would it not be a brilliant strategy to use the arts, letters, and society as cover, and a means of infiltrating and suborning the social compact? Harvey Weinstein is a brutal thug, a man of no grace, more akin to a mafioso than a maker of culture. And yet as a movie producer he gravitated toward respectable, quality, middlebrow, elevated and elevating fare. People wanted to work with him because of the kinds of movies he made. I think we can see that was the whole point of the exercise: It was exciting to be called into his presence because you knew you would do better, more socially responsible, more praiseworthy work under his aegis than you would with another producer.
And then, garbed only in a bathrobe, Weinstein would strike.
Weinstein was universally known to be a terrible person long before the horrifying tales of his sexual predation, depredation, and assault were finally revealed. And—this is important—known to be a uniquely terrible person. His specific acts of repugnant public thuggishness were detailed in dozens of articles and blog items over the decades, and were notable precisely because they were and are not common currency in business or anywhere else. It was said of him after the latest revelations that he had mysterious abilities to suppress negative stories about himself, and perhaps he did; even so, it was a matter of common knowledge that he was the most disgusting person in the movie business, and that’s saying a lot. And that’s before we get to sex.
To take one example, Ken Auletta related a story in the New Yorker in 2001 about the director Julie Taymor and her husband, the composer Eliot Goldenthal. She had helmed a movie about Frida Kahlo produced by Weinstein. There was a preview screening at the Lincoln Square theater in Manhattan. The audience liked it, but some of its responses indicated that the plotline was confusing. Weinstein, whose hunger to edit the work of others had long since earned him the name “Harvey Scissorhands,” wanted to recut it to clarify the picture. Taymor didn’t, citing the audience’s favorable reaction. Then this happened:
He saw Taymor’s agent…and yelled at him, “Get the fuck out of here!” To Goldenthal, who wrote the score for Frida, Weinstein said, “I don’t like the look on your face.” Then, according to several witnesses, he moved very close to Goldenthal and said, “Why don’t you defend her so I can beat the shit out of you?” Goldenthal quickly escorted Taymor away. When asked about this incident, Weinstein insisted that he did not threaten Goldenthal, yet he concedes, “I am not saying I was remotely hospitable. I did not behave well. I was not physically menacing to anybody. But I was rude and impolite.” One member of Taymor’s team described Weinstein’s conduct as actually bordering on “criminal assault.”
Weinstein told the late David Carr in 2002 that his conduct in such cases had merely been the result of excess glucose in his system, that he was changing his diet, and he was getting better. That glucose problem was his blanket explanation for all the bad stories about him, like this one:
“You know what? It’s good that I’m the fucking sheriff of this fucking lawless piece-of-shit town.” Weinstein said that to Andrew Goldman, then a reporter for the New York Observer, when he took him out of a party in a headlock last November after there was a tussle for Goldman’s tape recorder and someone got knocked in the head.
Goldman’s then-girlfriend, Rebecca Traister, asked Weinstein about a controversial movie he had produced. Traister provided the predicate for this anecdote in a recent piece: “Weinstein didn’t like my question about O, there was an altercation…[and] he called me a c—.”
Auletta also related how Weinstein physically threatened the studio executive Stacey Snider. She went to Disney executive Jeffrey Katzenberg and told him the story. Katzenberg, “one of his closest friends in the business,” told Weinstein he had to apologize. He did, kind of. Afterward, Katzenberg told Auletta, “I love Harvey.”
These anecdotes are 15 years old. And there were anecdotes published about Weinstein’s behavior dating back another 15 years. What they revealed then is no different from what they reveal now: Weinstein is an out-and-out psychopath. And apparently this was fine in his profession…as long as he was successful and important, and the stories involved only violence and intimidation.
Flash-forward to October 2017. Katzenberg—the man who loved Harvey—publicly released an email he had sent to Weinstein after he was done for: “You have done terrible things to a number of women over a period of years. I cannot in any way say this is OK with me…There appear to be two Harvey Weinsteins…one that I have known well, appreciated, and admired and another that I have not known at all.”
So which Weinstein, pray tell, was the one from whom Katzenberg had had to protect Stacey Snider? The one he knew or the one he didn’t know? Because they are, of course, the same person. We know that sexual violence is more about power than sex—about the ultimate domination and humiliation. In these anecdotes and others about Weinstein, we see that his great passions in life were dominating and humiliating. Even if the rumors hadn’t been swirling around his sexual misconduct for decades, could anyone actually have been surprised he sought to secure his victory over the social compact in the most visceral way possible outside of murder?
The commentariat’s reaction to the Weinstein revelations has been desperately confused, and for once, the confusion is constructive, because there are strange ideological and moral convergences.
The most extreme argument has it that he’s really not a unique monster, that every working woman in America has encountered a Weinstein, and that the problem derives from a culture of “toxic masculinity.” This attitude is an outgrowth of the now-fashionable view that there have been no real gains for women and minorities over the past half-century, that the gains are illusory or tokenish, and that something more revolutionary is required to level the playing field.
In the Weinstein case, as a matter of fact, this view is false. Women have indeed encountered boors and creeps in their workplaces. But a wolf-whistler is not a rapist. Someone who leers at a woman isn’t the same as someone who masturbates in front of her. Coping with grotesque and inappropriate co-workers and bosses is something every human being, regardless of gender, has had to deal with, and will have to deal with until we are all replaced by robots. It’s worse for women, to be sure. Still, no one should have to go through such experiences. But we all have and we all do. It’s one of the many unpleasant aspects of being human.
Still, the extreme view of “toxic masculinity” contains a deeper truth that is anything but revolutionary. It takes us right back to Hobbes. His central insight—indeed, the insight of civilization itself—is that every man is a potential Weinstein. This clear-eyed, even cold-eyed view of man’s nature is the central conviction of philosophical conservatism. Without limits, without having impressed upon us a fear of the legal sanction of punishment or the social sanction of shame and ostracism, we are in danger of seeking our earthly rewards in the state of nature.
The revolutionary and the conservative also seem to agree there’s something viscerally disturbing about sex crimes that sets them apart. But here is where the consensus between us breaks down. Logically, if the problem is that we live in a toxic culture that facilitates these crimes, then the men who commit them are, at root, cogs in an inherently unjust system. The fault ultimately is the system’s, not theirs.
Harvey Weinstein is an exceptionally clever man who spent decades standing above and outside the system, manipulating it and gaming it for his own ends. He’s no cog. Tina Brown once ran Weinstein’s magazine and book-publishing line. She wrote that “strange contracts pre-dating us would suddenly surface, book deals with no deadline attached authored by attractive or nearly famous women, one I recall was by the stewardess on a private plane.” Which means he didn’t get into book publishing, or magazine publishing, to oversee the production of books and articles. He did it because he needed entities through which he could pass payoffs both to women he had harassed and molested and to journalists whose silence he bought through options and advances. His primary interest wasn’t in the creation of culture. It was the creation of conditions under which he could hunt.
Which may explain his choice of the entertainment industry in the first place. In how many industries is there a specific term for demanding sexual favors in exchange for employment? There’s a “casting couch”; there’s no “insurance-adjustor couch.” In how many industries do people conduct meetings in hotel rooms at off hours anyway? And in how many industries could that meeting in a hotel room end up with the dominant player telling a young woman she should feel comfortable getting naked in front of him because the job for which she is applying will require her to get naked in front of millions?
Weinstein is entirely responsible for his own actions, but his predatory existence was certainly made easier by the general collapse of most formal boundaries between the genders. Young women were told to meet him in private at night in fancy suites. Half a century earlier, no young woman would have been permitted to travel alone in a hotel elevator to a man’s room. The world in which that was the norm imposed unacceptable limitations on the freedoms of women. But it did place serious impediments in the paths of predators whose despicable joy in life is living entirely without religious, spiritual, cultural, or moral impediment.
Hobbes was the great philosopher of limits. We Americans don’t accept his view of things; we tend to think better of people than he did. We tend to believe in the greater good, which he resolutely did not. We believe in self-government, which he certainly did not. But what our more optimistic outlook finds extraordinarily difficult to reckon with is behavior that challenges this complacency about human nature. We try to find larger explanations for it that place it in a more comprehensible context: It’s toxic masculinity! It’s the residue of the 1960s! It’s the people who enabled it! The truth is that, on occasion—and this is one such occasion—we are forced to come face to face with the worst of what any of us could be. And no one explanation suffices save Hamlet’s: “Use every man after his desert, and who should ’scape whipping?”
The education-reform outfit’s hard-left shift
In remaking itself, TFA has subtly downgraded the principles that had won it allies across the spectrum. George W. Bush, Mitch McConnell, John Cornyn, Chris Christie, and Meg Whitman are a few of the Republicans who championed TFA. The group attracted such boldface names, and hundreds of millions of dollars from some of the largest American firms and philanthropies, because it stood for a simple but powerful idea: that teacher quality is the decisive factor in the educational outcomes produced by schools.
Judging by its interventions in recent debates, it isn’t all that clear that senior TFA executives still believe this. These days, TFA’s voice on charters, accountability, and curricular rigor is decidedly muffled. Such education-reform essentials have been eclipsed in TFA’s discourse by immigration, policing, “queer” and transgender-identity issues, and other left-wing causes. TFA’s message seems to be that until numerous other social ills are cured—until immigration is less restricted, policing becomes more gentle, and poverty is eliminated—an excellent education will elude the poor. That was the status-quo defeatism TFA originally set out to challenge.
Wendy Kopp conceived TFA when she was a senior at Princeton in 1989. Unable to get a New York City teaching job without a graduate degree and state certification, Kopp wrote a thesis calling for the creation of a nontraditional recruitment pipeline that would bring America’s most promising young people to its neediest classrooms. TFA members would teach for two years, applying their energy and ambition to drive achievement at the classroom level. She speculated that some would stay in education, while others would go on to careers in law, medicine, business, journalism, etc. But all would remain “lifelong leaders in the effort to end educational inequity.”
The following year, Kopp launched TFA with a corps of 489 new teachers who were dispatched to schools in six regions—a virtuoso feat of social entrepreneurship. Since then some 50,000 teachers have completed the program. This year’s corps counts around 6,400 members, serving 53 regions from coast to coast.
By the time I joined, in 2005, TFA had distilled the experience of its best corps members into a theory of educational transformation called “Teaching as Leadership.” Most people, it said, aren’t natural-born educators. But they could rise to classroom greatness by setting “big goals” for all students, planning engaging lessons, continually assessing their students, maintaining tough discipline, and investing parents and the wider community in their goals.
Mostly, great teachers work hard—really hard. TFA brought the work habits usually associated with large law firms and high-end management consultancies to America’s K–12 failure factories. Its “summer institute” for new recruits was a grueling ordeal of tears, sweat, and 16-hour days. When I was a corps member, we were told that this is what it would take to overcome the forces of the status quo, which were chronically low expectations; broken homes and criminality in the streets; messy, undisciplined classrooms; and bloated bureaucracies that put the needs of adults above those of children.
The TFA worldview diverged sharply from the one that predominated in the education industry. The leading lights of the profession held that the achievement gap was a product of inadequate funding and larger social inequalities. Thus they transferred blame for classroom outcomes from teachers to policymakers and society at large. Teachers’ unions were particularly fond of this theory, since it provided cover for resisting accountability and high expectations.
TFA raged against all this. The assumption that some kids were doomed to underachievement was wrong and, indeed, bigoted. Ditto for the notion that inner-city children couldn’t be expected to behave like young scholars. These children could pull themselves up, provided they had dedicated educators who believed in them. This wasn’t to say that external factors were discounted altogether. But TFA concentrated on the things that educators and school leaders could control. It would emphasize self-help and uplift. And it would accept friends and allies across political divides to fulfill the promise of educational equality.

Today’s Teach for America is a different story. TFA’s leaders have now fully enlisted the organization in the culture war—to the detriment of its mission and the high-minded civic sensibility that used to animate its work.
This has been most visible in TFA’s response to the 2016 election. TFA chief executive Elisa Villanueva Beard, who took over from Kopp four years ago, doesn’t bother to mask either her progressivism or her revulsion at the new administration. When, a couple of weeks after the election, the president-elect announced his choice of Betsy DeVos to lead the Department of Education, Beard’s response was swift and cold.
A November 23 TFA news release began by decrying Trump’s “indisputably hostile and racially charged campaign” and called on DeVos to uphold “diversity, equity, and inclusiveness.” The statement went on to outline 11 TFA demands. Topping the litany was protection of the previous administration’s Deferred Action for Childhood Arrivals, or DACA, program, which granted legal status to certain illegal immigrants brought into the country as children. Then came the identity-politics checklist: “SAFE classrooms for LGBTQ youth and teachers,” “safe classrooms for students and teachers with disabilities,” “safe classrooms for Muslim students and teachers,” “culturally responsive teaching,” and so on.
Of the 11 demands, only three directly touched core education-reform areas—high expectations, accountability, and data-driven instruction—and these were couched in the broadest terms possible. Most notably, there wasn’t a single kind word for DeVos: no well wishes, no hope of “working together to achieve common goals,” no call for dialogue, nothing but angry demands. This, even though the secretary-designee was a passionate charter advocate and came from the same corporate philanthropy and activism ecosystem that TFA had long inhabited.
It is true that inner-city educators were horrified at the election of a candidate who winked at David Duke and suggested that a federal judge’s Mexican heritage was disqualifying. TFA’s particular concern about DACA makes sense, since many corps members work with illegal-immigrant children in border states. (My own stint took me to the Rio Grande Valley region of South Texas.)
Even so, TFA’s allergic reaction to the Trump phenomenon reflects faulty strategic thinking. Beard isn’t Rachel Maddow, and TFA isn’t supposed to be an immigration-reform outfit, still less a progressive think tank. With Republicans having swept all three branches of the federal government, as well as a majority of statehouses and governors’ mansions, TFA must come to terms with the GOP. Condemning the new education secretary as barely legitimate wasn’t wise.
Beard is also making a grave mistake by attempting to banish legitimate conservative positions from the reform movement. In the wake of the bloody white-nationalist protests in Charlottesville, Virginia, she blasted an email to the organization that denounced in one breath opposition to affirmative action and “racist and xenophobic violence.” Some two-thirds of Americans oppose race-based affirmative action. Will these Americans give TFA a fair hearing on educational reform when the organization equates them with alt-right thugs? In a phone interview, Beard said she didn’t intend to link white nationalism with opposition to affirmative action.
As for DACA, the amount of attention TFA devotes to the fate of those affected is out of all proportion. TFA has a full-time director for DACA issues. A search of its website reveals at least 31 news releases, statements, and personal blogs on DACA—including a 2013 call for solidarity with “UndocuQueer students” that delved into the more exotic dimensions of intersectionality. As one education reformer told me in an interview, “They are super-concerned with ‘can’t wait’ issues—DACA and so on—and so much of their mental space [is filled up] by that kind of thing that less of their attention and time is being spent” on central priorities. “Personally, I think that’s such a shame.” (This reformer, and others I interviewed for this article, declined to speak on the record.)
By contrast, TFA didn’t call out Mayor Bill de Blasio on his attempts to roll back charter schools in New York. The organization has rarely targeted teachers’ unions the way it has ripped into Trump. But it is the National Education Association and the American Federation of Teachers that pose the main obstacle to expanding school choice and dismissing ineffective teachers. It is the unions that are bent on snuffing out data-driven instruction. It was a teachers’ union boss (Karen Lewis of Chicago), not the 45th president, who in 2012 accused TFA of supporting policies that “kill and disenfranchise children.”

Teach for America’s turn to the harder left predated Trump’s ascent, and it isn’t mainly about him. Rather, it tracks deeper shifts within American liberalism, from the meritocratic Clintonian ideas of the 1990s and early aughts to today’s socialist revival and the fervid politics of race, gender, and sexuality.
Culturally, TFA was always more liberal than conservative. Educators tend to be liberal Democrats, regardless of the path that brings them to the classroom. But education reformers are unwanted children of American liberalism. They are signed up for the Democratic program, but they clash with public-sector labor unions, the most powerful component of the party base.
As TFA went from startup to corporate-backed giant, it sustained withering attacks from leftist quarters. On her influential education blog, New York University’s Diane Ravitch (a one-time education reformer who changed sides) relentlessly hammered corps members as “woefully unprepared,” as scabs “used to take jobs away from experienced teachers,” as agents of “privatization” and the “neoliberal attack on the public sector.” It was Ravitch who publicized Lewis’s claim that TFAers “kill” kids.
Michelle Rhee, the Korean-American alumna who in 2007 was tapped as chancellor of the District of Columbia system, became a lightning rod for anti-TFA sentiment on the left. Rhee’s no-nonsense approach to failing schools was summed up in a Time magazine cover that showed her holding a broom in the middle of a classroom. When D.C. Mayor Adrian Fenty didn’t win reelection in 2010, it was seen as a popular verdict against this image of TFA-style reform.
In 2013, one university instructor, herself a TFA alumna, urged college professors not to write letters of recommendation for students seeking admission to the organization. Liberal pundits took issue with TFA’s alleged elitism and lack of diversity, portraying it as the latest in a long line of “effete” white reformist institutions that invariably let down the minorities they try to help. TFA, argued a writer in the insurgent leftist magazine Jacobin, is “another chimerical attempt in a long history of chimerical attempts to sell educational reform as a solution to class inequality. At worst, it’s a Trojan horse for all that is unseemly about the contemporary education-reform movement.” By “unseemly,” the writer meant conservative and corporate.
The assaults have had an effect. Applications to TFA dropped to 37,000 last year, down from 57,000 in 2013. Thus ended a growth spurt that had seen the organization increase the size of its corps by about a fifth each year since 2000. Partly this was due to more jobs and better salaries on offer to elite graduates in a rebounding private sector. But as Beard conceded in a statement in April 2016, partly it was the “toxic debate surrounding education” that was “pushing future leaders away from considering education as a space where they can have real impact.”
The temptation for any successful nonprofit crusade is to care more about viability and growth than the original cause. Wounded by the union-led attacks, TFA leaders have apparently concluded that identity politics and a progressive public presence can revive recruitment. With its raft of corporate donors and the massive Walton-family endowment, TFA would never fit in comfortably with an American liberalism moving in the direction of Bernie Sanders and Elizabeth Warren. But talk of Black Lives and “UndocuQueers” might help it reconnect with younger millennials nursed on race-and-gender theory.
Thus, TFA leads its current pitch by touting its diversity. Beard opened her keynote at last year’s 25th-anniversary summit in Washington by noting: “We are more diverse than we have ever been. . . . We are a community that is black, that is Latino, that is white, that is American Indian, that is Asian and Pacific Islander, that is multiracial. We are a community that is lesbian, gay, bisexual, queer and trans.” The organization’s first priority, Beard went on, will always be “to build an inclusive community.”
It makes sense to recruit diverse teachers to lead classrooms in minority-majority regions, to be sure. But one can’t help detecting a certain liberal guilt behind this rhetoric, as if TFA had taken all the attacks against it to heart: We aren’t elite, we swear! Yet the 90 percent of black children who don’t reach math proficiency by eighth grade need good math teachers, period. Their parents don’t care how teachers worship (if at all), what they look like, or what they get up to in the bedroom. They want teachers who will put their children on a trajectory out of poverty.
Minority parents, moreover, fear for their kids’ well-being in chaotic schools and gang-infested streets. Yet to hear many of the speakers at TFA’s summit, you would have thought that police and other authority figures represent the main threat to black and Hispanic children. At a session titled “#StayWoke,” a TFA teacher railed against the police:
I teach 22 second-graders in Southeast D.C., all of them students of color. Sixteen of them are beautiful, carefree black and brown boys, who, despite their charm and playfulness, could be slain in the streets by the power that be [sic], simply because of the color of their skin, what clothes they wear, or the music they choose to listen to.
Educators must therefore impart “a racial literacy, a literacy of resistance.” Their students “must grow up woke.” Another teacher-panelist condemned anti-gang violence initiatives that
come from the same place as the appetite to charge black and brown people with charges of self-destruction. The tradition of blaming black folk keeps us from aiming at real sources of violence. If we were really interested in ending violence, we would be asking who pulled the trigger to underfund schools in Philadelphia? Who poisoned our brothers and sisters in Flint, Michigan? Who and what made New Orleans the incarceration capital of the world? We would teach our students to raise these questions.
Throughout, he led the assembly in chants of “Stay Woke!”
Talk of teaching “resistance” represented a reversion to the radical pedagogy and racial separatism that left a legacy of broken inner-city schools in the previous century. TFA’s own experience, and that of TFA-linked charter networks such as the Knowledge Is Power Program, had taught reformers that, to thrive academically, low-income students need rigid structure and order. Racial resentment won’t set these kids up for success but for alienation and failure—and prison.
Another session, on “Academic Rigor, Social and Political Consciousness, and Culturally Relevant Pedagogy,” pushed similar ideas. Jeff Duncan-Andrade, an associate professor of “Raza studies” at San Francisco State University, urged teachers to develop an ultra-localized race-conscious curriculum:
Don’t even essentialize Oakland’s culture! If you’re from the town, you know it’s a big-ass difference between the west and the east [sic]. We talk differently, we walk differently, we dress differently, we speak differently. The historical elements are different. So if you use stuff from the west [of Oakland] you have to really figure out, ‘How do I modify this to be relevant to the communities I’m serving in East Oakland?’ Develop curriculum, pedagogy, assessment that is responsive to the community you serve. You gotta become an ethnographer. You gotta get on the streets, get into the neighborhoods and barrios…talk to the ancestors…
If your curriculum is not building pathways to self-love for kids who at every turn of their day are taught to hate themselves, hate the color of their skin, hate the texture of their hair, hate the color of their eyes, hate the language they speak, hate the culture they come from, hate the ‘hood that they come from, hate the countries that their people come from, then what’s the purpose of your schooling?
Other sessions included “Native American Community Academy: A Case Study in Culturally Responsive Pedagogy”; “What Is the Role of White Leaders?”; “Navigating Gender Dynamics”; “Beyond Marriage Equality: Safety and Empowerment in the Education of LGBTQ Youth”; “A Chorus of Voices: Building Power Together,” featuring the incendiary Black Lives Matter activist and TFA alumnus DeRay McKesson; “Every Student Counts: Moving the Equity Agenda Forward for Asian American and Pacific Islander Students”; “Intentionally Diverse Learning Communities”; and much more of the kind.
Lost amid all this talk of identitarian self-love was the educator’s role in leading poor children toward things bigger and higher than Oakland, with its no doubt edifying east–west street rivalries—toward the glories of the West and the civic and constitutional bonds that link Americans of all backgrounds. You can be sure that the people who participate in TFA see to it that their own children learn to appreciate Caravaggio and Shakespeare and The Federalist. The whole point of the organization was to ensure that kids from Oakland could do the same.
Twenty-seven years since Teach for America was founded, the group’s mission remains vital. Today fewer than 1 in 10 children growing up in low-income communities graduate college. The basic political dynamics of education reform haven’t changed: Teach for America, and the other reform efforts it has inspired, have shown what works. The question is whether Teach for America is still determined to reform schools and fight for educational excellence for all—or whether it wants to become a cash-flush and slick vehicle for the new politics of identity.
Review of 'iGen' by Jean Twenge
In 1954, scientists James Olds and Peter Milner ran some experiments on rats in a laboratory at McGill University. What they found was remarkable and disturbing. They discovered that if electrodes were implanted into a particular part of the rat brain—the lateral hypothalamus—rats would voluntarily give themselves electric shocks. They would press a lever several thousand times per hour, for days on end, and even forgo food so that they could keep pressing. The scientists discovered that the rats were even prepared to endure torture in order to receive these shocks: The animals would run back and forth over an electrified grid if that’s what it took to get their fix. They enjoyed the shocks so much that they endured charring on the bottoms of their feet to receive them. For a long time afterward, Olds and Milner thought that they had discovered the “bliss center” of the brain—but this was wrong. They had discovered the reward center. They had found the part of the brain that gives us our drives and our desires. These scientists assumed that the rats must have been in a deep state of pleasure while receiving these electric shocks, but in reality they were in a prolonged state of acute craving.
Jean Twenge’s important new book, iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy—and Completely Unprepared for Adulthood, talks about a new form of electronic stimulation that appears to be driving young people to extreme distraction. A professor of psychology at San Diego State University, Twenge has built her career on looking at patterns in very large samples of people across long periods of time. She takes data from the General Social Survey, which has examined adults 18 years and older since 1972; the American Freshman Survey, which has questioned college students since 1966; the Youth Risk Behavior Surveillance System; and the Monitoring the Future databases. She looks to see whether there have been any changes in behavior and personality across time for people the same age but from different generations. Prior to iGen, she was the author of The Narcissism Epidemic (2009), co-written with psychologist W. Keith Campbell, and Generation Me (2013), a book about self-entitled Millennials. Twenge knows whereof she speaks.
The trends of rising narcissism and self-entitlement that Twenge documented in earlier generations appear to have petered out among those born after 1995. What she finds instead is a sharp increase in anxiety. Rates of anxiety and depression are spiking rapidly in young people, while at the same time their engagement with adult behaviors is declining. Using dozens of graphs, Twenge shows the reader how teenagers today drink less, go out less, socialize less, are less motivated to get their driver’s license, work less, date less, and even have sex less.
At first glance, the data seem counterintuitive, because the social pressures to abstain from alcohol and casual sex have never been more relaxed. But, on further reading, it appears that young people’s avoidance of adult behaviors has at least something to do with the addictive and distracting nature of smartphones and social media. Of course, Twenge is careful to point out that this is all “correlational.” She does not have a smoking gun and cannot prove causality. But the speculation seems plausible. All of the changes she observes started accelerating after 2007, when smartphones became ubiquitous. She writes:
I asked my undergraduate students what I thought was a very simple question: “What do you do with your phone while you sleep? Why?” Their answers were a profile in obsession. Nearly all slept with their phones, putting them under their pillows, on the mattress, or at the very least within arm’s reach of the bed. They checked social media websites and watched videos right before they went to bed and reached for their phones again as soon as they woke up in the morning (they had to—all of them used it as their alarm). Their phone was the last thing they saw before they went to bed and the first thing they saw when they woke up. If they woke up in the middle of the night they often ended up looking at their phones. They talked about their phones the way an addict would talk about crack: “I know I shouldn’t, but I just can’t help it.”
Recent experiments also lend support to the hypothesis. In an experiment carried out in 2013, psychologists Larry Rosen and Nancy Cheever brought 163 university students into a room. Some students had their phones unexpectedly taken away and others were told to put their phones on silent and out of sight. All students were then asked to fill out a brief anxiety questionnaire at 20-minute intervals. Those who were the heaviest smartphone and social-media users recorded anxiety levels that kept climbing over the 90-minute period. The students who used their smartphones the least showed no increase in anxiety. The result strongly supports the hypothesis that smartphones, by promoting constant use, do in fact cause agitation.
Twenge’s chapter on mental health in the generation born after 1995 makes for the book’s most disturbing reading. Heavy smartphone and social-media use correlates with higher anxiety and increased feelings of loneliness, particularly in girls. Social media seems to allow girls to bully one another in much more subtle and effective ways than were previously available. They constantly include or exclude one another from online activities such as group “chats,” and they are forever surveilling their peers’ presentation and appearance. This means that if girls aren’t vigilantly checking their social-media accounts, they won’t know if they’re being gossiped about or excluded from some fun activity. Like the electrodes placed on Olds and Milner’s rats, this new technology seems to activate the reward center—but it does not induce states of contentment, satisfaction, or meaning. It also takes time away from other activities such as sports and in-person socializing that would induce feelings of contentment and satisfaction. For a young person who is developing his personality and his competencies in the real world, this could have a profound and long-lasting effect.
Twenge tries not to be alarmist, and she presents her findings in a cautious, conscientious manner. She takes care to make caveats and eschew emotionally laden language. But it’s hard not to be alarmed by what she has found. In the six years between 2009 and 2015, the number of high-school girls who attempted suicide increased by 43 percent and the number of college students who “seriously considered” ending their lives rose by 51 percent. Suicides in young people are carefully tracked—there can be no ambiguity in this data—and increasing rates of children killing themselves are strong evidence that something is seriously amiss. From 2007 (the year smartphones became omnipresent) to 2015, suicide among 15- to 19-year-olds rose by 46 percent, and among those aged 12 to 14, it rose by half. And this rise is particularly pronounced for young girls. Three times as many 12- to 14-year-old girls killed themselves in 2015 as in 2007; among boys that age, suicide doubled in the same period. The suicide rate is always higher for boys (partly because they use more violent methods), but girls are now beginning to close this gender gap.
Another startling chapter in Twenge’s book focuses on sex, relationships, and family formation. We all know that young people are putting off marriage and child-rearing until later years, often for sensible reasons. But what is less well known is that young people are dating a lot less and spending a lot more time alone. It appears that old-fashioned romance and courtship norms are out the window, and so too is sex among young people. Twenge writes:
[M]ore young adults are not having sex at all. More than twice as many iGen’ers and late Millennials (those born in the 1990s) in their early twenties (16 percent) had not had sex at all since age 18 compared to GenX’ers at the same age (6 percent). A more sophisticated statistical analysis that included all adults and controlled for age and time period confirmed twice as many “adult virgins” among those born in the 1990s than among those born in the 1960s.
But if 16 percent are virgins, that means 84 percent of young people are having sex. Perhaps, then, there’s only a small segment bucking the trend toward more libertine lifestyles? Not so. Twenge writes:
Even with age controlled [in samples], Gen X’ers born in the 1970s report having an average of 10.05 sexual partners in their lifetimes, whereas Millennials and iGen’ers born in the 1990s report having sex with 5.29 partners. So Millennials and iGen’ers, the generations known for quick, casual sex, are actually having sex with fewer people.
For decades, conservatives have worried about loosened social and sexual mores among young people. It’s true that sexual promiscuity poses meaningful risks to youths’ well-being, especially among women. But there are also risks that manifest at a broader level when there is a lack of sexual activity in young people. And this risk can be summed up in three words—angry young men. Anthropologists are well aware that societies without strong norms of monogamous pairing produce a host of negative outcomes. In such populations, crime and child abuse increase while savings and GDP decline. Those are just some of the problems that come from men’s directing their energies toward competing with one another for mates instead of providing for families. In monogamous societies, male-to-male competition is tempered by the demands of family life and planning for children’s futures.
These trends identified by Twenge—increased anxiety and depression, huge amounts of time spent on the Internet, and less time spent dating and socializing—do not bode well for the future of Western societies. It should come as no surprise that young people who struggle to connect with one another and young men who can’t find girlfriends will express their anxieties as political resentments. Twenge’s book reveals just how extensive those anxieties are.
Like the rats that forgo food to binge on electric shocks, teenagers are forgoing formative life experiences and human connection in order to satiate their desire for electronic rewards. But the problem is not necessarily insurmountable. Twenge identifies possible protective factors such as playing sports, real-life socializing, adequate sleep, sunlight, and good food. Indeed, phone apps designed to encourage good habits are becoming popular, as are those that lock people out of their social-media accounts for predetermined periods of time. Twenge also argues that iGen has several positive indicators. They are less narcissistic and are more industrious than the generation before them, and they are also more realistic about the demands of work and careers. But harnessing those qualities will require an effort that seems at once piddling and gargantuan. IGen’s future well-being, and ours, depends on whether or not they can just put down their phones.
Playwrights and politics
No similar incidents have been reported, but not for lack of opportunity. In the past year, references to Trump have been shoehorned into any number of theatrical productions in New York and elsewhere. One Trump-related play by a noted author, Robert Schenkkan’s Building the Wall, has already been produced off Broadway and across America, and various other Trump-themed plays are in the pipeline, including Tracy Letts’s The Minutes and Beau Willimon’s The Parisian Woman, both of which will open on Broadway later this season.
The first thing to be said about this avalanche of theatrical activity is that these plays and productions, so far as is known, all show Trump in a negative light. That was to be expected. Save for David Mamet, I am not aware of any prominent present-day American playwright, stage actor, director, or technician who has ever publicly expressed anything other than liberal or progressive views on any political subject whatsoever. However, it appears one can simultaneously oppose Trump and still be skeptical about the artistic effects of such lockstep unanimity, for many left-of-center drama critics have had unfavorable things to say about the works of art inspired to date by the Trump presidency.
So even a political monoculture like that of the American theater can criticize the fruits of its own one-sidedness. But can such a culture produce any other kind of art? Or might the Theater of Trump be inherently flawed in a way that prevents it from transcending its limitations?

From Aristophanes to Angels in America, politics has always been a normal part of the subject matter of theater. Not until the end of the 19th century, though, did a major playwright emerge whose primary interest in writing plays was political rather than aesthetic. George Bernard Shaw saw himself less as an artist than as a propagandist for the causes to which he subscribed, which included socialism, vegetarianism, pacifism, and (late in his life) Stalinism. But Shaw took care to sugar the political pill by embedding his preoccupations in entertaining comedies of ideas, and he was just as careful to make his villains as attractive—and persuasive-sounding—as his heroes.
In those far-off days, the English-speaking theater world was more politically diverse than it is today both on and off stage. It was only in the late ’40s that the balance started to shift, at first slowly, then with steadily increasing speed. In England, this ultimately led to a theater in which it is now common to find explicit political statements embedded not merely in plays but also in such commercial musicals as Billy Elliot, a show about the British miners’ strike of 1984 in which a chorus of children sings a holiday carol whose refrain runs as follows: “Merry Christmas, Maggie Thatcher / We all celebrate today / Cause it’s one day closer to your death.”
As this example suggests, postwar English political theater is consumed with indictments of the evils arising from the existence of a rigid class system. American playwrights, by contrast, are typically more inclined to follow in the footsteps of Arthur Miller and Tennessee Williams, both of whose plays portray (albeit for different reasons) the spiritual and emotional poverty of middle-class life. In both countries, most theater is neither explicitly nor implicitly political. Nevertheless, the theater communities of England and America have for the last half-century or so been all but unanimous in their offstage political convictions. This means that when an English-language play is political, the views that it embodies will almost certainly be left-liberal.
This unanimity of opinion is responsible for what I called, in a 2009 Commentary essay about Miller, the “theater of concurrence.”1 Its practitioners, presumably because all of their colleagues share their political views, take for granted that their audiences will also share them. Hence they write political plays in which no attempt is made to persuade dissenters to change their minds, it being assumed that no dissenters are present in the theater. In the theater of concurrence, disagreement with left-liberal orthodoxy is normally taken to be the result either of invincible ignorance or a deliberate embrace of evil. In the U.S. and England alike, it has become rare to see old-fashioned Shavian political plays like David Hare’s Skylight (1995) in which the devil (in this case, a Thatcherite businessman in love with an upper-middle-class do-gooder) is given his due. Instead, we get plays whose villains are demoniacal monsters (Tony Kushner’s fictionalized portrayal of Roy Cohn in Angels in America is an example) rather than flawed humans who, like Tom in Skylight, have reached the point of no moral return.
All this being the case, it makes perfect sense that Donald Trump’s election should have come as so disorienting a shock to the American theater community, which took for granted that he was unelectable. No sooner were the votes tallied than theater people took to social media to angrily declare their unalterable resistance to the Trump presidency. Many of them believe both Trump and his supporters to be, in Hillary Clinton’s oft-quoted phrase, members of “the basket of deplorables . . . racist, sexist, homophobic, xenophobic, Islamophobic, you name it.”
What kind of theater is emerging from this shared belief? Building the Wall, the first dramatic fruit of the Trump era, is a two-character play set in the visiting room of a Texas prison. It takes place in 2019, by which time President Trump has been impeached after having responded to the detonation of a nuclear weapon in Times Square by declaring nationwide martial law and locking up every foreigner in sight. The bomb, it turns out, was a “false flag” operation planted not by terrorists but by the president’s men. Rick, the play’s principal character, has been imprisoned for doing something so unspeakably awful that he and his interlocutor, a sanctimonious black journalist who is interviewing him for a book, are initially reluctant to talk about it. At the end of an hour or so of increasingly broad hints, we learn that Rick helped the White House set up a Nazi-style death camp for illegal immigrants.
Schenkkan has described Building the Wall as “not a crazy or extreme fantasy,” an inadvertently revealing remark. It is possible to spin involving drama out of raging paranoia, but that requires a certain amount of subtlety, not to mention intelligence—and there is nothing remotely subtle or intelligent about Building the Wall. Rick is a blue-collar cartoon, a regular-guy Texan who claims not to be a racist but voted for Trump because “all our jobs were going to Mexico and China and places like that and then the illegals here taking what jobs are left and nobody gave a damn.” Gloria, his interviewer, is a cartoon of a different kind, a leftsplaining virtue signal in human form who does nothing but emit smug speeches illustrating her own enlightened state: “I mean, at some point in the past we were all immigrants, right, except for Native Americans. And those of us who didn’t have a choice in the matter.” The New York production of Building the Wall closed a month ahead of schedule, having received universally bad reviews (the New York Times described it as “slick and dispiriting”).
The Public Theater’s Julius Caesar, by contrast, received mixed but broadly positive reviews. But it, too, was problematic, albeit on an infinitely higher level of dramatic accomplishment. Here, the fundamental problem was that Eustis had superimposed a gratuitous directorial gloss on Shakespeare’s play. There have been many other high-concept productions of Julius Caesar, starting with Orson Welles’s 1937 modern-dress Broadway staging, which similarly transformed Shakespeare’s play into an it-can-happen-here parable of modern-day fascism. But Eustis’s over-specific decision to turn Caesar into a broad-brush caricature of Trump hijacked the text instead of illuminating it. Rather than allowing the audience to draw its own parallels to the present situation, he pandered to its prejudices. The result was a quintessential example of the theater of concurrence, a staging that undercut its not-inconsiderable virtues by reducing the complexities of the Trump phenomenon to little more than boob-baiting by a populist vulgarian.
Darko Tresjnak committed a venial version of the same sin in his Hartford Stage revival of Shaw’s Heartbreak House (1919), which opened around the same time as Building the Wall and Julius Caesar. Written in the wake of World War I, Heartbreak House is a tragicomedy about a group of liberal bohemians who lack the willpower to reconstruct their doomed society along Shaw’s preferred socialist lines. Tresjnak’s lively but essentially traditional staging hewed to Shaw’s text in every way but one: He put a yellow Trump-style wig on Boss Mangan, the bloated, parasitical businessman who is the play’s villain. The effect was not unlike dressing a character in a play in a T-shirt with a four-letter word printed across the chest. The wig triggered a loud laugh on Mangan’s first entrance, but you were forced to keep on looking at it for the next two hours, by which time the joke had long since grown numbingly stale. It was a piece of cheap point-making unworthy of a production that was otherwise distinguished.

How might contemporary theater artists engage with the Trump phenomenon in a way that is both politically and artistically serious?
For playwrights, the obvious answer is to follow Shaw’s own example by allowing Trump (or a Trump-like character) to speak for himself in a way that is persuasive, even seductive. Shaw himself did so in Major Barbara (1905), whose central character is an arms manufacturer so engagingly urbane that he persuades his pacifist daughter to give up her position with the Salvation Army and embrace the gospel of high explosives. But the trouble with this approach is that it is hard to imagine a playwright willing to admit that Trump could be persuasive to anyone but the hated booboisie.
Then there is Lynn Nottage’s Sweat, which transferred to Broadway last March after successful runs at the Oregon Shakespeare Festival and the Public Theater. First performed in the summer of 2015, around the time that Trump announced his presidential candidacy, Sweat is an ensemble drama about a racially diverse group of unemployed steel workers in Reading, the Pennsylvania city that has become synonymous with deindustrialization. Trump is never mentioned in the play, which takes place between 2000 and 2008 and is not “political” in the ordinary sense of the word, since Nottage did not write it to persuade anyone to do anything in particular. Her purpose was simply to show how the people of Reading feel, and try to explain why they feel that way. Tightly structured and free of sermonizing, Sweat is a wholly personal drama whose broader political implications are left unsaid. Instead of putting Trump in the pillory, it takes a searching look at the lives of the people who voted for him, and it portrays them sympathetically, making a genuine good-faith attempt to understand why they chose to embrace Trumpian populism.
Sweat is a model for serious political art—artful political art, if you will. Are more such plays destined to be written about Donald Trump and his angry supporters? Perhaps, if their authors heed the wise words of Joseph Conrad: “My task which I am trying to achieve is, by the power of the written word, to make you hear, to make you feel—it is, before all, to make you see.” Only the very best artists can make political art with that kind of revelatory power. Shaw and Bertolt Brecht did it, and so has Lynn Nottage. Will Tracy Letts and Beau Willimon follow suit, or will they settle for the pandering crudities of Building the Wall? The answer to that question will tell us much about the future of political theater in the Age of Trump.
1 “Concurring with Arthur Miller” (Commentary, June 2009)