To commemorate COMMENTARY’s sixtieth anniversary, and in an effort to advance discussion of the present American position in the world, the editors addressed the following statement and questions to a group of leading thinkers:
In response to a radically changed world situation since the Islamist attacks of 9/11, the United States under George W. Bush has adopted a broad new approach to national security. The Bush Doctrine, as this policy has come to be known, emphasizes the need for preemption in order to “confront the worst threats before they emerge.” It also stresses the need to transform the cultures that breed hatred and fanaticism by—in a historically stunning move—actively promoting democracy and liberty in the Middle East and beyond. In the President’s words, “We live in a time when the defense of freedom requires the advance of freedom.”
This sweeping redirection of policy has provoked intense controversy, especially but not only over its practicality, and especially but not only over its application to Iraq. At issue as well are the precise nature of the threats faced by the United States and the West, the specific tactics adopted by the Bush administration in meeting them, American capabilities and staying power, relations with traditional allies, the larger intentions and moral bona fides of U.S. foreign policy, and much else besides. Opinion on these matters is divided not only between the Left and the Right in political and intellectual life but, quite sharply, among American conservatives themselves.
- Where have you stood, and where do you now stand, in relation to the Bush Doctrine? Do you agree with the President’s diagnosis of the threat we face and his prescription for dealing with it?
- How would you rate the progress of the Bush Doctrine so far in making the U.S. more secure and in working toward a safer world environment? What about the policy’s longer-range prospects?
- Are there particular aspects of American policy, or of the administration’s handling or explanation of it, that you would change immediately?
- Apart from your view of the way the Bush Doctrine has been defined or implemented, do you agree with its expansive vision of America’s world role and the moral responsibilities of American power?
The responses, 36 in all, appear below in alphabetical order.
This symposium is sponsored by the Edwin Morris Gale Memorial Fund.
Paul Berman
The Bush Doctrine contains two strands of analysis that, pushing in opposite directions, have produced gigantic failures in American policy. The doctrine’s first strand affirms that the United States and the world are threatened by rogue states and some dangerous non-state actors, whose motivations are, at bottom, self-interested. These enemies ought to be brought to heel by swift military action, keeping the power of command in American hands and relying on the latest gizmos of high-tech weaponry.
The Bush Doctrine’s second strand asserts that the United States and the world are threatened by full-scale ideological movements calling for aggressive violence and random slaughter, and resembling in some ways the classic totalitarian ideologies of the past. These movements, being popular, will never be defeated by armies alone. They will be defeated, instead, by countermovements that will engage the totalitarians in argument and that will secure their triumphs only by constructing the kinds of institutions that favor liberal and rationalist ideas. The countermovements will have to build, in short, a new political culture in key regions. Military force might well be required to give the anti-totalitarians a boost—to lift them into power, in certain cases, and to help them stay there. But ultimately the victories will have to be political and ideological.
The champions of the Bush Doctrine, to my knowledge, have never laid out the principles of this second strand of thought in much detail. President Bush has delivered some intelligent speeches about totalitarianism and “ideologies of hate,” but when he has spoken off the cuff he has sometimes recast the ideological battle in terms that might seem appropriate to a rustic Christian preacher, all of which suggests a somewhat casual or non-committal attitude.
In any fight against mass movements that are animated by mad ideological beliefs, the first thing to do is to mount a campaign of ideas—a campaign to identify the totalitarian doctrines and expose their flaws. The Bush administration has never managed to mount anything of the sort, at least not on the eye-catching and ambitious scale that our current predicament would seem to require (though I’m aware that, here and there within the government, some people are doing their best). Instead, the administration has launched public-relations programs in the Muslim world, which have been laughable—reinforcing the impression that the Bush Doctrine’s second strand has been conceived as an afterthought and is valued mainly for its oratorical opportunities.
The second strand does have military implications, and these are easy to identify, even to a military non-expert like me. The main purpose of military action, from this viewpoint, ought to be to support the political development and popular strength of the anti-totalitarian movements. Toward this end, military action ought to be designed to promote liberal and rationalist goals—and therefore ought to be consistent, as much as possible, with liberal principles. There is an obvious way to go about launching military actions that deploy large numbers of troops and observe liberal principles and encourage a new political culture, and this obvious way is to make use of the elephantine mechanisms of law and multilateral institutions. The first strand of the Bush Doctrine emphasizes the military value of being sleek, agile, and indifferent to world opinion, but the second emphasizes the military value of actions that are plodding, punctilious, and popular.
President Bush has tried to meld these strands together. It can’t be done. He has described the enemy every which way, and in so doing has left most of the world, including our own part of it, fatefully confused. It is shocking to me that, four years after 9/11, the White House has generated no consensus, none at all, about the general nature of the enemies we face. We invaded Iraq on the military basis of the first strand, only to discover our urgent need for the military qualities implied in the second. And disasters have followed.
The first Bush administration, back in 1991, badly underestimated the Baathists and ended up handing Saddam a victory of sorts by allowing him to remain in power. The administration thereby betrayed the Kurds and Shiites of Iraq, who were slaughtered in droves. The second Bush administration has committed precisely the same error. Thus the United States has, for the second time, created a situation in which huge masses of Iraqis, our own allies, have been slaughtered by their and our enemies. This is surely one of the worst things the U.S. has done in modern times—disgraceful yet somewhat understandable the first time, and beyond disgraceful the second.
If I had my druthers, I would love to see President Bush fire every one of his top advisers, and keep on firing them, the way that Lincoln did during the Civil War, until a new Ulysses Grant, or several of them, civilian and military, somehow emerged. I would love to see the President reach out to those people within the European Left, not to mention the American Democrats, who share the values of the second strand. Okay, I’m dreaming. This administration is much too sectarian to do anything of the sort. Besides, the administration radiates an air of “what, me worry?” incompetence, which will inhibit any effort to undo the disasters of the past.
Is there something to be said, at least, for the Bush Doctrine’s expansive vision of American responsibility? In my view, it is a mistake to bang too heavily on an American drum. The administration has managed to reduce the gigantic question of resisting the totalitarian and fascist movements of our time to a simple question of American hegemony. We should be emphasizing something else—the need for liberal and democratic societies of many kinds to establish a hegemony of principles of human decency and mutual respect. We ought to rid ourselves of every single aspect of what is called the Bush Doctrine, except for those aspects that could just as well be called the Franklin Roosevelt Doctrine of the Four Freedoms. The United States with its wealth and power and military capabilities should certainly make outsized contributions to the foreign-policy programs of the future, but these programs ought to be conceived in a light of practical internationalism instead of incoherent nationalism.
Max Boot
I applaud the Bush Doctrine. I think it was the right response—the only possible response—to the horror of 9/11. In light of the very real prospect that millions of Americans may be killed by biological or nuclear weapons, it would be madness to sit back and rely on the law-enforcement approach that failed on 9/11. While President Bush has improved the effectiveness of homeland-security efforts, he has correctly placed the emphasis on a forward defense strategy. This means killing or detaining terrorists even before they attack; denying them sanctuary; and trying to dry up their sources of support by promoting a constructive alternative for the Muslim world—namely, liberal democracy.
This policy has been largely successful. Who would have dreamed in September 2001 that we would soon see the fall of the Taliban in Afghanistan and the Baathists in Iraq, or the establishment of nascent democracies in their place; the withdrawal of Syrian troops from Lebanon; the renunciation by Libya of its WMD program; the breakup of the biggest nuclear-smuggling ring in history, run by Pakistan’s A.Q. Khan; the establishment of pro-Western democracies in Ukraine and Georgia; and, perhaps most important of all, not a single major terrorist attack on U.S. soil? Not all of these developments can be ascribed solely or even mainly to American action; some might even be due to sheer, temporary luck. But even if we are hit again tomorrow, a four-year respite is pretty good—and more than almost everyone (myself included) expected.
That said, I think there are major problems with the way the Bush Doctrine has been implemented—or, more accurately, not implemented. After 9/11, the President vowed that “you are either with us or against us.” Pakistan, Egypt, and Saudi Arabia appear not to have gotten the message.
In all three of these supposed American allies, the news media—which remain under the thumb of the state—continue to spew anti-American rhetoric of startling virulence and breathtaking falsity. Pakistan is allowing Islamist extremist groups to use its soil as a base for attacks on U.S. troops in Afghanistan. Egypt has responded to U.S. demands for democracy with sham elections in which Hosni Mubarak won a Saddam-style 88 percent of the vote. And, despite some efforts to curtail terrorist financing, Saudi Arabia continues to bankroll madrassas and mosques around the world that remain breeding grounds of fanaticism. What consequences have they suffered? None that I’m aware of.
Admittedly these are hard problems; in all three cases there is reason to fear that any alternative regime might be even worse. But what about Syria? Bashar Assad—the world’s sole remaining Baathist dictator—allows jihadist killers to use his country as a transit point into Iraq, where they murder many Americans and even more Iraqis. Syria has been warned for more than two years to shape up or face the consequences. Yet none has been forthcoming. This cannot be for fear of bringing to power a Syrian government even more inimical to U.S. interests than the current one; it is hard to imagine such a regime.
The failure better to police the Iraq-Syria border—which would probably necessitate military action in Syria itself—has been one of the biggest problems with the U.S. liberation of Iraq, but it is far from the only one. The lack of pre-invasion diplomacy, the lack of post-invasion planning, the lack of ground troops, the lack of intelligence, the lack of coordination and oversight, the lack of armor, the lack of electricity—all these errors have been noted ad nauseam. There has been some exaggeration of them by the President’s political opponents, along with an implausible attempt to dump the blame on a handful of “neocon” appointees while ignoring the culpability of senior military officers and non-neocon civilians. But in essence most of the charges are true.
There is no question that the war has been bungled in many respects. And yet that doesn’t make the Iraq war very different from any other—including World War II, where blunders (Anzio, Dieppe, Iwo Jima) killed more Allied troops in a single battle than died during the first two years of fighting in Iraq. If we win in Iraq—and, despite everything that has gone wrong, victory is still the most likely outcome—the missteps along the way will be forgotten.
To his credit, President Bush has not made the most serious mistake of all, which would be to lose his nerve. His steely determination to stay the course, notwithstanding the baying of the press and the Democrats (forgive the redundancy), is giving Iraqis the breathing room they need to build political and security institutions that might be able to survive a drawdown (though not a total pullout) of U.S. forces.
We’re finally on the right course in Iraq, though it has taken a while to get there. I am not so sure we’re on any course at all in dealing with the looming threat of the Iranian and North Korean nuclear-weapons programs. In both cases, the administration has so far been satisfied with toothless multilateral diplomacy that has merely bought time for atomic assembly lines to ramp up. There are no easy answers here, and military action is not a terribly palatable option. But why hasn’t the U.S. done more to try to bring about peaceful regime change? The President has talked eloquently about the “non-negotiable demands of human dignity.” I wish he had done more to promote those demands in the two remaining members of the “axis of evil.”
Lest I end on a sour note, some perspective is in order. No President can achieve everything or please everyone. Even as Bush’s poll ratings go south for the winter, it helps to remember how reviled Harry Truman was when he left office in 1953. His reputation revived in subsequent years when it became clear that he had set in place the containment policies that ultimately won the cold war. So, too, I suspect George W. Bush will one day be seen as the President who set us on the long road to winning the war on Islamist terror.
William F. Buckley, Jr.
I do not count myself a supporter of the Bush Doctrine, though I count myself a supporter of Bush. The President’s “diagnosis” of the threat we faced—or were facing—or continue to face—requires more parsing than I think the editors of COMMENTARY would wish from me. The threat he singled out in 2002 focused on the accumulation of weapons of mass destruction by an enemy of freedom. Here was a dictator who had succeeded, in his own country, in ending freedom, and was putatively determined to succeed beyond his shores, reaching, perhaps, to our own.
I do not think that the President has, since the invasion, established retrospectively either the capability of Saddam Hussein to extend his threat or his determination to attempt to do so. I think the President acted on the intelligence at hand. But even granting that he acted as we would have wished, his actions neither terminated any prior foreign-policy doctrine of the United States nor established a doctrinal prescription for dealing with such threats in the future.
Bush’s success has to be measured by—there is no other way—the success of the Iraqi venture. Something that very much needed doing, after 9/11, was a demonstration of U.S. resolve and capability. We demonstrated both in Afghanistan. The undertaking was decisive, rapid, and exemplary in other respects as well. The ensuing campaign, against Iraq, has required for its justification a kind of empirical success we have not yet achieved. We have not defeated the insurgency or united the Iraqi nation. If we do achieve those ends, and if they mark a step forward in the direction of Iraqi security and constitutional government, the President will rightly be acclaimed for having dared to undertake something that vastly reorders life and hope in a critical part of the world. If the venture fails, he will justly be held accountable for imprudence.
Are there aspects of our policy that I would change? This is a tough question. As the costs increase, so also should the scale of our visionary purpose. It is inappropriate for the President to abbreviate, let alone abandon, a rhetoric that underwrites a great enterprise. If the Iraq venture were merely one more great-power gymnastic exercise, he would find the ongoing costs hard to justify. As these costs mount, the purpose of expending the necessary funds and other resources cannot be undermined. As we have come this far, and done what has been done, I do not see anything of a military character to be done differently from what we are doing, and I cannot see any prospect of a substantial geostrategic modification of the thinking that brought us to where we are.
But, to address the final question, I do not believe that Bush’s expanded view of the U.S. role is wise. Our goals, as pronounced once by Woodrow Wilson and now by George Bush, remain organically commendable as free societies are themselves commendable. In the nature of things, however, rescue missions to tormented nations of the world have to be selective—a geostrategic art form.
This is so obviously the case that it is embarrassing to have to make the point. “What do you call dictators of countries that have nuclear bombs?” the saw began, decades ago.
We are not about to extend the President’s concern for freedom to an energetic concern for freedom in mainland China. We cannot even rev up the political energy to do anything about the genocide in Sudan. Every now and then the stars arrange themselves to give us an ideological mission we can handle, as in Grenada under Reagan—and before that, on an entirely different scale, the war against Hitler. But accompanying doctrines are to be reserved for political oratory. In days and decades ahead, the U.S. will do good for other countries and for humankind, but not, I think, as a doctrinal exercise traceable to a “Bush Doctrine.”
Eliot A. Cohen
I have never understood the supposed novelty of the Bush Doctrine. The right to preemption is inherent in the functioning of a more or less anarchical society of states. Were the French to face a probable attack from, say, Tunisia, and if they thought they could do something about it in advance, they would. So would any other state not run by cowards or fools.
Nor is it a matter of great novelty that the path to security from Islamic terror lies in some liberalization of the Middle East—the spread, not so much of democracy in the sense of plebiscites or even regular elections, but of limited government, free press, the rule of law, and a regular rotation of leaders who can be evicted from power by something other than illness, death, or coup. What are the alternatives, really? To wall off the Middle East from all contact with the developed world? To turn the rule of turbulent societies over to reliable thugs? To accept Islamic fanatics in their rise to power, in the hope that the exercise of power would moderate them? The first is impossible, the second and third have been tried and have failed, and even in the most appeasement-prone capitals of Old Europe or Asia, you will not find anyone who seriously believes in them. Indeed, only a handful of American academics, intoxicated with theories that deny the importance of religion as a force in the life of humanity, believe that we have the option of standing pat and waiting for the forces of political realism to work their inexorable and presumably beneficent will.
In the short term, doctrines do not change the world: action does. The much underrated removal of al Qaeda’s base in Afghanistan and the killing or arrest of most of its pre-9/11 leadership (and the scattering of the rest) did not remove the fundamental problem, but it did severely weaken an exceptionally dangerous organization. To be sure, the ideology of al Qaeda lives, and numerous cells remain dormant or have sprouted up around the world. But smashing and dispersing the core hierarchy probably prevented more mega-terrorist events; while dealing with loosely networked terrorists is difficult, counteracting a well-organized and coordinated enemy of this kind would be even more difficult.
About the long term we simply do not know. The liberation of Iraq was a good thing in and of itself; the language of freedom that accompanied it has had a salutary effect in Lebanon, Egypt, and elsewhere in the Arab world; and American military prowess, and our demonstrated will to use it, produced good results in Libya. But it is no doubt true that the war increased antipathy to the United States in the Arab world, and in the short run has stimulated the recruitment of terrorists inflamed by the lies of al Jazeera as well as a bitterly anti-American Arab, and in some cases European and Asian, intelligentsia.
Launching a war is like rolling a giant stone down a mountain slope strewn with rocks: we cannot predict where the avalanche will go. Whether Iraq is a success or a failure (and what success and failure mean is open to debate), the consequences will be prodigious, for good or for ill. This is a bold and determined administration; the war was a bold stroke, and boldness has both risks and rewards.
There are three things the administration could do, in ascending order of difficulty and descending order of likelihood, to make its doctrine effective. The first is to speak plainly about the nature of the enemy—Islamic extremism—and to do so in ways that do not misstate its argument, its appeal, or its roots. Administration spokesmen shrink from using the word “Islam,” for fear of being accused of bigotry. Anodyne formulations like “a perversion of a great religion” or “a few extremists” do not capture the power of this movement. There is a great need for a sober, detailed, and educational rhetoric about whom we are fighting. Happy talk to the Muslim world about what nice people Americans are is not only no substitute—it fools only those who utter it.
Second, the administration wrongly steered away from asking the American people to sacrifice anything in this war. Lowering taxes, it hampered its own ability to raise defense budgets. More importantly, it allowed the spirit of patriotism and resolve that flooded the country after 9/11 to dissipate over time. If you do not ask people to lend their money or their children to a fight, they will not think that they are at war. Nor was the administration willing to accept the political pain of a serious effort to undermine the grip of oil on the economy—a grip that indirectly feeds the infrastructure of terror—by imposing taxes that would reduce consumption and stimulate alternative fuels or thriftier uses of those we have. If this is war—and it is—then it demands sacrifice and an appeal for service.
Finally, the administration has suffered from its insularity, its overwhelming emphasis on loyalty to the exclusion of all other virtues, its suspicion of those with whom it could have made common cause, its refusal to admit missteps or failure, its inability to fire the incompetent (as opposed to the merely disgruntled). Huddled now in its bunker, assaulted not only for a botched war abroad but for a bumbling reaction to natural catastrophe at home, it is unlikely to open itself up; but it would be better if it could.
The expansive vision of the Bush administration seems to me broadly right, and I admire unreservedly the courage and determination with which it has pressed the fight. But how I wish that the spine of steel had found its match in an eloquence suitable to the moment; how I would have desired as great a stress on talent as on fidelity; how much better if the commitment to a vision of freedom abroad were matched with an equal and effective commitment to greatness at home; how ironic and sad that competence—the quality upon which this administration prided itself when it came to office—has, for too long, been in such short supply.
Niall Ferguson
In my book Colossus: The Rise and Fall of the American Empire (2004), I argued that the Bush Doctrine was less radical as a doctrine than was widely thought when it was promulgated.
The administration’s key document, the September 2002 National Security Strategy of the United States, argued that because “deliverable weapons of mass destruction in the hands of a terror network or murderous dictator . . . constitute as grave a threat as can be imagined,” the President should, at his discretion, act preemptively to forestall any such threat, even if the threat was not imminent in the traditional sense of armies massing on borders. Many critics seized upon this as a dangerous new departure. Yet the idea of preemption had been asserted by more than one President during the cold war, and had been assumed by them all. The radical aspect of the Bush Doctrine was not so much the theory as the practice.
Even before the invasion of Iraq in 2003, it became clear that the White House intended to use the doctrine of preemption to justify violating the national sovereignty of certain “rogue regimes” and using military means to neutralize perceived future threats, preferably by changing those regimes. In Empire: The Rise and Fall of the British World Order and the Lessons for Global Power (2003), I had expressed some doubts as to whether the United States had the economic, military, and political capabilities to make a success of what was, in all but name, an imperial undertaking. Unlike many critics of the Bush administration, I did not dismiss the project as morally wrong. On the contrary, I argued that there were a number of regimes around the world that were likely to cease sponsoring terrorism, acquiring nuclear weapons, or murdering their own people only as a result of effective foreign intervention. My qualms have all along related to the ability of the United States successfully to execute such interventions.
I have no doubt that the 2002 National Security Strategy was right in its diagnosis of the dangers posed to the United States. Nor do I doubt that a preemptive strike to avert the use of weapons of mass destruction against American targets would be legitimate. But I would add two qualifications.
First, terror networks are a proven threat even when they do not have WMD. Second, it now seems clear that Saddam Hussein did not pose even a distant threat to the United States in 2003, though it was impossible to be sure of that at the time. As I contend in Colossus, the claims made by the American and British governments in connection with Iraq’s WMD capability and links to al Qaeda lacked credibility. There were good reasons for overthrowing Saddam, but these were not among them.
Is, then, the United States more secure today than in 2000? From the point of view of U.S. military personnel, the answer is no: they are much more likely to be killed or wounded by hostile action than they were during the 1990’s. How far this increased risk is outweighed by the reduced threat from a jailed Saddam is not clear.
On the other hand, we cannot know the degree to which actions taken by the Bush administration in Afghanistan, Iraq, and—perhaps more importantly—in the American homeland have reduced the ability of organizations like al Qaeda to attack the United States. My hunch is that another 9/11-type attack could happen even while this President is still in the White House; there are too many ways for terrorists to enter the country and operate undetected, and too many targets to protect. There is also good reason to think that the disruption of al Qaeda’s leadership structures has been compensated for by the formation of new cells and the recruitment of new operatives, notably in Europe. This may turn out to be one of the most important unintended consequences of the invasion of Iraq.
The longer-range prospects of the Bush Doctrine are bleak. The next President will need to come up with a national-security strategy that commands much greater legitimacy abroad. It might make more sense in the future to keep the doctrine of preemption tacit.
Are there particular aspects of American policy that I would change immediately? Secretary of State Rice has already made the single most important change that I would have recommended to the administration last year, namely, to revive the art of diplomacy. The United States came perilously close to less-than-splendid isolation in 2004, not least because the administration came to believe its own rhetoric about the viability of “acting alone” (another component of the National Security Strategy). But success in Iraq cannot be achieved with the support of Tony Blair alone. The resources needed to contain the burgeoning civil war in Iraq must come from outside as well as inside the English-speaking world.
As for what the editors call the Bush Doctrine’s “expansive vision of America’s world role and the moral responsibilities of American power,” I revert once more to the wording of the National Security Strategy. I am all for “actively work[ing] to bring the hope of democracy, development, free markets, and free trade to every corner of the world.” The same goes for promoting “the rule of law; limits on the absolute power of the state; free speech; freedom of worship; equal justice; respect for women; religious and ethnic tolerance; and respect for private property.” But a further defect of the National Security Strategy was its assumption that doing these things would necessarily enhance American national security. On the contrary: the more the United States represents itself as a messianic force spreading freedom around the world, the more resentment it will arouse; see the history of the British empire, passim.
Aaron L. Friedberg
Since 9/11, the “Bush Doctrine” label has been applied to various aspects of administration policy, from the President’s initial “with us or against us” warning to state sponsors of terrorism, to his declared willingness to act preemptively (and, if need be, unilaterally) to head off the danger of covert WMD attack, to his assertion that final victory in the global war on terror depends on the spread of liberty across the Middle East and throughout the Islamic world. I will focus on this final usage, which is likely to prove the most lasting.
Is a campaign aimed at the political transformation of the “broader Middle East” essential to the defeat of terrorism? If so, how can it be carried forward to a successful conclusion at an acceptable cost? The first of these questions is easier to answer than the second.
I believe the administration’s assessment of the Islamist threat is fundamentally correct. In al Qaeda and its affiliates, we confront an enemy who aims to inflict as much pain on us and our allies as possible, thereby dividing the West, forcing a retraction of American power, and clearing the way for the overthrow of local “apostate” regimes and their absorption into a transnational caliphate. Having concocted quasi-theological justifications for their actions, the terrorists put no limit on the numbers they are willing to kill to achieve their goals; all that stands in their way is, for the moment, an apparent lack of means.
The menace we face may not be “existential,” in the same sense as the cold-war threat from the Soviet Union. Al Qaeda cannot rain down tens of thousands of nuclear warheads on American cities. But, with a few well-placed dirty bombs or vials of anthrax, it could impose terrible human and financial costs and radically alter, perhaps for a generation or more, the character of our open society and the extent of our integration into the global economy. The passage of time since 9/11, and the absence thus far of a follow-on attack on American soil, have caused some observers to lose sight of these dangers and even to argue that they have been grossly exaggerated. I know of no one involved in the conduct of the war on terror who shares this sense of complacency.
The ideology that motivates the jihadists has now metastasized and spread, so that it finds adherents even in free societies. But it sprang to life first in the diverse despotisms of the broader Middle East, and these are the sources from which it still feeds and which continue, either deliberately or indirectly, to sustain it. Even if it were possible to wave a wand and transform these societies overnight into functioning liberal democracies, the jihadist movement would likely live on, at least for a time. But unless and until progress is made in this direction, it seems certain to survive, and to thrive. The absence of liberty fuels frustration and extremism by cutting off avenues for more moderate forms of political expression, reinforcing social and economic stagnation, and feeding a sense of collective weakness, shame, and rage.
The other key elements of U.S. strategy—stronger homeland defenses and a relentless global offensive against Islamist terror networks—are necessary to keep the enemy off balance and reduce the risk of future attack; but they will not be sufficient, in themselves, to achieve a lasting peace. Jihadism cannot be defeated on the defensive, or even by cutting back its visible offshoots. It must be pulled up by the roots.
There are alternatives to a strategy that has transformation as its ultimate goal. If pressed, most liberal critics of the Bush Doctrine would say they agree with its ends but differ over means (more “soft” power and less “hard,” more multilateralism and less unilateralism). While the differences are in some respects overstated, there is a serious debate to be had here and a consensus to be hammered out, though controversies over Iraq have made this all but impossible for the moment.
More distinct are the options offered by advocates of what can only be called a policy of appeasement, on the one hand, and the self-described “realists,” on the other. The first group asserts that by leaving Iraq, cutting support for Israel, and perhaps withdrawing altogether from the Middle East, we may be able eventually to deprive the jihadists of their base of support. Despite the evident moral and strategic bankruptcy of these arguments, they have begun to gain ground recently in academic circles, where books “bravely” questioning our ties to Israel and “proving” that suicide terrorists are motivated solely by a desire to free their homes from occupation are currently the rage. Fortunately, such ideas seem unlikely for now to exert much influence on practical policy.
It is the “realists” who most stand to gain if American policy in Iraq comes to be seen as a costly failure. Such an outcome would be taken as proof that the pursuit of liberalization in the broader Middle East is a fool’s errand and that, instead of criticizing “friendly” local regimes and pressuring them to reform, we should be content to make common cause in wiping out the jihadists. What is needed, in this view, is a more effective and if need be a more ruthless version of the policy that existed before 9/11. The fact that this approach has already proved its ineffectiveness may not lessen its appeal, at least for a while.
In the long run, and whatever happens in Iraq, some variant of the Bush Doctrine will remain an essential part of overall U.S. strategy for defeating Islamist terrorism. The questions facing this administration as it enters its final quarter are more practical than theoretical. How to tailor the right mix of pressures and inducements to move “friendly” regimes toward meaningful reforms, and how to deal with openly hostile holdouts? How to minimize the inevitable risks of transition (the “one man, one vote, one time” problem)? How to institutionalize the “forward strategy of freedom” within the U.S. government and the Western alliance? And how to ensure continuing domestic political support for a goal that is both necessary and just?
Francis Fukuyama
I believe that the Bush Doctrine’s central assumption—that the United States had to transform the politics of the Middle East in order to meet the post-9/11 terrorist threat—was misguided, and that the problem was greatly compounded by extremely poor policy execution before and after the Iraq war. For the record, I made up my mind that the war was a bad idea by the fall of 2002, i.e., before the war began, when I was asked to lead part of a Pentagon study on strategy in the war on terrorism, and not in response to events as they unfolded after the war.
There is no question that the 9/11 attacks exposed a very new kind of threat, and that the usual tools of the cold war—containment and deterrence—would not work against suicide terrorists armed with weapons of mass destruction. The Afghan war was a fully justified exercise in prevention, where we dismantled terrorist networks that were clearly of danger to us.
The problem was that the Bush administration merged the terrorist/WMD problem with the problem of Iraq and rogue-state proliferators more generally. The latter was and continues to be a very serious issue, but it was never clear that a rogue state—which (unlike stateless terrorists) has a return address—would go to all the trouble of developing nuclear weapons only to give them to a terrorist organization.
The bigger problem lay with the diagnosis of the root causes of the terrorism, and the prescription for fixing it. Radical Islamism is in no way an assertion of traditional Muslim values or religiosity. Olivier Roy has argued persuasively in Globalized Islam that it needs to be seen as an essentially modern phenomenon driven by the deterritorialization of Islam, primarily in Western Europe, and by the forces of globalization and modernization that we otherwise celebrate. In a traditional Muslim society, your identity is fixed by the society into which you are born; only when you live in a non-Muslim environment does it occur to you to ask who you are. The profound alienation that results makes poorly assimilated second- and third-generation Muslims susceptible to a pure, universalistic ideology like that of Osama bin Laden. Mohammed Atta and the other organizers of 9/11, the Madrid and London conspirators, and Mohammed Bouyeri, murderer of the Dutch filmmaker Theo van Gogh, all fall into this category.
This means that more democracy and more modernization will not solve our near-term terrorism problem, but may well exacerbate it. I believe that both democracy and modernization are good things and should be promoted in the Middle East for their own sake. But we will continue to have a serious terrorist problem in democratic Western Europe, regardless of what happens in Egypt or Lebanon.
Even if one accepted the view that the Middle East needed to be “fixed,” it was hard to understand what made us think that we were capable of fixing it. So much of what neoconservatives have written over the past decades has concerned the unanticipated consequences of overly ambitious social engineering, and how the effort to get at root causes of social problems is a feckless task. If this has been true of efforts to combat crime or poverty in U.S. cities, why should anyone have believed we could get at the root causes of alienation and terrorism in a part of the world that we didn’t understand particularly well, and where our policy instruments were very limited?
The other constraint is very specific to the United States. We have gotten involved in nation-building efforts in many places over the years: Reconstruction of the South after the Civil War, occupation of the Philippines and the various Monroe Doctrine interventions, Japan, Germany, South Korea, and South Vietnam, and finally the humanitarian interventions of the post-cold-war era in Somalia, Haiti, the Balkans, and other places. Of these, only Japan, Germany, and South Korea were clear successes, and these were places where U.S. occupation forces came and basically never left.
Americans have a habit of starting such projects enthusiastically and then losing interest after things go bad, usually at about the five-year mark; this is what happened with Reconstruction, in Nicaragua between 1927 and 1932, in South Vietnam, and in many other places. We sign up local allies, make a stab at giving them modern institutions, and then pull the plug. I was fearful that we would repeat this pattern in Iraq prior to the war, and nothing that has happened since then has alleviated that concern.
We need to win militarily in Afghanistan and Iraq. It is extremely important that we resist pressures to reduce numbers of American forces prematurely. But we also need to conceive of the broader war on terrorism as a classic counterinsurgency campaign fought out on a global scale. In that campaign, winning hearts and minds is as important as neutralizing the hard-core terrorists. I strongly believe in the need for an expansive foreign policy that shapes the insides of states and not just their external behavior. But it is American soft power, not hard, that will be the primary instrument for promoting democracy and development around the world, and we need thoroughly to rethink the structure and funding of the instruments we have for doing this.
After the first four years of the Bush Doctrine, the United States has created a new terrorist haven in Iraq and a power vacuum that will destabilize regional politics for some time to come. While allies may seek to restore good relations with Washington at an elite level, at a popular level there has been a seismic shift in the way that much of the world perceives the United States. Our image, fairly or not, is no longer the Statue of Liberty but the hooded prisoner at Abu Ghraib. Fixing this problem is a project that will preoccupy us for many years to come.
Frank J. Gaffney, Jr.
I heartily agree with the Bush Doctrine as described by the editors and as outlined in the 2002 National Security Strategy of the United States.
We are once again engaged in a global conflict imposed upon us by a dangerous, totalitarian ideology that has properly come to be known as Islamofascism. Its adherents seek to implement their vision of a global caliphate governed by a repressive, Taliban-style version of shari’a law. They will employ all available means to accomplish that goal.
In a world in which Islamofascists and their state sponsors and allies can reasonably be expected to have access to weapons of mass destruction, a proactive, offensive, and, where necessary, preemptive American strategy is indispensable. Nothing less is at stake than our survival as a free, democratic, and secular nation.
If we are to defeat the Islamofascists, however, we are going to need something more: the help of non-Islamist Muslims, who are as much at risk from this intolerant ideology as are those in the non-Muslim world. We must legitimate and empower our natural allies in this war. The President is right that one means of doing so is to help them establish governments that are representative, accountable, and conducive to economic growth—in stark contrast to the repression and privation associated with Islamist misrule.
All that said, I am happier with the Bush Doctrine conceptually than with its implementation. In defining the enemy in this war, the administration has largely refused to go beyond euphemisms like “terror” and “an evil ideology.” The unwillingness to declare Islamofascism the force that drives our foes has made problematic the devising—let alone the successful implementation—of strategies for defeating them.
This failure has had negative consequences for the war effort abroad and at home. The President’s bold assertion that “you are either with us or against us” has been undermined by the administration’s practice of certifying as “with us” the nation that is arguably most responsible for the worldwide spread of Islamofascism: Saudi Arabia. Despite the President’s admirable rhetoric about spreading freedom, two other nations demonstrably not “with us”—Iran and North Korea—have moved from being members of the “axis of evil” to being negotiating partners. At the insistence of putative friends like China and Russia and the connivance of sometime allies like France and Germany, these odious regimes are being assured of our willingness to support their continued misrule in exchange for still more fraudulent promises of non-proliferation.
The administration is also confusing elections with the establishment of institutions essential to functioning and enduring democracies. Elections in Afghanistan, Iraq, Lebanon, and Gaza have helped to empower Islamofascists. Even in Turkey, with its well-established secular democracy, an elected Islamist regime is mounting a classic takeover of the institutions of civil society. Ignoring these realities is a formula for still greater setbacks down the road.
Unfortunately, the same disconnect between rhetoric and practice is evident in the administration’s outreach to the Muslim community here at home. While it talks of rooting out domestic support cells, charities, and front organizations that enable terrorists here and abroad, it has repeatedly embraced many who have been leaders of and sympathizers with such efforts. This has afforded Islamists access and influence and added to the incoherence of U.S. war policies, while demoralizing truly non- or anti-Islamist Muslims.
Unless promptly corrected, such practices augur ill for needed security improvements over both the short and long terms. The most urgent change, apart from clarifying the nature of the enemy, is to put the country on a war footing. Four years after the attacks of 9/11, too many Americans have come to believe that the conflict in which we are engaged is the problem of the U.S. military, the President, our allies, or somebody else. That this sentiment is widely held owes much to the fact that the public has been encouraged to think of its job in this conflict as nothing more than going shopping.
There are many ways in which the American people can be asked to assist the war effort. Here are three of the most important.
First, stopping the underwriting of terror. Unbeknownst to most American investors, significant portions of their public-pension funds, mutual funds, life-insurance holdings, and private portfolios include stocks of publicly traded companies that partner with state sponsors of terror. Were that money to be divested, it could have a profound effect on the ability of terror-sponsoring states to underwrite the war the Islamofascists and their friends are waging against us.
Next, enhancing energy security. The public can help deny financial succor to our enemies by reducing our dependence on foreign oil—much of which is purchased from the same nations that support Islamofascism and its allies. There are various ways to begin accomplishing this. The least painful near-term approach would be greatly to expand the use of domestically produced alcohol-based fuels and electricity to power the transportation sector.
Third, securing the homeland. Perhaps the most basic step in protecting against future attacks requires the American people to increase their vigilance in monitoring domestic threats. In addition, the nation needs to involve its citizens much more fully in planning for and preparing against future attacks. As Hurricane Katrina reminds us, such capabilities may prove to be of great value in future emergencies, whether natural or man-made.
As for “America’s world role and the moral responsibilities of American power,” I subscribe to an expansive presidential vision that predated and underpins the Bush Doctrine: namely, President Reagan’s conviction that America is “the last best hope of mankind.” From this flows the belief that we should be engaged in the world, not out of some sense of noblesse oblige, but rather because it is essential to our own survival in the face of enemies who wish to destroy us and everything we stand for.
Reagan’s philosophy recognized that international peace is best preserved through American strength. In practice, this requires a robust presence across the globe—one able to respond to the full spectrum of threats, ideally by nipping them in the bud, but in any event confronting them in whatever way is most efficacious before they endanger our lives and freedoms.
Reuel Marc Gerecht
Although President George W. Bush didn’t invade Iraq in order to bring democracy to the Middle East—and neoconservatives, with exceptions, didn’t advocate war with that in mind—building democracy now defines U.S. policy in the entire region. If democracy succeeds in Iraq, then America, regardless of who sits in the White House, will certainly become more active in promoting representative government. If democracy fails there, then we will become much more timid in encouraging political reform.
Despite the numerous, serious mistakes of the Bush administration in the occupation of Iraq, democracy’s chances there remain decent so long as the Shiite political center behind Grand Ayatollah Ali Sistani holds. But failure in Iraq may not necessarily dim the prospects of democracy elsewhere in the Muslim world.
The fall of Saddam Hussein has already accelerated convulsive democratic debates in Arab lands and in their more combative and open expatriate media. The region’s dictators and kings may have a difficult time stuffing this discontent and dissent back into the tried-and-true shibboleths—principally anti-Zionism and anti-Americanism—that have consumed the intellectual energy of so many and offered the autocrats a safety valve for popular dissatisfaction with the regimes in place. Arab left-wing intellectuals seem today less domesticated than they were just a few years back, when they eagerly turned most of their venom toward Israel and Ariel Sharon. Muslim fundamentalists, especially in Egypt, still the lodestone among Arab nations, seem much less likely to play along, and are increasingly backing the popular push for more open political systems.
Failure in Iraq would mean a civil war between Sunni and Shiite Arabs that would allow for the rise of a Shiite strongman in Baghdad. Even so, this might not at all be seen by Egyptians as a sufficient reason to keep President Hosni Mubarak’s family in power. The rest of the Arab world is, like Egypt, overwhelmingly Sunni. With the exceptions of Syria, tiny Bahrain, and Lebanon, democracy in the Arab world would be an intra-Sunni squabble.
Which brings us to a series of important questions for the Bush administration and its successor. Let us suppose that, regardless of what happens in Iraq, the democratic movement among Arabs pushes forward, but, as is probable, with Muslim fundamentalists in the lead. Will the administration shy away from democracy promotion if and when it becomes clear that Muslim fundamentalists will initially do very well in most Arab lands where free elections are allowed?
I myself would argue that the political and cultural evolution of Sunni fundamentalism is central to the death of bin Ladenism, and that democratic politics are an essential part of that evolution. This means that democracy’s advance in the Middle East is likely to be a very anti-American process. (Think Latin American anti-Yanquism on speed.) To my mind, this is a painful but necessary step in the evolution of Islamic activism.
Has the Bush administration thought this through? Has it tried to explain to itself, let alone to the American people, how democracy may unfold in the Muslim Middle East? Has the President, Secretary of State Condoleezza Rice, or Karen Hughes, the new public-diplomacy czarina, called a conclave to figure out what the administration actually believes? It would not appear so.
As for those in the administration who believe that Muslim liberals, progressives, and moderates are the real key to democracy’s future in the region—a view that I find in error, but certainly an estimable aspiration—have they troubled to explain how we are going to locate and support such individuals over the heads of the present dictators and kings? Will we endorse open elections where fundamentalists can compete with liberals and others, or will we advocate banning fundamentalists from the election process even when liberals in these countries tell us that doing so will undermine them and us? Should we treat Muslim fundamentalists as beyond the pale, or even as Nazis, as some have argued? (Given that Iran is full of fallen hard-core fundamentalists who now sincerely advocate democracy, the parallel seems strained.)
Another question is useful in considering this complex of issues: are Muslim democracies that restrict women’s social rights in practice morally superior to Muslim dictatorships that advance them in theory? I think the answer is an emphatic yes, but the administration has so far shown little desire to argue this possibility, thereby allowing the New York Times columnist Maureen Dowd to suggest that Saddam Hussein, who was the first Middle Eastern dictator to institute rape as an official means of mind control, was more pro-woman than the democratically sanctioned constituent assembly that drafted Iraq’s proposed constitution. Women’s rights are a hot-button issue in the United States. It would be wise for the administration to explain how it intends to handle this issue in the socially conservative Middle East.
George W. Bush is one of our most revolutionary Presidents, but regrettably his administration shows little more intellectual ferment than his father’s. That is in part because many inside the critical institutions of foreign policy—the State Department, the National Security Council, the Central Intelligence Agency, and the Pentagon—don’t really believe in expanding democracy, at least not in the Muslim Middle East. And even among those who share the President’s commitment to expanding representative government, and who understand that democracy is an essential component in the big-picture fight against Islamic extremism, there is enormous nervousness about significant change in the status quo. Truth be told, the Bush administration was not that upset when Egyptian President Hosni Mubarak stole his reelection.
Four years after 9/11, it is still possible to see the United States wavering in its commitment to democracy more than in its commitment to the rulers of the Middle East. It is not hard to imagine Washington’s bureaucracies trying hard, once again, to cast the fight against Islamic extremism as essentially a police and intelligence action, which would mean drawing closer to the dictators and kings who run the Middle East’s security and intelligence services. If the President isn’t vigilant, we could soon be living again in a pre-9/11 world, in which democracy seemed a premature idea for people more suited to prayer and despotism.
Victor Davis Hanson
According to opinion polls, most Americans are now critical of the President’s foreign policy. Their disquiet is not merely over the daily fare of explosions in Iraq. Rather, the sustained public attack on American action abroad, emanating from both the Left and the hard Right, has led to broadly shared, bipartisan condemnation. Even some who once were adherents of preemption have bailed out, claiming that although they supported the removal of Saddam Hussein, they are appalled by what followed. Or, translated: “In hindsight I remain in favor of my near-perfect military campaign, but not your messy reconstruction”—as if America’s past wars were not fraught with tragic lapses and muddled operations.
But for all the media hysteria and the indisputable errors of implementation, the Bush Doctrine is, in fact, moving ahead, and in time it will yield long-term advantages. Despite our inability to articulate the dangers and stakes of the war against radical Islam and our failure to muster the full military potential of the United States, and despite the fact that our own southern border remains vulnerable to terrorist infiltration, there has been enormous progress in the past four years.
We have removed both the Taliban and Saddam Hussein. Those efforts have cost us over 2,000 American combat deaths, a hard loss and to be mourned, but still two-thirds of the number of American civilians killed on September 11, 2001, the first day of the war. Thanks to our forward policy of hitting rogue regimes abroad and staying on to help the reconstruction, coupled with increased vigilance at home, the United States has not been struck since then.
Inside Iraq there is a constitutional government grinding ahead, and a series of elections slated to ratify and/or amend its constitution. Much is rightly made of Sunni intransigence, yet this minority population, with no oil and with a disreputable past of support for either Saddam or the Zarqawi terrorists, or both, has been put in an untenable position. Its clerics call for Iraqi Sunnis to vote no on the constitution even as Sunni radicals like Zarqawi threaten to kill any who would vote at all.
There has also been a radical transformation in regional mentalities. The elections in Egypt, though boycotted and rigged, were an unprecedented event, and the irregularities quickly ignited popular demonstrations. Events elsewhere are no less significant, as Libya and Pakistan have renounced their nuclear commerce, Syrians are out of Lebanon, and rudimentary parliaments are forming in the Gulf. Even on the Palestinian question, the death of Arafat, Israel’s building of a protective fence and its withdrawal from Gaza, and the removal of Saddam have strengthened the hand of beleaguered reformers in the West Bank and beyond. The onus for policing their miscreants is gradually shifting to the Palestinians themselves, which is where it belongs.
There are, of course, no Swiss cantons arising in the Middle East. Rather, we see the initial tremors of massive tectonic shifts, as the old plates of Islamic radicalism or secular autocracy give way to something new and more democratic. The United States is the primary catalyst of this dangerous but long-overdue upheaval. It has taken the risk almost alone; the ultimate reward will be a more stable world for all.
Much is made of global anti-Americanism and hatred of George Bush. But under closer examination, the furor is mostly confined to Western Europe, the autocratic Middle East, and our own elites here at home. In Europe, our most vocal critics, Jacques Chirac in France and Gerhard Schroeder in Germany, have lost considerable domestic support, and are under challenge by realists worried about their own unassimilated minorities and appreciative of American consistency in the war against radical Islam. In the meantime, Eastern Europeans, Japanese, Australians, and Indians have never been closer to the United States. Russia and China have little beef with our war on terror.
Here at home, the relative lack of bipartisan support is due partly to the media culture of the Left, partly to the turmoil and resentment of an out-of-power Democratic party, partly to uncertainty as to how it will all turn out. On the far Right, some see only too much money being spent, too much proliferation of government, and too much Israel in the background.
What lies ahead? We must continue to navigate the dangerous narrows between the two unacceptable alternatives of secular dictatorship and rule by Islamic law, even as we prod recipients of American aid or military support like Mubarak, Musharraf, and the Saudi royal family to reform. At home, unless we come up with a viable policy combining increased oil production, conservation, and alternative fuels, our ability to protect ourselves from international blackmail will soon begin to erode. Most forbiddingly, nuclear weapons in the hands of Iran or any other non-democratic Middle Eastern country could destroy much if not all of what has been accomplished. What would have happened in the late 1930’s had America found itself dependent on Romanian oil or German coal, or learned that Hitler, Mussolini, or Franco was close to obtaining atomic weapons?
I continue without reserve to support our efforts in Afghanistan and Iraq, and our pressure for reform in the Middle East at large. No
Exactly one week later, a Star Wars cantina of the American extremist right featuring everyone from David Duke to a white-nationalist Twitter personality named “Baked Alaska” gathered in Charlottesville, Virginia, to protest the removal of a statue honoring the Confederate general Robert E. Lee. A video promoting the gathering railed against “the international Jewish system, the capitalist system, and the forces of globalism.” Amid sporadic street battles between far-right and “antifa” (anti-fascist) activists, a neo-Nazi drove a car into a crowd of peaceful counterprotestors, killing a 32-year-old woman.
Here, in the time span of just seven days, was the dual nature of contemporary American anti-Semitism laid bare. The most glaring difference between these two displays of hate lies not so much in their substance—both adhere to similar conspiracy theories articulating nefarious, world-altering Jewish power—as in their self-characterization. The animosity expressed toward Jews in Charlottesville was open and unambiguous, with demonstrators proudly confessing their hatred in the familiar language of Nazis and European fascists.
The socialists in Chicago, meanwhile, though calling for a literal second Holocaust on the shores of the Mediterranean, would fervently and indignantly deny they are anti-Semitic. On the contrary, they claim the mantle of “anti-fascism” and insist that this identity naturally makes them allies of the Jewish people. As for those Jews who might oppose their often violent tactics, they are at best bystanders to fascism, at worst collaborators in “white supremacy.”
So, whereas white nationalists explicitly embrace a tribalism that excludes Jews regardless of their skin color, the progressives of the DSA and the broader “woke” community conceive of themselves as universalists—though their universalism is one that conspicuously excludes the national longings of Jews and Jews alone. And whereas the extreme right-wingers are sincere in their anti-Semitism, the socialists who called for the elimination of Israel are just as sincere in their belief that they are not anti-Semitic, even though anti-Semitism is the inevitable consequence of their rhetoric and worldview.
The sheer bluntness of far-right anti-Semitism makes it easier to identify and stigmatize as beyond the pale; individuals like David Duke and the hosts of the “Daily Shoah” podcast make no pretense of residing within the mainstream of American political debate. But the humanist appeals of the far left, whose every libel against the Jewish state is paired with a righteous invocation of “justice” for the Palestinian people, invariably trigger repetitive and esoteric debates over whether this or that article, allusion, allegory, statement, policy, or political initiative is anti-Semitic or just critical of Israel. What this difference in self-definition means is that there is rarely, if ever, any argument about the substantive nature of right-wing anti-Semitism (despicable, reprehensible, wicked, choose your adjective), while the very existence of left-wing anti-Semitism is widely doubted and almost always indignantly denied by those accused of practicing it.

To be sure, these recent manifestations of anti-Semitism occur on the left and right extremes. And statistics tell a rather comforting story about the state of anti-Semitism in America. Since the Anti-Defamation League began tracking it in 1979, anti-Jewish hate crime is at a historic low; indeed, it has been declining since a recent peak of 1,554 incidents in 2006. America, for the most part, remains a very philo-Semitic country, one of the safest, most welcoming countries for Jews on earth. A recent Pew poll found Jews to be the most admired religious group in the United States.1 If American Jews have anything to dread, it’s less anti-Semitism than the loss of Jewish peoplehood through assimilation, that is, being “loved to death” by Gentiles.2 Few American Jews can say that anti-Semitism has a seriously deleterious impact on their lives, that it has denied them educational or employment opportunities, or that they fear for their own physical safety or that of their families because of their Jewish identity.
The question is whether the extremes are beginning to move in on the center. In the past year alone, the DSA’s rolls tripled from 8,000 to 25,000 dues-paying members, who have established a conspicuous presence on social media reaching far beyond what their relatively minuscule numbers would suggest. The DSA has been the subject of widespread media coverage, ranging from the curious to the adulatory. The white supremacists, meanwhile, found themselves understandably heartened by the strange difficulty President Donald Trump had in disavowing them. He claimed, in fact, that there had been some “very fine people” among their ranks. “Thank you President Trump for your honesty & courage to tell the truth about #Charlottesville,” tweeted David Duke, while the white-nationalist Richard Spencer said, “I’m proud of him for speaking the truth.”
Indeed, among the more troubling aspects of our highly troubling political predicament—and one that, from a Jewish perspective, provokes not a small amount of angst—is that so many ideas, individuals, and movements that could once reliably be categorized as “extreme,” in the literal sense of articulating the views of a very small minority, are no longer so easily dismissed. The DSA is part of a much broader revival of the socialist idea in America, as exemplified by the growing readership of journals like Jacobin and Current Affairs, the popularity of the leftist Chapo Trap House podcast, and the insurgent presidential campaign of the self-described democratic socialist Bernie Sanders—who, according to a Harvard-Harris poll, is now the most popular politician in the United States. Since 2015, the average age of a DSA member has dropped from 64 to 30, and a 2016 Harvard poll found that a majority of Millennials do not support capitalism.
Meanwhile, the Republican Party of Donald Trump offers “nativism and culture war wedges without the Reaganomics,” according to Nicholas Grossman, a lecturer in political science at the University of Illinois. A party that was once reliably internationalist and assertive against Russian aggression now supports a president who often preaches isolationism and never has even a mildly critical thing to say about the KGB thug ruling over the world’s largest nuclear arsenal.
Like ripping the bandage off an ugly and oozing wound, Trump’s presidential campaign unleashed a bevy of unpleasant social forces that at the very least have an indirect bearing on Jewish welfare. The most unpleasant of those forces has been the so-called alternative right, or “alt-right,” a highly race-conscious political movement whose adherents are divided on the “JQ” (Jewish Question). Throughout last year’s campaign, Jewish journalists (this author included) were hit with a barrage of luridly anti-Semitic Twitter messages from self-described members of the alt-right. The tamer missives instructed us to leave America for Israel; others superimposed our faces onto the bodies of concentration camp victims.3
I do not believe Donald Trump is himself an anti-Semite, if only because anti-Semitism is mainly a preoccupation—as distinct from a prejudice—and Trump is too narcissistic to indulge any preoccupation other than himself. And there is no evidence to suggest that he subscribes to the anti-Semitic conspiracy theories favored by his alt-right supporters. But his casual resort to populism, nativism, and conspiracy theory creates a narrative environment highly favorable to anti-Semites.
Nativism, of which Trump was an early and active practitioner, is never good for the Jews, no matter how affluent or comfortable they may be and regardless of whether they are even the target of its particular wrath. Racial divisions, which by any measure have grown significantly worse in the year since Trump was elected, hurt all Americans, obviously, but they have a distinct impact on Jews, who are left in a precarious position as racial identities calcify. Not only are the newly emboldened white supremacists of the alt-right invariably anti-Semites, but in the increasingly racialist taxonomy of the progressive left—which more and more mainstream liberals are beginning to parrot—Jews are considered possessors of “white privilege” and, thus, members of the class to be divested of its “power” once the revolution comes. In the racially stratified society that both extremes envision, Jews lose out, simultaneously perceived (by the far right) as wily allies and manipulators of ethnic minorities in a plot to mongrelize America and (by the far left) as opportunistic “Zionists” ingratiating themselves with a racist and exploitative “white” establishment that keeps minorities down.

This politics is bad for all Americans, and for Jewish Americans in particular. More and more, one sees the racialized language of the American left being applied to the Middle East conflict, wherein Israel (which is, in point of fact, one of the most racially diverse countries in the world) is referred to as a “white supremacist” state no different from that of apartheid South Africa. In a book just published by MIT Press, ornamented with a foreword by Cornel West and entitled Whites, Jews, and Us, a French-Algerian political activist named Houria Bouteldja asks, “What can we offer white people in exchange for their decline and for the wars that will ensue?” Drawing the Jews into her race war, Bouteldja, according to the book’s jacket copy, “challenges widespread assumptions among the left in the United States and Europe—that anti-Semitism plays any role in Arab–Israeli conflicts, for example, or that philo-Semitism doesn’t in itself embody an oppressive position.” Jew-hatred is virtuous, and appreciation of the Jews is racism.
Few political activists of late have done more to racialize the Arab–Israeli conflict—and, through insidious extension of the American racial hierarchy, designate American Jews as oppressors—than the Brooklyn-born Arab activist Linda Sarsour. An organizer of the Women’s March, Sarsour has seamlessly insinuated herself into a variety of high-profile progressive campaigns, a somewhat incongruous position given her reactionary views on topics like women’s rights in Saudi Arabia. (“10 weeks of PAID maternity leave in Saudi Arabia,” she tweets. “Yes PAID. And ur worrying about women driving. Puts us to shame.”) Sarsour, who is of Palestinian descent, claims that one cannot simultaneously be a feminist and a Zionist, when it is the exact opposite that is true: No genuine believer in female equality can deny the right of Israel to exist. The Jewish state respects the rights of women more than do any of its neighbors. In an April 2017 interview, Sarsour said that she had become a high-school teacher for the purpose of “inspiring young people of color like me.” Just three months earlier, however, in a video for Vox, Sarsour confessed, “When I wasn’t wearing hijab I was just some ordinary white girl from New York City.” The donning of Muslim garb, then, confers a racial caste of “color,” which in turn confers virtue, which in turn confers a claim on political power.
This attempt to describe the Israeli–Arab conflict in American racial vernacular marks Jews as white (a perverse mirror of Nazi biological racism) and thus implicates them as beneficiaries of “structural racism,” “white privilege,” and the whole litany of benefits afforded to white people at birth in the form of—to use Ta-Nehisi Coates’s abstruse phrase—the “glowing amulet” of “whiteness.” “It’s time to admit that Arthur Balfour was a white supremacist and an anti-Semite,” reads the headline of a recent piece in—where else?—the Forward, incriminating Jewish nationalism as uniquely perfidious by dint of the fact that, like most men of his time, a (non-Jewish) British official who endorsed the Zionist idea a century ago held views that would today be considered racist. Reading figures like Bouteldja and Sarsour brings to mind the French philosopher Pascal Bruckner’s observation that “the racialization of the world has to be the most unexpected result of the antidiscrimination battle of the last half-century; it has ensured that the battle continuously re-creates the curse from which it is trying to break free.”
If Jews are white, and if white people—as a group—enjoy tangible and enduring advantages over everyone else, then this racially essentialist rhetoric ends up with Jews accused of abetting white supremacy, if not being white supremacists themselves. This is one of the overlooked ways in which the term “white supremacy” has become devoid of meaning in the age of Donald Trump, with everyone and everything from David Duke to James Comey to the American Civil Liberties Union accused of upholding it. Take the case of Ben Shapiro, the Jewish conservative polemicist. At the start of the school year, Shapiro was scheduled to give a talk at UC Berkeley, his alma mater. In advance, various left-wing groups put out a call for protest in which they labeled Shapiro—an Orthodox Jew—a “fascist thug” and “white supremacist.” An inconvenient fact ignored by Shapiro’s detractors is that, according to the ADL, he was the top target of online abuse from actual white supremacists during the 2016 presidential election. (Berkeley ultimately had to spend $600,000 protecting the event from leftist rioters.)
A more pernicious form of this discourse is practiced by left-wing writers who, insincerely claiming to have the interests of Jews at heart, scold them and their communal organizations for not doing enough in the fight against anti-Semitism. Criticizing Jews for not fully signing up with the “Resistance” (which in form and function is fast becoming the 21st-century version of the interwar Popular Front), they then slyly indict Jews for being complicit in not only their own victimization but that of the entire country at the hands of Donald Trump. The first and foremost practitioner of this bullying and rather artful form of anti-Semitism is Jeet Heer, a Canadian comic-book critic who has achieved some repute on the American left due to his frenetic Twitter activity and his availability when the New Republic needed to replace the staff that had quit en masse in 2014. Last year, when Heer came across a video of a Donald Trump supporter chanting “JEW-S-A” at a rally, he declared on Twitter: “We really need to see more comment from official Jewish groups like ADL on way Trump campaign has energized anti-Semitism.”
But of course “Jewish groups” have had plenty to say about the anti-Semitism expressed by some Trump supporters—too much, in the view of their critics. Just two weeks earlier, the ADL had released a report analyzing over 2 million anti-Semitic tweets targeting Jewish journalists over the previous year. This wasn’t the first time the ADL had raised its voice against Trump and the alt-right movement he emboldened, nor would it be the last. Indeed, two minutes’ worth of Googling would have shown Heer that his pronouncements about organizational Jewish apathy were wholly without foundation.4
It’s tempting to dismiss Heer’s observation as mere “concern trolling,” a form of Internet discourse characterized by insincere expressions of worry. But what he did was nastier. Immediately presented with evidence of the inaccuracy of his claims, he sneered back with a bit of wisdom from the Jewish sage Hillel the Elder, yet cast as a mild threat: “If I am not for myself, who will be for me?” In other words: How can you Jews expect anyone to care about your kind if you don’t sufficiently oppose—as arbitrarily judged by moi, Jeet Heer—Donald Trump?
If this sort of critique were coming from a Jewish donor upset that his preferred organization wasn’t doing enough to combat anti-Semitism, or a Gentile with a proven record of concern for Jewish causes, it wouldn’t have turned the stomach. What made Heer’s interjection revolting is that, to put it mildly, he’s not exactly known for being sympathetic toward the Jewish plight. In 2015, Heer put his name to a petition calling upon an international comic-book festival to drop the Israeli company SodaStream as a co-sponsor because the Jewish state is “built on the mass ethnic cleansing of Palestinian communities and sustained through racism and discrimination.” Heer’s name appeared alongside that of Carlos Latuff, a Brazilian cartoonist who won second place in the Iranian government’s 2006 International Holocaust Cartoon Competition. For his writings on Israel, Heer has been praised as being “very good on the conflict” by none other than Philip Weiss, proprietor of the anti-Semitic hate site Mondoweiss.
In light of this track record, Heer’s newfound concern about anti-Semitism appeared rather dubious. Indeed, the bizarre way in which he expressed this concern—as, ultimately, a critique not of anti-Semitism per se but of the country’s foremost Jewish civil-rights organization—suggests he cares about anti-Semitism only insofar as its existence can be used as a weapon to beat his political adversaries. And since the incorrigibly Zionist American Jewish establishment ranks high on that list (just below Donald Trump and his supporters), Heer found a way to blame it for anti-Semitism. And what does that tell you? It tells you that—presented with a 16-second video of a man chanting “JEW-S-A” at a Donald Trump rally—Heer’s first impulse was to condemn not the anti-Semite but the Jews.
Heer isn’t the only leftist (or New Republic writer) to assume this rhetorical cudgel. In a piece entitled “The Dismal Failure of Jewish Groups to Confront Trump,” one Stephen Lurie attacked the ADL for advising its members to stay away from the Charlottesville “Unite the Right Rally” and let police handle any provocations from neo-Nazis. “We do not have a Jewish organizational home for the fight against fascism,” he quotes a far-left Jewish activist, who apparently thinks that we live in the Weimar Republic and not a stable democracy in which law-enforcement officers and not the balaclava-wearing thugs of antifa maintain the peace. Like Jewish Communists of yore, Lurie wants to bully Jews into abandoning liberalism for the extreme left, under the pretext that mainstream organizations just won’t cut it in the fight against “white supremacy.” Indeed, Lurie writes, some “Jewish institutions and power players…have defended and enabled white supremacy.” The main group he fingers with this outrageous slander is the Republican Jewish Coalition, the implication being that this explicitly partisan Republican organization’s discreet support for the Republican president “enables white supremacy.”
It is impossible to imagine Heer, Lurie, or other progressive writers similarly taking the NAACP to task for its perceived lack of concern about racism, or castigating the Human Rights Campaign for insufficiently combating homophobia. No, it is only the cowardice of Jews that is condemned—condemned for supposedly ignoring a form of bigotry that, when expressed on the left, these writers themselves ignore or even defend. The logical gymnastics of these two New Republic writers is what happens when, at base, one fundamentally resents Jews: You end up blaming them for anti-Semitism. Blaming Jews for not caring enough about anti-Semitism is emotionally the same as claiming that Jews are to blame for anti-Semitism. Both signal an envy and resentment of Jews predicated upon a belief that they have some kind of authority that the claimant doesn’t and therefore needs to undermine.

This past election, one could not help but notice how the media seemingly discovered anti-Semitism when it emanated from the right, and then only when its targets were Jews on the left. It was enough to make one ask where they had been when left-wing anti-Semitism had been a more serious and pervasive problem. From at least 1996 (the year Pat Buchanan made his last serious attempt at securing the GOP presidential nomination) to 2016 (when the Republican presidential nominee did more to earn the support of white supremacists and neo-Nazis than any of his predecessors), anti-Semitism was primarily a preserve of the American left. In that two-decade period—spanning the collapse of the Oslo Accords and the rise of the Second Intifada, the rancorous debate over the Iraq War and the obsession with “neocons,” and the presidency of Barack Obama and the 2015 Iran nuclear deal—anti-Israel attitudes and anti-Semitic conspiracy theories made unprecedented inroads into respectable precincts of the American academy, the liberal intelligentsia, and the Democratic Party.
The main form that left-wing anti-Semitism takes in the United States today is an unhinged obsession with the wrongs, real or perceived, of the state of Israel, and the belief that its Jewish supporters in the United States exercise a nefarious control over the levers of American foreign policy. In this respect, contemporary left-wing anti-Semitism is not altogether different from that of the far right, though it usually lacks the biological component deeming Jews a distinct and inferior race. (Consider the left-wing anti-Semite’s eagerness to identify and promote Jewish “dissidents” who can attest to their co-religionists’ craftiness and deceit.) The unholy synergy of left and right anti-Semitism was recently epitomized by the former CIA agent and liberal stalwart Valerie Plame’s hearty endorsement, on Twitter, of an article written for an extreme right-wing website by a fellow former CIA officer named Philip Giraldi: “America’s Jews Are Driving America’s Wars.” Plame eventually apologized for sharing the article with her 50,000 followers, but not before insisting that “many neocon hawks are Jewish” and that “just FYI, I am of Jewish descent.”
The main forum in which left-wing anti-Semitism appears is academia. According to the ADL, anti-Semitic incidents on college campuses doubled from 2014 to 2015, the most recent year for which data are available. Writing in National Affairs, Ruth Wisse observes that “not since the war in Vietnam has there been a campus crusade as dynamic as the movement of Boycott, Divestment, and Sanctions against Israel.” Every academic year, a seeming surfeit of controversies erupts on campuses across the country involving the harassment of pro-Israel students and organizations, the disruption of events involving Israeli speakers (even ones who identify as left-wing), and blatantly anti-Semitic outbursts by professors and student activists. There was the Oberlin professor of rhetoric, Joy Karega, who posted statements on social media claiming that Israel had created ISIS and had orchestrated the murderous attack on Charlie Hebdo in Paris. There is the Rutgers associate professor of women’s and gender studies, Jasbir Puar, who popularized the ludicrous term “pinkwashing” to defame Israel’s LGBT acceptance as a massive conspiracy to obscure its oppression of Palestinians. Her latest book, The Right to Maim, academically peer-reviewed and published by Duke University Press, attacks Israel for sparing the lives of Palestinian civilians, accusing its military of “shooting to maim rather than to kill” so that it may keep “Palestinian populations as perpetually debilitated, and yet alive, in order to control them.”
One could go on and on about such affronts not only to Jews and supporters of Israel but to common sense, basic justice, and anyone who believes in the prudent use of taxpayer dollars. That several organizations exist solely for the purpose of monitoring anti-Israel and anti-Semitic agitation on American campuses attests to the prevalence of the problem. But it’s unclear just how reflective these isolated examples of the college experience really are. A 2017 Stanford study purporting to examine the issue interviewed 66 Jewish students at five California campuses noted for “being particularly fertile for anti-Semitism and for having an active presence of student groups critical of Israel and Zionism.” It concluded that “contrary to widely shared impressions, we found a picture of campus life that is neither threatening nor alarmist…students reported feeling comfortable on their campuses, and, more specifically, comfortable as Jews on their campuses.” To the extent that Jewish students do feel pressured, the report attempted to spread the blame around, indicting pro-Israel activists alongside those agitating against it. “[Survey respondents] fear that entering political debate, especially when they feel the social pressures of both Jewish and non-Jewish activist communities, will carry social costs that they are unwilling to bear.”
Yet by its own admission, the report “only engaged students who were either unengaged or minimally engaged in organized Jewish life on their campuses.” Researchers made a study of anti-Semitism, then, by interviewing the Jews least likely to experience it. “Most people don’t really think I’m Jewish because I look very Latina…it doesn’t come up in conversation,” one such student said in an interview. Ultimately, the report revealed more about the attitudes of unengaged (and, thus, uninformed) Jews than about the state of anti-Semitism on college campuses. That may certainly be useful in its own right as a means of understanding how unaffiliated Jews view debates over Israel, but it is not an accurate marker of developments on college campuses more broadly.
A more extensive 2016 Brandeis study of Jewish students at 50 schools found that 34 percent agreed at least “somewhat” that their campus had a hostile environment toward Israel. Yet the variation was wide; at some schools, only 3 percent agreed, while at others, 70 percent did. Only 15 percent reported a hostile environment toward Jews. Anti-Semitism was found to be more prevalent at public universities than at private ones, with the determinative factor being the presence of a Students for Justice in Palestine chapter on campus. Important context often lost in conversations about campus anti-Semitism, and reassuring to those concerned about it, is that it is simply not the most important issue roiling higher education. “At most schools,” the report found, “fewer than 10 percent of Jewish students listed issues pertaining to either Jews or Israel as among the most pressing on campus.”

For generations, American Jews have depended on anti-Semitism’s remaining within a moral quarantine, a cordon sanitaire, and America has reliably kept this societal virus contained. While there are no major signs that this barricade is breaking down in the immediate future, there are worrying indications on the political horizon.
Surveying the situation at the international level, the declining global position of the United States—both in terms of its hard military and economic power relative to rising challengers and its position as a credible beacon of liberal democratic values—does not bode well for Jews, American or otherwise. American leadership of the free world has, in addition to ensuring Israel’s security, underwritten the postwar liberal world order. And it is the constituent members of that order, the liberal democratic states, that have served as the best guarantor of the Jews’ life and safety over their long history. Were America’s global leadership role to diminish or evaporate, it would not only facilitate the rise of authoritarian states like Iran and terrorist movements such as al-Qaeda, committed to the destruction of Israel and the murder of Jews, but inexorably lead to a worldwide rollback of liberal democracy, an outcome that would inevitably redound to the detriment of Jews.
Domestically, political polarization and the collapse of public trust in every American institution save the military are demolishing what little confidence Americans have left in their system and governing elites, not to mention preparing the ground for some ominous political scenarios. Widely cited survey data reveal that the percentage of American Millennials who believe it “essential” to live in a liberal democracy hovers at just over 25 percent. If Trump is impeached or loses the next election, a good 40 percent of the country will be outraged and susceptible to belief in a stab-in-the-back theory accounting for his defeat. Whom will they blame? Perhaps the “neoconservatives,” who disproportionately make up the ranks of Trump’s harshest critics on the right?
Ultimately, the degree to which anti-Semitism becomes a problem in America hinges on the strength of the antibodies within the country’s communal DNA to protect its pluralistic and liberal values. But even if this resistance to tribalism and the cult of personality is strong, it may not be enough to arrest the rise of an intellectual and societal disease that, throughout history, thrives upon economic distress, xenophobia, political uncertainty, ethnic chauvinism, conspiracy theory, and weakening democratic norms.
1 Somewhat paradoxically, according to FBI crime statistics, the majority of religiously based hate crimes target Jews, more than double the number targeting Muslims. This indicates the commitment of the country’s relatively small number of hard-core anti-Semites more than it does pervasive anti-Semitism.
4 The ADL has had to maintain a delicate balancing act in the age of Trump, coming under fire from many conservative Jews for a perceived partisan tilt against the right. This makes Heer’s complaint all the more ignorant—and unhelpful.
Review of 'The Once and Future Liberal' by Mark Lilla
Lilla, a professor at Columbia University, tells us that “the story of how a successful liberal politics of solidarity became a failed pseudo-politics of identity is not a simple one.” And about this, he’s right. Lilla quotes from the feminist authors of the 1977 Combahee River Collective Manifesto: “The most profound and potentially most radical politics come directly out of our own identity, as opposed to working to end somebody else’s oppression.” Feminists sought to instantiate the “radical” and electrifying phrase that insisted “the personal is political.” The phrase, argues Lilla, was generally seen in “a somewhat Marxist fashion to mean that everything that seems personal is in fact political.”
The upshot was fragmentation. White feminists were deemed racist by black feminists—and both were found wanting by lesbians, who also had black and white contingents. “What all these groups wanted,” explains Lilla, “was more than social justice and an end to the [Vietnam] war. They also wanted there to be no space between what they felt inside and what they saw and did in the world.” He goes on: “The more obsessed with personal identity liberals become, the less willing they become to engage in reasoned political debate.” In the end, those on the left came to a realization: “You can win a debate by claiming the greatest degree of victimization and thus the greatest outrage at being subjected to questioning.”
But Lilla’s insights into the emotional underpinnings of political correctness are undercut by an inadequate, almost bizarre sense of history. He appears to be referring to the 1970s when, zigzagging through history, he writes that “no recognition of personal or group identity was coming from the Democratic Party, which at the time was dominated by racist Dixiecrats and white union officials of questionable rectitude.”
What is he talking about? Is Lilla referring to the Democratic Party of Lyndon Johnson, Hubert Humphrey, and George McGovern? Is he referring obliquely to George Wallace? If so, why is Wallace never mentioned? Lilla seems not to know that it was the 1972 McGovern Democratic Convention that introduced convention seating set aside for blacks and women.
At only 140 pages, this is a short book. But even so, Lilla could have devoted a few pages to the Frankfurt School ideologist Herbert Marcuse and his influence on the left. In the 1960s, Marcuse argued that leftists and liberals were entitled to restrain centrist and conservative speech on the grounds that the universities had to act as a counterweight to society at large. But this was not just rhetoric; in the campus disruptions of the early 1970s at schools such as Yale, Cornell, and Amherst, Marcuse’s ideas were pushed to the fore.
If Lilla’s argument comes off as flaccid, perhaps that’s because the aim of The Once and Future Liberal is more practical than principled. “The only way” to protect our rights, he tells the reader, “is to elect liberal Democratic governors and state legislators who’ll appoint liberal state attorneys.” According to Lilla, “the paradox of identity liberalism” is that it undercuts “the things it professes to want,” namely political power. He insists, rightly, that politics has to be about persuasion but then contradicts himself in arguing that “politics is about seizing power to defend the truth.” In other words, Lilla wants a better path to total victory.
Given what Lilla, descending into hysteria, describes as “the Republican rage for destruction,” liberals and Democrats have to win elections lest the civil rights of blacks, women, and gays be rolled back. As proof of the ever-looming danger, he notes that when the “crisis of the mid-1970s threatened…the country turned not against corporations and banks, but against liberalism.” Yet he gives no hint of the trail of liberal failures that led to the crisis of the mid-’70s. You’d never know reading Lilla, for example, that the Black Power movement intensified racial hostilities that were then further exacerbated by affirmative action and busing. And you’d have no idea that, at considerable cost, the poverty programs of the Great Society failed to bring poorer African Americans into the economic mainstream. Nor does Lilla deal with the devotion to Keynesianism that produced inflation without economic growth during the Carter presidency.
Despite his discursive ambling through the recent history of American political life, Lilla has a one-word explanation for identity politics: Reaganism. “Identity,” he writes, is “Reaganism for lefties.” What’s crucial in combating Reaganism, he argues, is to concentrate on our “shared political” status as citizens. “Citizenship is a crucial weapon in the battle against Reaganite dogma because it brings home the fact that we are part of a legitimate common enterprise.” But then he asserts that the “American right uses the term citizenship today as a means of exclusion.” The passage might lead the reader to think that Lilla would take up the question of immigration and borders. But he doesn’t, and the closing passages of the book dribble off into characteristic zigzags. Lilla tells us that “Black Lives Matter is a textbook example of how not to build solidarity” but then goes on, without evidence, to assert the accuracy of the Black Lives Matter claim that African-Americans have been singled out for police mistreatment.
It would be nice to argue that The Once and Future Liberal is a near miss, a book that might have had enduring importance if only it went that extra step. But Lilla’s passing insights on the perils of a politically correct identity politics drown in the rhetoric of conventional bromides that fill most of the pages of this disappointing book.
In Athens several years ago, I had dinner with a man running for the national parliament. I asked him whether he thought he had a shot at winning. He was sure of victory, he told me. “I have hired a very famous political consultant from Washington,” he said. “He is the man who elected Reagan. Expensive. But the best.”
The political genius he then described was a minor political flunky I had met in Washington long ago, a more-or-less anonymous member of the Republican National Committee before he faded from view at the end of Ronald Reagan’s second term. Mutual acquaintances told me he still lived in a nice neighborhood in Northern Virginia, but they never could figure out what the hell he did to earn his money. (This is a recurring mystery throughout the capital.) I had to come to Greece to find the answer.
It is one of the dark arts of Washington, this practice of American political hacks traveling to faraway lands and suckering foreign politicians into paying vast sums for splashy, state-of-the-art, essentially worthless “services.” And it’s perfectly legal. Paul Manafort, who briefly managed Donald Trump’s campaign last summer, was known as a pioneer of the globe-trotting racket. If he hadn’t, as it were, veered out of his gutter into the slightly higher lane of U.S. presidential politics, he likely could have hoovered cash from the patch pockets of clueless clients from Ouagadougou to Zagreb for the rest of his natural life and nobody in Washington would have noticed.
But he veered, and now he and a colleague find themselves indicted by Robert Mueller, the Inspector Javert of the Russian-collusion scandal. When those indictments landed, they instantly set in motion the familiar scramble. Trump fans announced that the indictments were proof that there was no collusion between the Trump campaign and the Russians—or, in the crisp, emphatic phrasing of a tweet by the world’s Number One Trump Fan, Donald Trump: “NO COLLUSION!!!!” The Russian-scandal fetishists in the press corps replied in chorus: It’s still early! Javert required more time, and so will Mueller, and so will they.
A good Washington scandal requires a few essential elements. One is a superabundance of information. From these data points, conspiracy-minded reporters can begin to trace associations, warranted or not, and from the associations, they can infer motives and objectives with which, stretched together, they can limn a full-blown conspiracy theory. The Manafort indictment released a flood of new information, and at once reporters were pawing for nuggets that might eventually form a compelling case for collusion.
They failed to find any because Manafort’s indictment, in essence, involved his efforts to launder his profits from his international political work, not his work for the Trump campaign. Fortunately for the obsessives, another element is required for a good scandal: a colorful cast. The various Clinton scandals brought us Asian money-launderers and ChiCom bankers, along with an entire Faulkner novel’s worth of bumpkins, sharpies, and backwoods swindlers, plus that intern in the thong. Watergate, the mother lode of Washington scandals, featured a host of implausible characters, from the central-casting villain G. Gordon Liddy to Sam Ervin, a lifelong segregationist and racist who became a hero to liberals everywhere.
Here, at last, is one area where the Russian scandal has begun to show promise. Manafort and his business partner seem too banal to hold the interest of anyone but a scandal obsessive. Beneath the pile of paper Mueller dumped on them, however, another creature could be seen peeking out shyly. This would be the diminutive figure of George Papadopoulos. An unpaid campaign adviser to Trump, Papadopoulos pled guilty to lying to the FBI about the timing of his conversations with Russian agents. He is quickly becoming the stuff of legend.
Papadopoulos is an exemplar of a type long known to American politics. He is the nebbish bedazzled by the big time—achingly ambitious, though lacking the skill, or the cunning, to climb the greasy pole. So he remains at the periphery of the action, ever eager to serve. Papadopoulos’s résumé, for a man under 30, is impressively padded. He said he served as the U.S. representative to the Model United Nations in 2012, though nobody recalls seeing him there. He boasted of a four-year career at the Hudson Institute, though in fact he spent one year there as an unpaid intern and three doing contract research for one of Hudson’s scholars. On his LinkedIn page, he listed himself as a keynote speaker at a Greek American conference in 2008, but in fact he participated only in a panel discussion. The real keynoter was Michael Dukakis.
With this hunger for achievement, real or imagined, Papadopoulos could not let a presidential campaign go by without climbing aboard. In late 2015, he somehow attached himself to Ben Carson’s campaign. He was never paid and lasted four months. His presence went largely unnoticed. “If there was any work product, I never saw it,” Carson’s campaign manager told Time. The deputy campaign manager couldn’t even recall his name. Then suddenly, in April 2016, Papadopoulos appeared on a list of “foreign-policy advisers” to Donald Trump—and, according to Mueller’s court filings, resolved to make his mark by acting as a liaison between Trump’s campaign and the Russian government.
While Mueller tells the story of Papadopoulos’s adventures in the dry, Joe Friday prose of a legal document, it could easily be the script for a Peter Sellers movie from the Cold War era. The young man’s résumé is enough to impress the campaign’s impressionable officials as they scavenge for foreign-policy advisers: “Hey, Corey! This dude was in the Model United Nations!”
Papadopoulos (played by Sellers) sets about his mission. A few weeks after signing on to the campaign, he travels to Europe, where he meets a mysterious “Professor” (Peter Ustinov). “Initially the Professor seemed uninterested in Papadopoulos,” says Mueller’s indictment. A likely story! Yet when Papadopoulos lets drop that he’s an adviser to Trump, the Professor suddenly “appeared to take great interest” in him. They arrange a meeting in London to which the Professor invites a “female Russian national” (Elke Sommer). Without much effort, the femme fatale convinces Papadopoulos that she is Vladimir Putin’s niece. (“I weel tell z’American I em niece of Great Leader! Zat idjut belief ennytink!”) Over the next several months our hero sends many emails to campaign officials and to the Professor, trying to arrange a meeting between them. As far as we know from the indictment, nothing came of his mighty efforts.
And there matters lay until January 2017, when the FBI came calling. Agents asked Papadopoulos about his interactions with the Russians. Even though he must have known that hundreds of his emails on the subject would soon be available to the FBI, he lied and told the agents that the contacts had occurred many months before he joined the campaign. History will record Papadopoulos as the man who forgot that emails carry dates on them. After the FBI interview, according to the indictment, he tried to destroy evidence with the same competence he has brought to his other endeavors. He closed his Facebook account, on which several communications with the Russians had taken place. He threw out his old cellphone. (That should do it!) After that, he began wearing a blindfold, on the theory that if he couldn’t see the FBI, the FBI couldn’t see him.
I made that last one up, obviously. For now, the great hope of scandal hobbyists is that Papadopoulos was wearing a wire between the time he secretly pled guilty and the time his plea was made public. This would have allowed him to gather all kinds of incriminating dirt in conversations with former colleagues. And the dirt is there, all right, as the Manafort indictment proves. Unfortunately for our scandal fetishists, so far none of it shows what their hearts most desire: active collusion between Russia and the Trump campaign.
An Affair to Remember
All this changed with the release in 1967 of Arthur Penn’s Bonnie and Clyde and Mike Nichols’s The Graduate. These two films, made in nouveau European style, treated familiar subjects—a pair of Depression-era bank robbers and a college graduate in search of a place in the adult world—in an unmistakably modern manner. Both films were commercial successes that catapulted their makers and stars into the top echelon of what came to be known as “the new Hollywood.”
Bonnie and Clyde inaugurated a new era in which violence on screen simultaneously became bloodier and more aestheticized, and it has had enduring impact as a result. But it was The Graduate that altered the direction of American moviemaking with its specific appeal to younger and hipper moviegoers who had turned their backs on more traditional cinematic fare. When it opened in New York in December, the movie critic Hollis Alpert reported with bemusement that young people were lining up in below-freezing weather to see it, and that they showed no signs of being dismayed by the cold: “It was as though they all knew they were going to see something good, something made for them.”
The Graduate, whose aimless post-collegiate title character is seduced by the glamorous but neurotic wife of his father’s business partner, is part of the common stock of American reference. Now, a half-century later, it has become the subject of a book-length study, Beverly Gray’s Seduced by Mrs. Robinson: How The Graduate Became the Touchstone of a Generation.1 As is so often the case with pop-culture books, Seduced by Mrs. Robinson is almost as much about its self-absorbed Baby Boomer author (“The Graduate taught me to dance to the beat of my own drums”) as about its subject. It has the further disadvantage of following in the footsteps of Mark Harris’s magisterial Pictures at a Revolution: Five Movies and the Birth of the New Hollywood (2008), in which the film is placed in the context of Hollywood’s mid-’60s cultural flux. But Gray’s book offers us a chance to revisit this seminal motion picture and consider just why it was that The Graduate spoke to Baby Boomers in a distinctively personal way.

The Graduate began life in 1963 as a novella of the same name by Charles Webb, a California-born writer who saw his book not as a comic novel but as a serious artistic statement about America’s increasingly disaffected youth. It found its way into the hands of a producer named Lawrence Turman, who saw The Graduate as an opportunity to make the cinematic equivalent of Salinger’s The Catcher in the Rye. Turman optioned the book, then sent it to Mike Nichols, who in 1963 was still best known for his comic partnership with Elaine May but had just made his directorial debut with the original Broadway production of Barefoot in the Park.
Both men saw that The Graduate posed a problem to anyone seeking to put it on the screen. In Turman’s words, “In the book the character of Benjamin Braddock is sort of a whiny pain in the fanny [whom] you want to shake or spank.” To remedy this, they turned to Buck Henry, who had co-created the popular TV comedy Get Smart with Mel Brooks, to write a screenplay that would retain much of Webb’s dryly witty dialogue (“I think you’re the most attractive of all my parents’ friends”) while making Benjamin less priggish.
Nichols’s first major act was casting Dustin Hoffman, an obscure New York stage actor pushing 30, in the title role. No one but Nichols seems to have thought him suitable in any way. Not only was Hoffman short and nondescript-looking, but he was unmistakably Jewish, whereas Benjamin is supposedly the scion of a newly monied WASP family from southern California. Nevertheless, Nichols decided he wanted “a short, dark, Jewish, anomalous presence, which is how I experience myself,” in order to underline Benjamin’s alienation from the world of his parents.
Nichols filled the other roles in equally unexpected ways. He hired the Oscar winner Anne Bancroft, only six years Hoffman’s senior, to play the unbalanced temptress who lures Benjamin into her bed, then responds with volcanic rage when he falls in love with her beautiful daughter Elaine. He and Henry also steered clear of on-screen references to the campus protests that had only recently started to convulse America. Instead, he set The Graduate in a timeless upper-middle-class milieu inhabited by people more interested in social climbing than self-actualization—the same milieu from which Benjamin is so alienated that he is reduced to near-speechlessness whenever his family and their friends ask him what he plans to do now that he has graduated.
The film’s only explicit allusion to its cultural moment is the use on the soundtrack of Simon & Garfunkel’s “The Sound of Silence,” the painfully earnest anthem of youthful angst that is for all intents and purposes the theme song of The Graduate. Nevertheless, Henry’s screenplay leaves little doubt that the film was in every way a work of its time and place. As he later explained to Mark Harris, it is a study of “the disaffection of young people for an environment that they don’t seem to be in sync with.…Nobody had made a film specifically about that.”
This aspect of The Graduate is made explicit in a speech by Benjamin that has no direct counterpart in the novel: “It’s like I was playing some kind of game, but the rules don’t make any sense to me. They’re being made up by all the wrong people. I mean, no one makes them up. They seem to make themselves up.”
The Graduate was Nichols’s second film, following his wildly successful movie version of Edward Albee’s Who’s Afraid of Virginia Woolf? Albee’s play was a snarling critique of the American dream, which he believed to be a snare and a delusion. The Graduate had the same skeptical view of postwar America, but its pessimism was played for laughs. When Benjamin is assured by a businessman in the opening scene that the secret to success in America is “plastics,” we are meant to laugh contemptuously at the smugness of so blinkered a view of life. Moreover, the contempt is as real as the laughter: The Graduate has it both ways. For the same reason, the farcical quality of the climactic scene (in which Benjamin breaks up Elaine’s marriage to a handsome young WASP and carts her off to an unknown fate) is played without musical underscoring, a signal that what Benjamin is doing is really no laughing matter.
The youth-oriented message of The Graduate came through loud and clear to its intended audience, which paid no heed to the mixed reviews from middle-aged reviewers unable to grasp what Nichols and Henry were up to. Not so Roger Ebert, the newly appointed 25-year-old movie critic of the Chicago Sun-Times, who called The Graduate “the funniest American comedy of the year…because it has a point of view. That is to say, it is against something.”
Even more revealing was the response of David Brinkley, then the co-anchor of NBC’s nightly newscast, who dismissed The Graduate as “frantic nonsense” but added that his college-age son and his classmates “liked it because it said about the parents and others what they would have said about us if they had made the movie—that we are self-centered and materialistic, that we are licentious and deeply hypocritical about it, that we try to make them into walking advertisements for our own affluence.”
A year after the release of The Graduate, a film-industry report cited in Pictures at a Revolution revealed that “48 percent of all movie tickets in America were now being sold to filmgoers under the age of 24.” A very high percentage of those tickets were for The Graduate and Bonnie and Clyde. At long last, Hollywood had figured out what the Baby Boomers wanted to see.

And how does The Graduate look a half-century later? To begin with, it now appears to have been Mike Nichols’s creative “road not taken.” In later years, Nichols became less an auteur than a Hollywood director who thought like a Broadway director, choosing vehicles of solid middlebrow-liberal appeal and serving them faithfully without imposing a strong creative vision of his own. In The Graduate, by contrast, he revealed himself to be powerfully aware of the same European filmmaking trends that shaped Bonnie and Clyde. Within a naturalistic framework, he deployed non-naturalistic “new wave” cinematographic techniques with prodigious assurance—and he was willing to end The Graduate on an ambiguous note instead of wrapping it up neatly and pleasingly, letting the camera linger on the unsure faces of Hoffman and Katharine Ross as they ride off into an unsettling future.
It is this ambiguity, coupled with Nichols’s prescient decision not to allow The Graduate to become a literal portrayal of American campus life in the troubled mid-’60s, that has kept the film fresh. But The Graduate is fresh in a very particular way: It is a young person’s movie, the tale of a boy-man terrified by the prospect of growing up to be like his parents. Therein lay the source of its appeal to young audiences. The Graduate showed them what they, too, feared most, and hinted at a possible escape route.
In the words of Beverly Gray, who saw The Graduate when it first came out in 1967: “The Graduate appeared in movie houses just as we young Americans were discovering how badly we wanted to distance ourselves from the world of our parents….That polite young high achiever, those loving but smothering parents, those comfortable but slightly bland surroundings: They combined to form an only slightly exaggerated version of my own cozy West L.A. world.”
Yet to watch The Graduate today—especially if you first saw it when much younger—is also to be struck by the extreme unattractiveness of its central character. Hoffman plays Benjamin not as the comically ineffectual nebbish of Jewish tradition but as a near-catatonic robot who speaks by turns in a flat monotone and a frightened nasal whine. It is impossible to understand why Mrs. Robinson would want to go to bed with such a mousy creature, much less why Elaine would run off with him—an impression that has lately acquired an overlay of retrospective irony in the wake of accusations that Hoffman has sexually harassed female colleagues on more than one occasion. Precisely because Benjamin is so unlikable, it is harder for modern-day viewers to identify with him in the same way as did Gray and her fellow Boomers. To watch a Graduate-influenced film like Noah Baumbach’s Kicking and Screaming (1995), a poignant romantic comedy about a group of Gen-X college graduates who deliberately choose not to get on with their lives, is to see a closely similar dilemma dramatized in an infinitely more “relatable” way, one in which the crippling anxiety of the principal characters is presented as both understandable and pitiable, thus making it funnier.
Be that as it may, The Graduate is a still-vivid snapshot of a turning point in American cultural history. Before Benjamin Braddock, American films typically portrayed men who were not overgrown, smooth-faced children but full-grown adults, sometimes misguided but incontestably mature. After him, permanent immaturity became the default position of Hollywood-style masculinity.
For this reason, it will be interesting to see what the Millennials, so many of whom demand to be shielded from the “triggering” realities of adult life, make of The Graduate if and when they come to view it. I have a feeling that it will speak to a fair number of them far more persuasively than it did to those of us who—unlike Benjamin Braddock—longed when young to climb the high hill of adulthood and see for ourselves what awaited us on the far side.
1 Algonquin, 278 pages
“I think that’s best left to states and locales to decide,” DeVos replied. “If the underlying question is . . .”
Murphy interrupted. “You can’t say definitively today that guns shouldn’t be in schools?”
“Well, I will refer back to Senator Enzi and the school that he was talking about in Wapiti, Wyoming, I think probably there, I would imagine that there’s probably a gun in the school to protect from potential grizzlies.”
Murphy continued his line of questioning unfazed. “If President Trump moves forward with his plan to ban gun-free school zones, will you support that proposal?”
“I will support what the president-elect does,” DeVos replied. “But, senator, if the question is around gun violence and the results of that, please know that my heart bleeds and is broken for those families that have lost any individual due to gun violence.”
Because all this happened several million outrage cycles ago, you may have forgotten what happened next. Rather than mention DeVos’s sympathy for the victims of gun violence, or her support for federalism, or even her deference to the president, the media elite fixated on her hypothetical aside about grizzly bears.
“Betsy DeVos Cites Grizzly Bears During Guns-in-Schools Debate,” read the NBC News headline. “Citing grizzlies, education nominee says states should determine school gun policies,” reported CNN. “Sorry, Betsy DeVos,” read a headline at the Atlantic, “Guns Aren’t a Bear Necessity in Schools.”
DeVos never said that they were, of course. Nor did she “cite” the bear threat in any definitive way. What she did was decline the opportunity to make a blanket judgment about guns and schools because, in a continent-spanning nation of more than 300 million people, one standard might not apply to every circumstance.
After all, there might be—there are—cases when guns are necessary for security. Earlier this year, Virginia Governor Terry McAuliffe signed into law a bill authorizing some retired police officers to carry firearms while working as school guards. McAuliffe is a Democrat.
In her answer to Murphy, DeVos referred to a private meeting with Senator Enzi, who had told her of a school in Wyoming that has a fence to keep away grizzly bears. And maybe, she reasoned aloud, the school might have a gun on the premises in case the fence doesn’t work.
As it turns out, the school in Wapiti is gun-free. But we know that only because the Washington Post treated DeVos’s offhand remark as though it were the equivalent of Alexander Butterfield’s revealing the existence of the secret White House tapes. “Betsy DeVos said there’s probably a gun at a Wyoming school to ward off grizzlies,” read the Post headline. “There isn’t.” Oh, snap!
The article, like the one by NBC News, ended with a snarky tweet. The Post quoted user “Adam B.,” who wrote, “‘We need guns in schools because of grizzly bears.’ You know what else stops bears? Doors.” Clever.
And telling. It becomes more difficult every day to distinguish between once-storied journalistic institutions and the jabbering of anonymous egg-avatar Twitter accounts. The eagerness with which the press misinterprets and misconstrues Trump officials is something to behold. The “context” the best and brightest in media are always eager to provide us suddenly goes poof when the opportunity arises to mock, impugn, or castigate the president and his crew. This tendency is especially pronounced when the alleged gaffe fits neatly into a prefabricated media stereotype: that DeVos is unqualified, say, or that Rick Perry is, well, Rick Perry.
On November 2, the secretary of energy appeared at an event sponsored by Axios.com and NBC News. He described a recent trip to Africa:
It’s going to take fossil fuels to push power out to those villages in Africa, where a young girl told me to my face, “One of the reasons that electricity is so important to me is not only because I won’t have to try to read by the light of a fire, and have those fumes literally killing people, but also from the standpoint of sexual assault.” When the lights are on, when you have light, it shines the righteousness, if you will, on those types of acts. So from the standpoint of how you really affect people’s lives, fossil fuels is going to play a role in that.
This heartfelt story of the impact of electrification on rural communities was immediately distorted into a metaphor for Republican ignorance and cruelty.
“Energy Secretary Rick Perry Just Made a Bizarre Claim About Sexual Assault and Fossil Fuels,” read the Buzzfeed headline. “Energy Secretary Rick Perry Says Fossil Fuels Can Prevent Sexual Assault,” read the headline from NBC News. “Rick Perry Says the Best Way to Prevent Rape Is Oil, Glorious Oil,” said the Daily Beast.
“Oh, that Rick Perry,” wrote Gail Collins in a New York Times column. “Whenever the word ‘oil’ is mentioned, Perry responds like a dog on the scent of a hamburger.” You will note that the word “oil” is not mentioned at all in Perry’s remarks.
You will note, too, that what Perry said was entirely commonsensical. While the precise relation between public lighting and public safety is unknown, who can doubt that brightly lit areas feel safer than dark ones—and that, as things stand today, cities and towns are most likely to be powered by fossil fuels? “The value of bright street lights for dispirited gray areas rises from the reassurance they offer to some people who need to go out on the sidewalk, or would like to, but lacking the good light would not do so,” wrote Jane Jacobs in The Death and Life of Great American Cities. “Thus the lights induce these people to contribute their own eyes to the upkeep of the street.” But c’mon, what did Jane Jacobs know?
No member of the Trump administration so rankles the press as the president himself. On the November morning I began this column, I awoke to outrage that President Trump had supposedly violated diplomatic protocol while visiting Japan and its prime minister, Shinzo Abe. “President Trump feeds fish, winds up pouring entire box of food into koi pond,” read the CNN headline. An article on CBSNews.com headlined “Trump empties box of fish food into Japanese koi pond” began: “President Donald Trump’s visit to Japan briefly took a turn from formal to fishy.” A Bloomberg reporter traveling with the president tweeted, “Trump and Abe spooning fish food into a pond. (Toward the end, @potus decided to just dump the whole box in for the fish).”
Except that’s not what Trump “decided.” In fact, Trump had done exactly what Abe had done a few seconds before. That fact was buried in write-ups of the viral video of Trump and the fish. “President Trump was criticized for throwing an entire box of fish food into a koi pond during his visit to Japan,” read a tweet from the New York Daily News, linking to a report on phony criticism Trump received because of erroneous reporting from outlets like the News.
There’s an endless, circular, Möbius-strip-like quality to all this nonsense. Journalists are so eager to catch the president and his subordinates doing wrong that they routinely traduce the very canons of journalism they are supposed to hold dear. Partisan and personal animus, laziness, cynicism, and the oversharing culture of social media are a toxic mix. The press in 2017 is a lot like those Japanese koi fish: frenzied, overstimulated, and utterly mindless.
Review of 'Lessons in Hope' by George Weigel
Standing before the eternal flame, a frail John Paul shed silent tears for 6 million victims, including some of his own childhood friends from Krakow. Then, after reciting verses from Psalm 31, he began: “In this place of memories, the mind and heart and soul feel an extreme need for silence. … Silence, because there are no words strong enough to deplore the terrible tragedy of the Shoah.” Parkinson’s disease strained his voice, but it was clear that the pope’s irrepressible humanity and spiritual strength had once more stood him in good stead.
George Weigel watched the address from NBC’s Jerusalem studios, where he was providing live analysis for the network. As he recalls in Lessons in Hope, his touching and insightful memoir of his time as the pope’s biographer, “Our newsroom felt the impact of those words, spoken with the weight of history bearing down on John Paul and all who heard him: normally a place of bedlam, the newsroom fell completely silent.” The pope, he writes, had “invited the world to look, hard, at the stuff of its redemption.”
Weigel, a senior fellow at the Ethics and Public Policy Center, published his biography of John Paul in two volumes, Witness to Hope (1999) and The End and the Beginning (2010). His new book completes a John Paul triptych, and it paints a more informal, behind-the-scenes portrait. Readers, Catholic and otherwise, will finish the book feeling almost as though they knew the 264th occupant of the chair of St. Peter. Lessons in Hope is also full of clerical gossip. Yet Weigel never loses sight of his main purpose: to illuminate the character and mind of the “emblematic figure of the second half of the twentieth century.”
The book’s most important contribution comes in its restatement of John Paul’s profound political thought at a time when it is sorely needed. Throughout, Weigel reminds us of the pope’s defense of the freedom of conscience; his emphasis on culture as the primary engine of history; and his strong support for democracy and the free economy.
When the Soviet Union collapsed, the pope continued to promote these ideas in such encyclicals as Centesimus Annus. The 1991 document reiterated the Church’s opposition to socialist regimes that reduce man to “a molecule within the social organism” and trample his right to earn “a living through his own initiative.” Centesimus Annus also took aim at welfare states for usurping the role of civil society and draining “human energies.” The pope went on to explain the benefits, material and moral, of free enterprise within a democratic, rule-of-law framework.
Yet a libertarian manifesto Centesimus Annus was not. It took note of free societies’ tendency to breed spiritual poverty, materialism, and social incohesion, which in turn could lead to soft totalitarianism. John Paul called on the state, civil society, and the people of God to supply the “robust public moral culture” (in Weigel’s words) that would curb these excesses and ensure that free-market democracies are ordered to the common good.
When Weigel emerged as America’s preeminent interpreter of John Paul, in the 1980s and ’90s, these ideas were ascendant among Catholic thinkers. In addition to Weigel, proponents included the philosopher Michael Novak and Father Richard John Neuhaus of First Things magazine (both now dead). These were faithful Catholics (in Neuhaus’s case, a relatively late convert) nevertheless at peace with the free society, especially the American model. They had many qualms about secular modernity, to be sure. But for them, there was no question that free societies and markets are preferable to unfree ones.
How things have changed. Today all the energy in those Catholic intellectual circles is generated by writers and thinkers who see modernity as beyond redemption and freedom itself as the problem. For them, the main question is no longer how to correct the free society’s course (by shoring up moral foundations, through evangelization, etc.). That ship has sailed or perhaps sunk, according to this view. The challenges now are to protect the Church against progressivism’s blows and to see beyond the free society as a political horizon.
Certainly the trends that worried John Paul in Centesimus Annus have accelerated since the encyclical was issued. “The claim that agnosticism and skeptical relativism are the philosophy and the basic attitude which correspond to democratic forms of political life” has become even more hegemonic than it was in 1991. “Those who are convinced that they know the truth and firmly adhere to it” increasingly get treated as ideological lepers. And with the weakening of transcendent truths, ideas are “easily manipulated for reasons of power.”
Thus a once-orthodox believer finds himself or herself compelled to proclaim that there is no biological basis to gender; that men can menstruate and become pregnant; that there are dozens of family forms, all as valuable and deserving of recognition as the conjugal union of a man and a woman; and that speaking of the West’s Judeo-Christian patrimony is tantamount to espousing white supremacy. John Paul’s warnings read like a description of the present.
The new illiberal Catholics—a label many of these thinkers embrace—argue that these developments aren’t a distortion of the idea of the free society but represent its very essence. This is a mistake. Basic to the free society is the freedom of conscience, a principle enshrined in democratic constitutions across the West and, I might add, in the Catholic Church’s post–Vatican II magisterium. Under John Paul, religious liberty became Rome’s watchword in the fight against Communist totalitarianism, and today it is the Church’s best weapon against the encroachments of secular progressivism. The battle is far from lost, moreover. There is pushback in the courts, at the ballot box, and online. Sometimes it takes demagogic forms that should discomfit people of faith. Then again, there is a reason such pushback is called “reaction.”
A bigger challenge for Catholics prepared to part ways with the free society as an ideal is this: What should Christian politics stand for in the 21st century? Setting aside dreams of reuniting throne and altar and similar nostalgia, the most cogent answer offered by Catholic illiberalism is that the Church should be agnostic with respect to regimes. As Harvard’s Adrian Vermeule has recently written, Christians should be ready to jettison all “ultimate allegiances,” including to the Constitution, while allying with any party or regime when necessary.
What at first glance looks like an uncompromising Christian politics—cunning, tactical, and committed to nothing but the interests of the Church—is actually a rather passive vision. For a Christianity that is “radically flexible” in politics is one that doesn’t transform modernity from within. In practice, it could easily come to resemble the Vatican’s Ostpolitik, the diplomacy that sought to appease Moscow before John Paul was elected.
Karol Wojtyła discarded Ostpolitik as soon as he took the Petrine office. Instead, he preached freedom and democracy—and meant it. Already as archbishop of Krakow under Communism, he had created free spaces where religious and nonreligious dissidents could engage in dialogue. As pope, he expressed genuine admiration for the classically liberal and decidedly secular Vaclav Havel. He hailed the U.S. Constitution as the source of “ordered freedom.” And when, in 1987, the Chilean dictator Augusto Pinochet asked him why he kept fussing about democracy, seeing as “one system of government is as good as another,” the pope responded: No, “the people have a right to their liberties, even if they make mistakes in exercising them.”
The most heroic and politically effective Christian figure of the 20th century, in other words, didn’t follow the path of radical flexibility. His Polish experience had taught him that there are differences between regimes—that some are bound to uphold conscience and human dignity, even if they sometimes fall short of these commitments, while others trample rights by design. The very worst of the latter kind could even whisk one’s boyhood friends away to extermination camps. There could be no radical Christian flexibility after the Holocaust.