Foreign policy has once again become a matter of consequential dispute in American political life. But as Norman Podhoretz observed last month in “Strange Bedfellows: A Guide to the New Foreign-Policy Debates,” any number of well-known figures at different points on the political and ideological spectrum seem to have altered their accustomed views of the U.S. role in the world. His essay drew a taxonomy of these shifting attitudes over the last quarter-century and especially over the last few years, before articulating a position of its own (in a phrase, “post-Reagan Reaganism”) with regard to the uses of American power in the period we have lately entered.
In an effort to broaden and extend this discussion, the editors solicited reactions to Mr. Podhoretz’s analysis, mostly but not entirely from the diverse individuals whose positions he cited. Our invitation to them noted that, in addition to commenting on specific points raised by “Strange Bedfellows,” “we’d be especially grateful for a reasoned statement of your own ‘take’ on the current American role in the world and on the proper direction of American foreign policy now and in the years ahead.”
The responses, 21 in all, appear below in alphabetical order.
This symposium is sponsored by the Edwin Morris Gale Memorial Fund.
Anyone instructed in international relations during the last two centuries would know about the centrality of the balance of power. But the recent emergence of the United States as the dominant world power constitutes a radical change from that condition. The key question we now face is whether to preserve this dominance, or whether to view it as a danger to ourselves and others.
As a neoconservative and neo-Reaganite, labels Norman Podhoretz places on me and that I accept, my own answer is obvious: preserving our dominance will not only advance our own national interests but will preserve peace and promote the cause of democracy and human rights. Since America’s emergence as a world power roughly a century ago, we have made many errors, but we have been the greatest force for good among the nations of the earth. A diminution in American power or influence bodes ill for our country, our friends, and our principles.
What might threaten our leading role? I am less sanguine than Podhoretz may be and than Henry Kissinger clearly is about the People’s Republic of China. Just as during the cold war the problem was not Russia but Soviet Communism, in the PRC today the problem is rule by a Communist elite whose interests contradict those of its own people—and ours. As the strategist Coral Bell notes in the Fall 1999 issue of the National Interest, items at issue in the “possible collision course” between Washington and Beijing include the survival of Taiwan, the fate of North Korea, the U.S. alliance with Japan, the American naval and troop presence in East Asia, the prospect of a missile-defense umbrella over Japan and Taiwan, and PRC human-rights violations, not least in Tibet. With all these very much in mind, the PRC has increased its military spending by half during a decade when the rest of the world has used the post-cold-war calm to reduce defense expenditures. Chinese rulers are rushing to build a modern force that can dominate East Asia, and are supplying pariah states around the world with the latest missile and nuclear technology.
The Chinese regime, like all Communist regimes, is fundamentally insecure because it does not rest on popular support. Although it is trying to win a measure of legitimacy by improving living standards and allowing some additional personal autonomy, this will not work: as always with “goulash Communism,” whatever the local recipe, people will like the goulash but not the Communism. The recent crackdown on the apparently harmless Falun Gong movement and the continuing refusal to allow freedom of religion demonstrate how limited personal autonomy is likely to be and how extraordinarily insecure the government feels. As in the Soviet case, the regime will seek military power and success as a means of improving its legitimacy—and of cowing both its own people and its Asian neighbors.
In the Chinese case, however, there is a key difference Podhoretz does not mention: American business is pro-Chinese in a way in which it was never pro-Soviet. Too little money was at stake in Russia. By contrast, the corporate community has persuaded itself that, despite current setbacks, there are vast fortunes to be made even in a Communist China. Consequently, those promoting a tough line toward China’s human-rights violations and its aggressive foreign policy face resistance not only in Beijing but in the U.S. Chamber of Commerce. As reaction to the recent agreement on Chinese membership in the World Trade Organization suggests, this is already causing divisions within the Republican party between the business community and all stripes of conservatives.
Pointedly acquitting Henry Kissinger of the “scurrilous charge that his ideas about China have been shaped by the commercial interests some clients of his consulting firm have there,” Podhoretz also assures us that realists like Kissinger will advocate containment of the Chinese if the latter “show signs of developing imperial aspirations.” But what “signs” do we need, beyond the extraordinary buildup of conventional ground and naval forces as well as strategic nuclear forces, and China’s renewed threats to Taiwan? To say that this suggests no “imperial aspirations,” not even in Asia, is reminiscent of the line (properly dismissed by Kissinger and Podhoretz alike) that Soviet expansionism in Eastern Europe was “really” defensive and was motivated by fear of the West.
An issue to which few commentators other than Podhoretz have devoted adequate attention is changing notions of sovereignty. Commenting on Kosovo, he writes: “I find it hard to quarrel with the emerging idea that the principle of sovereignty should no longer embrace the right of political leaders to butcher their own people.” Coral Bell, too, points to the “new norms,” especially concerning human rights and the environment, that “legitimate great-power intervention in the crises of lesser powers to a degree seldom envisaged in previous diplomatic history,” and optimistically concludes that “Washington’s current and immediate future generations of diplomatic strategists have as large an opportunity (and as complete a set of tactical choices) before them as those of 1946-47.” But there are real dangers here in addition to opportunities. These “new norms” can be invoked to challenge American power as easily as to justify its use.
If, for instance, a single judge in Spain can force the seizure of General Pinochet in London against the wishes of both the Spanish and Chilean governments, what have we wrought? Can a system in which sovereignty may be breached not only by great powers but by any ambitious jurist really survive, and—a no less urgent question—can it be counted on to protect the rights of Americans? The new International Criminal Court, which treats sovereignty as a mere formality, presents similar difficulties. A long list of treaties now regulates matters once thought to be questions of domestic law.
Podhoretz likens our situation to that of 1919. I lean to Coral Bell’s comparison of the coming decade (or the next presidential term) to the fateful period of 1946-47. Whichever analogy one prefers, in this period, as Podhoretz notes, we will surely face painful decisions about when and where to intervene. But I come out as he does: better to face those decisions than the far worse ones we will encounter if we let our position slip away. Whether the current generation of politicians and strategists is up to the task before them, we shall soon find out.
William F. Buckley, Jr.
Manifestly, Uncle Norman is having trouble outlining a paradigmatic foreign policy, and I do not blame him. He is very good at noting contradictions, anomalies, inconsistencies, and we have had many of them during the Clinton years. They have not, however, resulted in the crystallization of a foreign policy agreeable to—whom? Whom do we want to satisfy? God? Locke? The Founders? Americans with continuing fire in their compassionate bellies? Or maybe the old soldiers, who echo John Quincy Adams in urging Americans concerned for freedom abroad to wish its prospects well, while cultivating it with manual effort only in our own garden.
We know now more clearly than we did when perestroika took over in the Soviet Union and annulled Lenin that the mix that harnessed the energies and direction of so many of us during the cold war—traditional isolationists, traditional interventionists, traditional anti-militarists and their opposites—is gone and is not prospectively replaceable. When all is said and done, Podhoretz concludes that he would rather ally himself with an idealistic interventionism than with a prudent isolationism—a very insufficient term, for which I would substitute measured internationalism. But the difficulty he confronts in sorting things out and coming to this conclusion rests in part on his determination to face the comprehensive problems of foreign policy—what to do when, to whom, under what circumstances—a priori.
That is the intellectually challenging way of doing things, to seek out policy templates. As in, “When a nation threatens other nations, we will intervene.” Or even, “When a nation threatens its own people, we will intervene.” Or maybe even, “When a non-democratic nation can reasonably be assumed to be developing an ABC (atomic/biological/chemical) weapons capability, we will intervene.” How to intervene, at risk of how much sacrifice, is a subordinate question, and of course subordinate questions generate sub-subordinate questions. Podhoretz balks at answering categorically any of these questions, leaving us only with his inclination in favor of an interventionist over a restrictive policy.
Well, coward that I am, I am going to proceed not a priori, but a fortiori: not by laying down constitutional formulations, but by asking one question. What are we going to do about Taiwan? In exploring that, perhaps light will be shed on structural questions as well.
About once every two years, beginning in 1962, I have counseled a foreign policy encouraging Taiwan to declare its independence of the Communist mainland. I knew how adamantly secession was opposed by Taiwan itself after my first visit there in 1962, and was still opposed, though with less than unanimity, when I was last there in 1992 on my fourth visit. Although it violates my rule against retrospective improvisation, I nevertheless observe that if we had succeeded in the 1960’s in persuading the Taiwanese to take progressive steps to detach themselves from Communist China—seminars, rallies, plebiscites, declarations of independence, applications to the United Nations—the gestation of a democratically governed independent state of twenty million people, appealing to the Supreme Judge of the world for the rectitude of their intentions and unfurling their own flag, would have considerably hampered Chinese irredentism.
But now there is the critical problem that the China of the new millennium will be making its claims with an arsenal of nuclear weapons and missiles and will be asserting jurisdictional sovereignty over an island that itself claims to be the legitimate government of China but that successive United States administrations, bowing to Beijing, have treated as a constituent part of China—whatever the peculiarities of a Chinese province that governs itself, has its own army and navy, and swears not to submit to the political authority of Beijing until there is a change there of counterrevolutionary dimensions.
At the same time, we are “pledged” to the qualified defense of Taiwan. It is fair then to ask what we will do if the day comes when Beijing issues an ultimatum—for instance, by demanding that Taipei must disband its military forces and receive a mainland delegation that will take effective control of the government. We all know that the object of statecraft is to abort crisis, and so far this has worked—Taiwan is self-governed and we are at peace with China. But what will we do when, confident of its resources and of its cause, Beijing looks us in the face and asks, Do you want Taiwan so badly as to countenance a nuclear bomb on Honolulu?
Podhoretz might plead that, in such a hypothetical situation, we are simply back to a reenactment of Mutual Assured Destruction. But I want to hear it said: do we favor running the risk of nuclear war in order to preserve Taiwan’s independence? My own answer would be yes, an answer given not in anticipation of the probable loss of Honolulu but in anticipation of a Chinese backing-down. But I solicit debate on this point, after which—arguing, as I say, a fortiori—I will be able to reason back to fundamental assertions concerning U.S. foreign policy.
Lacking a consensus, would we at that point start asking questions like: is Taiwan really vital to our strategic position? Or, what is a vital interest in a nuclear age? On the other hand, what about our tacit pledges to the continuing integrity of Taiwan? Did we not pass the Taiwan Relations Act in 1979, and have we not reaffirmed our commitment to ensure its de-facto independence? Are there other Taiwans here and there about the globe, little Kuwaits we are pledged to defend?
In a symposium like this one, we are, perhaps, castrated by the knowledge that things change. Governments change—ours, theirs—and language changes. The State Department in mid-November gave out the word that we would consider ending our blockade of the former Yugoslavia if there were free elections there, even if these renewed the mandate of Slobodan Milosevic. The same President who okayed this compromise told us a year ago that Milosevic was this day-and-time’s equivalent of Hitler. So if Hitler is democratically reaffirmed, Hitler becomes okay? Or was “Hitler” just an inappropriate metaphor? Are there other inappropriate metaphors hanging around? Our commitment to Taiwan?
Eliot A. Cohen
The cold-war foreign-policy debate offered its consolations, not least of which were those of comradeship. Conservatives of all stripes could band together in the knowledge that the foreign enemy really did represent a streak of evil in human nature; domestic opponents seemed, and often were, naive or worse. True, by the late 1980’s, splits had begun to appear in the conservative camp, as some began to realize that the Soviet Union was, in fact, in very sad shape indeed; but by and large, we were all in it together.
No longer, as Norman Podhoretz’s essay makes clear. But although I sympathize with his attempt to distinguish schools of thought—isolationist/realists versus new-American/imperialists—I have doubts about the enterprise. Would the “realist” Charles Krauthammers of this world watch Taiwan be bullied into submission without wanting American power to have a say? I doubt it. Would the “interventionist” William Kristols of this world endorse the dispatch of American troops to keep the peace in the now-independent former Soviet republic of Georgia? I doubt that, too. There are, to be sure, tendencies, sensibilities, and predilections at play, but these do not amount to doctrines. Nor should they.
The international environment in which the United States operates does not lend itself to programmatic statements. Take the case of China, which does loom as a foreign-policy problem for the U.S. It is not the evil empire of the cold-war past, but rather a complex, turbulent society burdened with a corrupt and ideologically bankrupt regime that is, nonetheless, tolerating (or unable to resist) the gradual spread of some political as well as economic freedoms. There is nationalism and paranoia—and a tremendous desire to attend Western colleges and universities and to take advantage of the fruits of the West’s economic system. Some nuanced mixture of confrontation and engagement is surely the right policy, and sweeping concepts will not tell us what that mix is.
Or take Kosovo, where, on balance, military intervention stopped some truly dreadful events in their course, although it did not (and could not) bring about the multiethnic, autonomous-but-not-independent Balkan Switzerland sought by the Clinton administration. The critical judgments required in that case were prudential: what price would the United States pay for intervening, and what for refraining? The Clinton administration got it right, and its opponents, on the whole, got it wrong, but theoretical rigor had nothing to do with it.
The world is a far more complicated place today than it was during the cold war. The creative destruction of information-driven capitalism; the deeper forces of demography and Americanization; the undeniable (if not altogether desirable) increase in the power of international media and non- or supra-state actors; the working-out of the long-term consequences of the collapse of old empires; the impossible-to-anticipate shocks of disease and ecological disaster—all have created a messy world that doctrines do not fit. There are, of course, a few first principles. No one (probably not even some members of the Chinese Politburo) would like to see the United States lose its status in the international system as the fundamental guarantor of an open trading order. By and large, democracy is better than dictatorship, not only in terms of fundamental decency and civil rights but as a source of peaceful relations among states. For the application of these and a few principles like them, however, no hard and fast rules can be found.
Prudence, then, should be the unsatisfactory and platitudinous watchword of American foreign policy. Podhoretz’s anxiety about American isolation, which he shares with so many commentators on American foreign affairs, is excessive. America has been engaged with the outside world since the founding of colonies more than three and a half centuries ago. No President for the last 50 years, and no serious presidential candidate now or in prospect, has advocated anything like isolationism. None has proposed that this country be anything other than the world’s dominant military power, or that it simply disregard the political problems of distant regions like the Persian Gulf or East Asia. There is a great deal more healthy good sense out there than one might think, and although one side may run foreign policy more or less competently than the other, even an administration as prone to pratfalls and as anti-realistic (in the foreign-policy sense) as the current one has done not too badly, all things considered. Saddam Hussein is still under constraint; our alliances are intact; China has been deterred from muscling Taiwan too much; Russia, if disgruntled, is not actively hostile.
The poisonous legacy of foreign-policy partisanship left by the cold war obscures the fundamental consensus that is a tremendous source of national strength. A season of presidential campaigns will probably heighten artificial distinctions between the parties, as well as highlighting the real ones that do exist. But sober sense would suggest that Americans acknowledge their good fortune in being able to agree about many things.
This consensus will prove necessary, because considerable foreign-policy problems loom ahead. The rise of China will require a sophisticated blend of measures to shape, accommodate, and in some cases contain the emergence of this new power with no previous experience of operating as an equal in the international system. The collapse of states as different as Colombia and Pakistan—and others may follow—will pose severe challenges to neighbors and ultimately to the United States. The diffusion of weapons of mass destruction is real, and with it the possibility of ghastly attacks on civilian populations. The tight linkage of financial markets through the new information economy may produce unanticipated economic calamities, while the disparities between rich and poor states may only grow.
All of these, however, are large problems to be managed and handled; pragmatic skill, not ideological clarity, is what the times call for.
In his usual incisive way, Norman Podhoretz has dissected the strange permutations and combinations of attitudes toward foreign policy that have arisen in the decade since the fall of the Berlin Wall. There are, however, three gaps in the story that need to be filled in.
The first has to do with isolationism on the Right, which Podhoretz tends to underplay. He is, of course, quite correct to denounce the Clinton administration’s shortsighted partisanship in attacking “Republican isolationism” in the wake of the Senate’s failure to ratify the Comprehensive Test Ban Treaty. And it would be convenient if right-wing isolationism began and ended with Patrick J. Buchanan. But Buchanan is not a major force in American politics, while the Republicans in Congress are; and there lies a problem.
Many in this group are not, like Buchanan, isolationist out of principle. Rather, they are driven to the same position simply by catering to their constituents. It is a Republican-majority Congress that has been complicit in underfunding the military, with budget hawks like John Kasich leading the charge. This is the group that routinely gets applause lines by attacking foreign aid (the lion’s share of which continues to support Israel and Egypt and the peace process). Their small-mindedness is similarly on display in repeated efforts to cut the National Endowment for Democracy’s minuscule funding. During the Kosovo conflict, the House Republicans voted, just as congressional Democrats had done in the early 1970’s, to cut off funds for U.S. forces already deployed in the field. Although there was a time when Republicans could be counted on to support free trade against Democratic opposition, today, in spite of record low levels of unemployment and high levels of U.S. competitiveness, their enthusiasm has eroded substantially.
How serious right-wing isolationism is remains to be seen. Many congressional Republicans have been driven crazy by President Clinton and, in the wake of the impeachment scandal, will simply oppose anything he supports. Since this is an opportunistic rather than a principled isolationism, it may be that a Republican President in 2000 could lead them out of their morass. Or it may be that Republican internationalists are actually much more alone in their own party than they realize.
The second gap in the argument is related to the first. The relative value of realism (in its various versions) as opposed to what Podhoretz labels the “neo-Reaganite” foreign policy espoused by the Weekly Standard has to be assessed not just in terms of the intrinsic merits of each position but also by the extent to which one or the other can serve as the basis for a conservative internationalism, and thus as a means of guiding conservatives back from the isolationist abyss.
Podhoretz quotes William Kristol and Robert Kagan to the effect that “Without a broad, sustaining foreign-policy vision, the American people will be inclined to withdraw from the world and will lose sight of their abiding interest in vigorous world leadership.” Well, maybe. There is a fundamental ambiguity at the core of the Weekly Standard position concerning the exact mix of “interests and ideals” that should define American foreign policy. Podhoretz is right that the neoconservative position is more hard-headed than that of (say) Anthony Lewis, insofar as it recognizes the importance of American credibility and prestige. But the ends this policy serves are expansive ones that often amount to support for “humanitarian intervention.”
In such circumstances, the need to defend U.S. credibility becomes a self-fulfilling prophecy: if you intervene in areas not vital to your strategic interests, your credibility will indeed be tested. Had the U.S. not threatened to bomb Serbia in support of the Rambouillet accords (something the realist Henry Kissinger opposed at the time), NATO would never have faced the severe credibility crisis that ensued, from which it emerged by the skin of its teeth.
It is fine to be nostalgic for Reaganism, but Reagan (like Churchill before him) was rhetorically successful because he actually faced an evil empire against which he could rouse a sleeping public. It is not clear that the American public can be similarly roused in situations that amount to humanitarian intervention, disguised as it may be by a thin coating of national interest. More than the different varieties of realism, a policy that cannot clearly define where we should not intervene risks being seriously out of touch with the American public.
A final gap in Podhoretz’s account is his silence on the subject of globalization. Twenty-five years ago, when I was studying international politics in graduate school, high politics revolved around nuclear alerts and Henry Kissinger’s shuttlings to the Middle East. Today, high politics is IMF Managing Director Michel Camdessus jetting off to Seoul or Jakarta to arrange a bailout. Save for Buchanan and Edward N. Luttwak (both of whom seem to have gone off the rails when it comes to economics), none of the figures mentioned in Podhoretz’s article has tried to come to terms with the global economy or has anything particularly interesting to say about it.
True, the old world of power politics does rear its head occasionally, in peripheral areas like the Balkans or Transcaucasus; and it may come roaring back in a big way among the great powers at some point in the future. But in the meantime, the major issues in global affairs will concern exchange-rate stability, wage inequality, the World Trade Organization, institution-building in transitional economies, the impact of information technology, capital controls, and a host of other difficult policy issues. Today, internationalism and engagement are more properly matters of how the U.S. and the international financial institutions can help Russia or China or Ukraine build free markets and democracy, rather than the conditions under which the U.S. will or will not use military force.
Frank J. Gaffney, Jr.
The old line, “You can’t tell the players without a scorecard,” never seemed more apt than at present, in what might be called the post-post-cold-war environment, with its bewildering array of specialists, pundits, and practitioners. Norman Podhoretz has performed a singularly useful service in providing such an annotated scorecard, and it is delicious that, in the process, he has settled a few scores as well.
I have few serious disagreements with the way Podhoretz has parsed the various schools of thought and explicated the often torturesome paths by which some have arrived at their present stances and intellectual alliances. His most helpful contribution, however, may be the attention he has focused on the two camps that have deviated the least from their principles—those determined to leave no stone unturned in the cause of constraining American military power, and those who believe that American power is indispensable not only to protecting our interests but to promoting a relatively peaceable world.
To be sure, as Podhoretz recounts, some prominent adherents of the first school—variously described as unilateral disarmers, aggressive multilateralists, and Utopian One Worlders—have prominently backed U.S. participation in controversial military operations from Haiti to Kosovo. These include Bill Clinton, Madeleine Albright, and Samuel Berger. I would argue, however, that in such advocacy they have actually deviated less from their policy roots than might appear at first glance.
In fact, the diversion of America’s might and its dissipation in the service of “selfless” international causes have been hardy perennials of the self-described “global-security” camp. In the past, this predilection has taken the form of support for a standing UN army, to which the assets and personnel of our armed forces would be permanently assigned or subordinated. The hallmark here is the proposition that only military operations authorized by the Security Council are “legitimate.” Although Albright and Berger have on occasion found themselves sharply attacked by others who share their objectives (famously at Ohio State University when the Secretary of State and National Security Adviser, together with Secretary of Defense William S. Cohen, tried haplessly to sell Clinton’s war option for Iraq to jeering antiwar activists), their agenda in toto is largely indistinguishable from that of their sometime critics on the Left.
This reality can be seen most clearly in the microcosm of arms control, where no perceptible difference separates the Clinton administration’s stated ambitions for ridding the world of weapons of mass destruction through negotiated agreements from the position of more radical activists. The Senate’s courageous rejection of the Comprehensive Test Ban Treaty—no product of partisanship, let alone isolationism—offered proof positive of how far removed Clinton and Gore are from sensible and responsible thinking about nuclear deterrence and its requirements, and how squarely they fit into the traditions of the ban-the-bomb and nuclear-freeze movements.
I am proud to be placed by Norman Podhoretz in the second of the two policy schools that have remained true to their principles—the so-called “neo-Reaganite” camp. While, as he notes, there are some differences within this group, they mostly center on how to apply those shared principles in specific cases, rather than on the principles themselves.
Fundamentally, we agree that the main threat arises not from the United States’ being too powerful but from its being perceived abroad as weak and irresolute. That perception, alas, is generally the result of our acting that way at home—a phenomenon all too much in evidence during the Clinton years. It is no coincidence that during this period we have witnessed serious erosion in America’s alliances, escalating proliferation, an ominous “strategic partnership” being forged between the Russians and Chinese, and the growing power of rogue states and terrorist organizations. These are tectonic shifts in the geopolitical plate structure with which we will have to contend for years to come.
Given history’s harsh treatment of the “disarm-the-one-you’re-with” school, and its general vindication of the school of peace-through-strength, it is remarkable indeed that there is still something to debate. And yet, the former camp is not only still a factor in policy deliberations; it is having a disproportionate influence over them. This is due in part to the fact that many in the senior ranks of the Clinton administration are among its adherents. But it is also a byproduct of an effort that has not received the critical attention it warrants: in an increasingly coordinated manner, left-of-center philanthropies are investing more than $140 million per year in the work of academic and activist proponents of an agenda variously called “Human Security,” “Global Security,” “Cooperative Security,” “One World,” and “Secure World.”
Thanks in particular to the sustained generosity of the MacArthur, Ford, W. Alton Jones, Ploughshares, Hewlett, Carnegie, Merck, and Rockefeller foundations, thousands of professors and graduate students have been trained to advance revisionist views of the Reagan legacy, to argue the illegitimacy of American power, and to promote the necessity of global governance. Dozens of think tanks and organizations—some of whose more radical views were in evidence last month during the demonstrations in Seattle against the World Trade Organization—have been underwritten to espouse and legitimate these ideas, and efforts are being expended to cultivate sympathetic treatment in the media. Absent a comparable effort in behalf of more realistic and robust security policies, those who remain committed to what Jeane J. Kirkpatrick once rightly ridiculed as a “blame-America-first” philosophy may exploit the confusion and incoherence of the contemporary debate to dictate its outcome.
Already, the Clinton-Gore administration is far along in its efforts to institutionalize multilateral arrangements that threaten sharply to limit American sovereignty and freedom of action. Its phony arms-control agreements and generally vapid “peace processes,” its hollowing-out of the U.S. military, and its coddling of adversaries at the expense of America’s allies portend a legacy frighteningly reminiscent of the prewar period of the 1930’s. Norman Podhoretz is to be commended for warning of this danger, and for pointing the way if we are to avoid a calamity that might, if anything, exceed that which followed an earlier heyday of pacifism and utopianism.
Norman Podhoretz’s essay is mainly about discontinuity: how, over the last decade, positions have altered and alignments shifted, and how strange and unpredictable the ideological landscape has become.
As usual, Podhoretz gets to the heart of the matter, even if he does so in slightly loaded terms. For the current divisions among conservatives about foreign policy are essentially between those who believe that the policies and habits of mind that characterized the cold war are still valid and appropriate, and those who do not. This division cannot be usefully discussed in terms of “steadfastness” and “abandonment,” as if what is involved is some test of character. What the argument should be about is not consistency but appropriateness—the appropriateness of old policies and assumptions in radically changed circumstances.
For over 40 years, we anti-Communist cold warriors insisted that the United States, and indeed Western civilization, faced a life-threatening crisis. Our adversary was powerful, ruthless, and bent on our destruction. His values were inimical to ours. The struggle was global.
In these very special circumstances, and as a matter of survival, we insisted that foreign and defense policies should be given overriding priority, that an extraordinary level of global commitment and activity was justified. The exceptional nature of the times required exceptional measures.
Then, abruptly, the cold war ended. This left the United States victorious, supreme, and about as unthreatened as a great power can be in a system of independent states. In these entirely changed circumstances, it seemed to me that a fundamental reassessment of American foreign policy was not only desirable but essential—certainly in terms of the country’s needs, but also in terms of our moral and intellectual integrity as cold warriors. For if the existence of the cold war had been so special, requiring a profound reevaluation of our role in the world, then the same was surely true of the ending of the cold war. To turn around now and treat as normal and permanent policies that had been adopted and justified because of an existential crisis seemed fraudulent.
Those who now argue for continuation of cold-war levels of activism and commitments usually invoke one or more of three grounds. First, they maintain that while there is no longer one great threat facing us, there are a number of smaller ones and that these are collectively as dangerous as the threat posed by Communism a generation ago. This, I believe, is nonsense, and its nonsensical character is exposed by the pathetic efforts to inflate the significance of petty tyrants by representing them as “Hitlers” or “Stalins.” Moreover, the arithmetic of this kind of aggregation is specious: a bad head cold plus a skin rash plus a slipped disk does not add up to a brain tumor.
Second, it is argued that as the world is intrinsically an incredibly dangerous place (for this purpose, anti-realists suddenly adopt the most extreme form of Hobbesian realism), the United States must involve itself everywhere and always, in every issue large and small, because, to quote Robert Kagan, otherwise “[t]here is no certainty that we can correctly distinguish between high-stake issues and small-stake issues in time to sound the alarm.” Thus a policy of relentless, universal, and perpetual busyness is advocated because of a complete lack of faith in the ability of the American government and people to exercise any judgment, foresight, or discrimination.
Third, there is the argument advanced by Podhoretz at the end of his article—that “we are in a situation resembling the one that developed after the end of World War I,” when the withdrawal of the United States helped create a power vacuum in which totalitarian systems were able to thrive. Such a mistake, it is insisted, must not be made again. As an argument against isolationism of the kind that prevailed in the United States between the wars, this is irrefutable. As an argument against a policy of discrimination and prudence that advocates stopping short of democratic crusades and hyperactive hegemonism, it is irrelevant. Indeed, the amount of attention Podhoretz devotes to isolationism strikes me as extravagant. Isolationism is surely a red herring, for no serious participant in the conservative foreign-policy debate advocates it. The real arguments take place in that broad territory that lies between the extremes of gung-ho crusading and isolationism.
In the case of NATO, for example, what is at issue is not the alliance’s continued existence, or U.S. membership in it, but its rapid and extensive expansion. Looking back at the 1970’s, Podhoretz makes the point that he disagreed with the Kissingerian policy of working with China to counter Russia because it confused the central issue: that the enemy was not Russia but Communism. Indeed, as he reminds us, far from Russians as a nation being the enemy, they were the first victims of Communism. That is certainly the way I saw things, and that is one principal reason why I now oppose the expansion of NATO eastward, at a time when Russia is struggling, however imperfectly, to find its feet as a democracy. But why, in terms of his own distinction between Communism (enemy) and Russians (victims), does Podhoretz support that expansion?
The reasons I urge restraint and prudence on the United States have nothing to do with isolationism. They are threefold. First, I have serious doubts concerning the willingness of the American people to sustain the level of engagement now being advocated vigorously by some of my conservative colleagues.
Second, I believe that, in any case, many of the goals being urged upon us to justify the extravagant commitment of resources and prestige are not sensible or sustainable ones. I am not an admirer of humanitarian wars; policies that result in semi-permanent American garrisons in the Balkans strike me as ill-considered; and a country that has signally failed to establish democracy in Haiti, after years of trying, should surely consider a more modest approach to democratic crusading.
Third, and most compelling, I advocate restraint because every dominant power in the last four centuries that has not practiced it—that has been excessively intrusive and demanding—has ultimately been confronted by a hostile coalition of other powers. Americans may believe that their country, being exceptional, need have no worries in this respect. I do not agree. It is not what Americans think of the United States but what others think of it that will decide the matter.
Imagine a wealthy businessman who has been fending off his main competitor for several decades. He has endured family feuds as well as industrial espionage, all the while managing to create a thriving enterprise. Suddenly his rival is revealed to have completely bungled his own affairs and is forced to plead bankruptcy. Overnight, our businessman goes from being merely wealthy to being a Croesus, able to buy, hire, and fire at will. But how do the business journals and the professors of management advise him to behave? You’ve never been in greater peril, they caution: sell off your assets, lest you provoke a coalition against you; better to run the risk of penury than to overextend yourself.
Does this sound ludicrous? It is essentially what newly revived isolationists have been counseling ever since the collapse of the Soviet Union. To be fair, the Nation has been preaching a similar dogma since Vietnam, but now elements on the Right have adopted it as well. In fact, the single most important development in the post-cold-war era may be the alacrity with which some leading conservatives have embraced a leftist set of beliefs that had seemed to sputter into insignificance with the disappearance of the Soviet Union. From Trent Lott declaring it was time to give peace a chance in Kosovo, to Charles Krauthammer sounding a Walter Lippmann-like warning about our commitments being out of balance with our means, to the crackpot fringes of Buchananism, isolationism, often in the gilded form of “realism,” has made a comeback.
In his penetrating essay, Norman Podhoretz provides the historical context that has been missing in most discussions of where American foreign policy is headed. One of his most important points is that the activist party in the 20th century has been not the GOP but the Democrats. The cold war, which really began not in 1945 but with the Bolshevik seizure of power in 1917, was finally joined in earnest by the Truman administration, even as Robert Taft & Co. on the Right, echoing Henry Wallace on the Left, called for the U.S. to stay out of Europe. Did some of the Republican zeal for prosecuting the cold war in the 1950’s derive from a realization that the GOP had been tardy in its response to the Communist threat, and that now it either had to outflank the Democrats or relegate itself to the sidelines?
As Podhoretz observes, Democratic ardor for confronting Communism did not begin to wane until Vietnam, when academics, journalists, and politicians seemed to perform ideological somersaults. But perhaps Vietnam was the aberration, albeit one with a profound effect on American elites, and perhaps the demise of the Soviet Union, coupled with Yugoslav ethnic cleansing, has jolted mainstream Democrats back to reality. Maybe the Republicans, as William Kristol and Robert Kagan have argued, are in danger of forfeiting their dominance in foreign policy, while the Democrats charge ahead, unafraid to challenge dictators and stand up for human rights.
Not exactly. True, the record of the Clinton administration in exercising American power abroad has not been unimpressive on its face. As Podhoretz notes, Clinton stared down the Haitian junta, sent destroyers off the Taiwanese coast, bombed in Afghanistan and Sudan, carried out an air war against the Serbs in Kosovo, and continues to conduct attacks against Iraq. To critics like Charles Krauthammer, this represents not only dangerous overstretch but most of all a quixotic commitment to justice and human rights. Yet if this is what it takes to get Democrats to act, then I am all for it.
The truth is that the realists vastly overblow the potential perils to the U.S. of intervention abroad. The American public does not. As a study carried out by the Triangle Institute and excerpted in the Washington Post reveals, the American public can distinguish between suffering casualties and suffering defeat, and what most Americans are interested in is victory. Still, the timorousness with which the Clinton administration has approached its foreign ventures, backing into rather than decisively entering them, does not inspire much confidence that any future Democratic administration will take a tougher stance toward dictators around the globe, particularly in China.
But neither would a George W. Bush-led administration be much different. Indeed, given the coloration of Bush’s present foreign-policy advisers, it might be even more cautious than a Democratic one. (Though it is often claimed that academia has no influence on the real world, academic models of realism have in fact had a profound effect on American foreign policy, whether we are speaking of Henry Kissinger’s conception of détente and triangular diplomacy or the desire of the elder Bush’s administration to preserve a supposed balance of power in the Middle East by leaving Saddam Hussein in the saddle.) The only consolation offered by the prospect of a second Bush administration is that it would at least not succumb to outright isolationism.
If, however, Bush (or John McCain, should he become the Republican nominee) were to lose the election, then I fear isolationism might well take hold in the GOP. And here is where Podhoretz may have pulled his punches. Despite Bill Clinton’s manifold imperfections, which hardly need to be enumerated, he of all people is closer to Podhoretz’s robust conception of American foreign policy than is much of today’s GOP. The virulence with which the Kosovo war was greeted among people like Jack Kemp amounted in my view to downright anti-Americanism. At a time when the United States has never been more flush economically, the claim of the realists that we are about to overextend ourselves is as fanciful as comparable warnings sounded by liberal doves in the 1980’s.
The new realists have it backward. America is not overcommitted. It is not committed enough.
Early in his outstanding essay, Norman Podhoretz expresses some amazement at the “unprecedented” reversals of the foreign-policy views expressed by both Left and Right over the past decade. Actually, such reversals are part of a long and honorable American tradition.
In the first two decades of the American republic, Jeffersonian Republicans and Hamiltonian Federalists altered their positions on foreign and defense policy rather dramatically, depending on which side was in power. (Jefferson, for instance, wanted to build a strong navy in the 1780’s, only to oppose a naval buildup when the Federalists held the White House in the 1790’s; Federalists returned the favor by opposing the Louisiana Purchase, thus reversing their own previously held position.) In the early part of this century, erstwhile Republican internationalists like Henry Cabot Lodge delivered their party to isolationists like William Borah in a fit of (perhaps justifiable) pique at Woodrow Wilson, while the previously pacifist and isolationist Democratic party of William Jennings Bryan rallied to Wilson’s internationalism.
There has been much serious and interesting debate over foreign policy among conservatives in recent years. But let us not pretend that this debate has been unaffected by political developments. For all the high-minded theoretical disputes that have filled our intellectual journals, the realist or neoisolationist tendencies of most conservatives (Patrick J. Buchanan notwithstanding) can be traced not to 1991 and the breakup of the Soviet Union—not, in other words, to an earnest attempt to grapple with the new realities of the post-Soviet strategic environment—but to the election of Bill Clinton.
Why, for instance, was the term “international social work,” coined by a disgruntled former Clinton adviser, applied by Republicans only to interventions that occurred after 1993? Perhaps the purest form of “humanitarian” action ever conducted by the United States was George Bush’s intervention in Somalia in 1992 to stop a politically induced famine, but I do not recall much conservative outrage over that deployment of American troops on a humanitarian mission. A couple of years earlier, Bush had sent thousands of American soldiers to Panama to remove a thug from power and to establish a more functional democracy there. (Contrary to Republican revisionist history, the invasion of Panama was not aimed at protecting the Canal.) Panama was, in essence, Bush’s Haiti. Where were Republican criticisms then of our reckless adventures in “nation-building” and “democracy-promotion”? If a Republican President had taken the United States to war in Kosovo last year, I believe a majority of Republicans and conservatives would have supported him—and supported not just that specific intervention but the broad rationale for such interventions.
Now, there is much reason to prefer almost any Republican’s interventions to Bill Clinton’s, just as there was much reason to prefer Hamilton’s navy to Jefferson’s. But is the distinction really a matter of principle? Where many conservatives have erred these past eight years has been in elevating their justifiable mistrust of Clinton’s leadership in specific cases to the level of a general theory about how America ought to conduct its foreign policy.
Instead of opposing Clinton’s internationalism, such conservatives have opposed internationalism. Instead of opposing Clinton’s ineffective methods of intervention, they have opposed intervention. They have erected what they insist are enduring principles of American foreign policy—no “humanitarianism,” no “nation-building,” no exporting of democracy—for the purpose of indicting Clinton. The price of using doctrinal elephant guns to shoot a political flea is that conservatives have driven themselves into a neo-isolationist corner where they have no business being.
This may sound a bit cynical, but actually it is good news. Such politically driven “reassessments” of American foreign policy are easily reversed. If George W. Bush or John McCain occupies the White House next year, we will find a majority of Republicans and conservatives largely cured of their neo-isolationism and even, I imagine, of their parsimonious “realism.” (By the same token, I suspect a Republican President in 2001—at least, a Republican President of internationalist disposition like Bush or McCain—will cure most Democrats and liberals of their current internationalism.) Anyone who watched or read Bush’s speech at the Reagan Library in November knows that he left much room for American intervention in future Bosnias and Kosovos. It was the episodic, inconstant, and ineffective nature of Clinton’s interventions that Bush criticized, not the appropriateness of the interventions themselves.
I do not wish to make light of the arguments advanced by our most astute and articulate realists. There has been and remains a legitimate theoretical debate over the principles that should guide American foreign policy in the post-cold-war era, over whether the United States should intervene abroad more or less frequently, and for what reasons. But I would suggest that the real test for conservative realists of the 1990’s vintage will come after January 2001, when a Bush or McCain leads the country to war in some less-than-central part of the world out of the same uncertain mixture of principle and interest that led his predecessors into Panama, Somalia, Haiti, Bosnia, and Kosovo.
Then it will be clearer if there has been a genuine shift in conservative opinion on foreign policy, or whether what we have witnessed over the past eight years has been just another spin of the electoral wheel. Maybe, with a Republican in office, conservatives and liberals alike will return to their respective beds and leave the company of strangers.
Norman Podhoretz describes the writings of recent commentators on foreign policy as marked by contradictions and inconsistencies, in contrast with the more cohesive postures taken during the cold war. Podhoretz is puzzled by this development. I am not. The fundamental reason is that nothing has jelled to replace the central organizing principle of the cold-war era—essentially a strategy of opposition to the Soviet Union.
The reason a grand strategy has not emerged is that we do not face a global adversary who threatens our very existence, and we therefore have the luxury of not being forced to choose among alternative designs. It is also true that, given the absence of a clear enemy, our political culture is disinclined toward any such designs. Although strong presidential leadership would have overcome this constraint, President Clinton has not provided it. In any event, the absence of an agreed-upon framework has placed the U.S. in a largely reactive mode, and this is what explains the inconsistencies and contradictions Podhoretz captures.
What might such a framework look like? In essence, three realistic alternatives are available: to give up leadership by facilitating the rise of a multipolar system; to embrace isolationism and focus on promoting prosperity at home; or to consolidate American preeminence by precluding the rise either of a global rival or of a multipolar system. In various writings since 1992, I have favored the last as the best long-term guiding principle for our national security.
The United States today is the most powerful state in the international system. Although preserving this position is not an end in itself, a world in which the U.S. continues to be the preeminent power will be more receptive to democracy, free markets, and the rule of law, and also will have a better chance of avoiding another global cold or hot war.
Three sets of interrelated factors could pose threats to this design. First, if one or more major countries became more powerful and challenged the United States. Second, if there were to be a substantial relative decline in U.S. economic and military power brought about, for example, by overextension. Third, if a hostile power were to gain control of a critical region.
Let me focus here on the first set of factors, with a comment about overextension. At present, our potential rivals are either too weak or are already our allies. In the near term, this situation is unlikely to change significantly, but for the longer term we need to maintain our alliances by focusing on current and emerging threats to joint interests. In Europe this means stabilizing the new democracies in Eastern Europe; increasing our common ability to deal with challenges from the South—threats to the energy supply, terrorism, weapons proliferation, and religious or political extremism; and hedging against uncertainties in Russia even as we encourage the rule of law, market reform, and democratization there. Together we must also incrementally extend the zone of democracy, peace, and prosperity.
In Asia, our alliances have not adapted to the changing environment. In addition to the threat from North Korea, the U.S. and its allies face the risks of Balkanization in Southeast Asia and the long-term possibility that China might seek regional hegemony. How China evolves will have the greatest impact of all.
Given the inherent uncertainties in China, a pure engagement strategy that seeks expanded relations in the hope of positively influencing Chinese policy or changing China into a friendly democratic power seems to me quite risky. Such a strategy, by helping China to develop economically and technologically, can create the basis for future strength; and if the assumption about democratization proves incorrect, it will also have helped China become a more threatening regional—and perhaps global—rival.
But a pure containment strategy is also unwise. Fatalistically assuming that China is bound to be an adversary overlooks the possibility of domestic change and of a positive evolution in our relations.
Instead of pure engagement or containment, what seems to me appropriate is “congagement”—a strategy somewhere between the two with elements of both. Under such a policy, we would continue to enhance economic, political, and cultural ties with China, but we would be less solicitous of Chinese sensitivities on issues like human rights. By tightening our export controls, we would do nothing directly to help increase Chinese military capability. We would also seek to strengthen relations among states that could form the core of an alliance against China should it push for regional hegemony, and likewise strengthen our own security relations with these countries. On Taiwan, we would preserve and stabilize the status quo for as long as China’s future remains uncertain. Through these measures, and by strengthening our own military posture in Southeast Asia, including, in the long term, establishing a military base there, we would point out to China the costs of turning hostile.
A strategy of precluding the rise of global rivals will not succeed unless we maintain our military and economic strength. But one major risk we face is overextension, a mistake made by some great powers in the past. Given the absence of a systemic rivalry, the U.S. can be quite selective in its military involvement, but during the past several years we have not been selective enough. Unless recent trends are reversed, we will either have to fund a much larger armed force or erode our capability of dealing with threats to critical interests, such as in the Persian Gulf. Emphasizing greater selectivity does not mean indifference to humanitarian interests in places like Bosnia; but in such situations we should consider other options, including arming and training the victims of aggression.
Since the end of the cold war, the idea has gained ground that the world is now more uncertain. That is only partially correct. In the past the enemy was known, but it was not easy to predict either his behavior—“Kremlinology” was an almost mystical science—or other threatening developments around the world. We were, however, relatively certain of our overall objectives and the priorities among them. That is no longer so; it should be again.
Jeane J. Kirkpatrick
James Q. Wilson has observed that “elite beliefs are probably more important in explaining foreign-policy decisions than in accounting for decisions in other policy areas.” He is right as usual. Elites interpret our problems and our duty. They even try to tell us who we are and why it matters. When their views change, we know our culture has changed.
The views of our political elites seem to have changed dramatically since the end of the cold war, but the extent and character of the change have not been adequately explored. Like an earthquake of gigantic proportions, the collapse of Communism shook all manner of political, strategic, and metaphysical certainties in the West from their foundations, leaving questions and problems where dogma and habit had ruled. It is therefore a good thing that COMMENTARY is sponsoring this discussion.
On no issue is the change so marked as the use of force. When the Reagan administration offered arms and training to the Nicaraguan resistance, law professors all over America pronounced the policy illegal and utterly rejected the argument that force could under certain circumstances be used to restore or protect democracies. As late as 1989, when the U.S. intervened in Panama under George Bush, leading liberals in and out of Congress described this action as the clearest possible violation of the UN Charter’s prohibition on force, and professors of international law reminded all and sundry that the use of force against another state is never justified except in self-defense, a concept that was itself to be very narrowly interpreted.
These extremely negative attitudes also colored the liberal response to Saddam Hussein’s invasion of Kuwait, even after the UN Security Council authorized force against him. And when, Security Council resolutions in hand, George Bush sought authorization from the U.S. Congress—as required by the U.S. Constitution—the outcome of the Senate vote remained in doubt to the very end, even though Iraq’s invasion of Kuwait across an international border was as clear a case of aggression as could be imagined. The vocal opposition of leading Democrats to the U.S. role in the Gulf was finally silenced only by enthusiastic popular support for the performance of American armed forces and high-tech weapons.
But then came the great change. By the time they reentered government with Bill Clinton, a generation of 60’s liberals had rethought their views on force and intervention. With the definitive end of the cold war and the arrival in power of a new foreign-policy team, liberal isolationism was—almost overnight—replaced by a new doctrine of global engagement. Hostility to the use of force gave way to doctrines of “peacekeeping,” “democracy-building,” and “nation-building.” Over the past few years, force has been justified in the pursuit of diverse causes: separating parties to a conflict, disarming warlords in remote countries, “restoring democracy” (Haiti), “containing” conflict (Macedonia), providing “advisers” (Bosnia), and delivering Kosovo from ethnic cleansing.
And yet—not everything has changed after all. I believe the most important issues confronting America today are those of identity and survival. Because the United States is the strongest country in the world, more than a few foreign governments and their leaders, and more than a few activists here at home, seek to constrain and control American power by means of elaborate multilateral processes, global arrangements, and UN treaties that limit our capacity both to govern ourselves and to act abroad. Moreover, although most of the policy positions developed in the long bipolar competition with the Soviet Union are now obsolete, some—as it happens, the ones touching especially on the twin cardinal issues of our identity and our survival—are still preserved in the Washington policy community and among the intellectual elites like beetles in amber.
Missile defense is the most salient issue here. No hostile power comparable in size or strength to the Soviet Union exists today, but several dictatorships of violent tendency and hostile intent—Iraq, Iran, North Korea, Syria, Libya—are working hard to acquire ballistic missiles and weapons of mass destruction. North Korea has demonstrated its potential to cross the ICBM threshold. China must also be considered. But Democrats still oppose an effective missile-defense system to protect America and its allies.
From the time the Soviet Union developed the capacity to reach the United States with its nuclear weapons until today, leading liberals and Democrats have taken the view that there should be no defense against such an attack. In the past, they supported a strategy of Mutual Assured Destruction that left everyone vulnerable. In this mutual vulnerability they paradoxically saw common security. That common security, they still argue, is what will be destroyed if one side develops defenses against a nuclear attack by the other. That was the rationale behind the 1972 Antiballistic Missile treaty with the USSR, which was intended to prevent the “destabilizing” development of an effective defense, and it is the rationale we remain wedded to today.
True, some liberals and Democrats are quietly altering their stand on missile defense, but they still basically prefer to reinforce the balance of terror in the world by means of agreements to limit offensive weapons of all types, chemical and biological as well as nuclear. The problem with this arms-control approach to security is that the same governments against which we most need to protect ourselves are those most likely to violate the treaties—as the Soviet Union violated the ABM treaty, as Iraq and North Korea have violated the Nuclear Nonproliferation treaty, and so forth. The fact that Iraq and North Korea, India and Pakistan have ignored the “nonproliferation regime” in which we so fervently believe has had, thus far, remarkably little effect on the debate. Nor has the reality of our ongoing vulnerability penetrated the consciousness of many of our foreign-policy intellectuals. Perhaps we need to discuss that, too.
The principal practical disagreement between the “globalist” and “realist” schools of foreign policy concerns what Eliot A. Cohen calls “imperial policing”—missions involving “ill-defined and usually secondary or tertiary national interests.”
Realists believe that superpowers don’t do windows. Globalists counter first by citing American credibility: if we do not act to stop the war in Bosnia, or sovereign savagery against a rebellious province in Kosovo, others will be emboldened to do as the Serbs have done. The Russians, for example.
Well, we did intervene in Bosnia, and in Kosovo, and the Russians then blithely ignored our example and ravaged Chechnya. So much for deterrence.
The other argument is psychological: we should exercise our power because we have it. If we allow ourselves to turn away from relatively minor conflicts, then, when the real test comes, we will be flabby and unprepared. Shirking becomes a habit, then a policy, then an ideology. A policy of national greatness requires the continual assertion of that greatness. We have the power. Use it or lose it.
The answer to this kind of psychological theory of interventionism has been best given by Francis Fukuyama. He makes the point that “the much greater danger is use it and lose it. That is: intervening for secondary and poorly-thought-out objectives and therefore wasting the political capital we have for large interventions later. We cannot work our way up to [large serious interventions] by swatting flies in anticipation.” The worst thing that we can do in the name of keeping ourselves globally engaged is to squander our strength and political capital on tasks as remote from the American national interest as reconstructing Somalia, democratizing Haiti, and pacifying the Balkans.
The only sound theory of global engagement for the post-cold-war world is a dry-powder theory of intervention. In a country with strong isolationist tendencies, you do not squander blood and treasure on teacup wars.
This is an anti-isolationist theory of (relative) noninterventionism, or, as I would prefer, of prudent and selective intervention. Why? Because one needs to preserve one’s strength for major exertions. In an era of relative quiet, you do not run around putting out small fires just because they are the only ones burning. You save your resources for the real strategic threats.
What are they?
First, containing, deterring, and, if necessary, disarming rogue states that are acquiring weapons of mass destruction, states that could threaten with unprecedented power not only our allies and our troops abroad, but eventually America itself.
Second, containing a rising China, a country whose position on the globe at the turn of the 21st century is comparable to that of Germany at the turn of the 20th—a large, growing, former have-not, seeking its place in the sun, pushing inexorably against its neighbors.
Third, maintaining vigilance against the possibility of a resurgent, revanchist Russia.
Fourth, maintaining order as the ultimate guarantor of international peace and stability. As the only nation that can project power anywhere in the world decisively and overwhelmingly, our role is to husband our resources to meet supraregional challenges—i.e., those that threaten not just a country or a region but the stability of the international system itself.
A prime example is the Gulf war. We went to war over Kuwait not because we were opposing aggression in the abstract but because we were trying to prevent a radical enemy from gaining control of the greatest oil reservoir on the planet and from feeding its appetite for weapons of mass destruction. This was no teacup. Saddam threatened to become a supraregional threat. It was therefore the role—the unique role—of the United States to intervene to slap him down. No one else could. That is our job.
In many ways, the strategic role of the United States is comparable to the role practiced classically by Britain. Britain was the balancer of last resort in Europe. The United States is the balancer of last resort in the world. We are needed to balance otherwise unbalanceable rogue states like North Korea and Iraq; to shore up the periphery against an expanding China; to guard against Russia until its destiny is settled. This role requires huge resources to maintain the forces that will stand ready to thwart those threats. And these are the resources that are being stretched and squandered on humanitarian missions best left to Sweden.
Is imperial policing not stabilizing? In some limited sense, yes. Any intervention is presumably triggered by some existing instability that could be at least temporarily suppressed by policing. Unfortunately, however, the stabilization is likely to be temporary. Our efforts in Somalia and Haiti have been written on water; we leave those places much as they were when we came. And the reason we are stuck in Bosnia and Kosovo is precisely the same: we know that if we leave, the deluge returns.
Moreover, the harm done by these minor interventions to the stability of the larger international system can be great. Note only the damage that Kosovo caused to American relations with Russia, China, and Greece. Much higher still is the long-term cost to the United States of these discretionary wars, the distraction from our primary and unique mission and the drain on the military, diplomatic, and domestic political resources necessary to sustain that mission.
The essence of the neorealist position is this: we shall have no lack of challenges. During the relative quiescence that has followed victory in the cold war, we should not dissipate our energies looking for trouble where it matters little. Trouble will soon come looking for us—from rising powers, from regional conflict in a place we may not even anticipate, and from the spread of weapons of mass destruction to outlaw states. We had better gird ourselves for those threats with our powder dry.
Do strange bedfellows make the best bedfellows? I really wouldn’t know. So far as I can tell, it has been pretty much the same old band of monogamous bedfellows hanging around the neo-Reaganite camp for the last few years. Perhaps that will change when prospective recruits learn from Norman Podhoretz how “ardent” Robert Kagan and I have been in advancing our cause; we can only hope.
Kagan and I have argued a very simple proposition: U.S. foreign policy was successful in the 1980’s because it was militarily strong, strategically robust, and morally assertive; and it should continue to be all of these things in the post-cold-war world.
To two groups of conservative critics, this proposition seems not just simple but simpleminded. The first group, who range from Charles Krauthammer to Owen Harries to Patrick J. Buchanan, could be called cold-war exceptionalists. Despite the considerable differences among them, all regard the cold war as a strange interlude for the U.S., requiring extraordinary foreign-policy measures and ambitions. Krauthammer, Harries, and Buchanan supported a vigorous political and ideological prosecution of the cold war, and do not (on the whole) regret having done so. But now they want a return to normalcy—to “a republic, not an empire” in the case of Buchanan, to a foreign policy based not on any form of “idealism” but on a modest (Harries) or a hard-headed (Krauthammer) view of our vital national interests.
In their focus on national interest, the cold-war exceptionalists find themselves in bed with the conservative realists, led by the redoubtable Henry Kissinger. The cold-war exceptionalists were willing to countenance “Wilsonian” means to fight Soviet Communism; not so the realists, who were fighting Russia, not the Soviet Union, all along. Indeed, the realists expended a lot of effort trying to ensure that Wilsonian rhetoric never corrupted their hard-headed policy prescriptions. Now that the cold war is over, the realists have redoubled their effort to extirpate America’s Wilsonian impulse.
So, adding liberals to the picture, one can group today’s foreign-policy debaters into three pretty distinct bedrooms: America as first among the United Nations (the Clintonians), America as a normal country (the conservative realists, joined by the cold-war exceptionalists), and America as an exceptional nation and world leader (the neo-Reaganites). We neo-Reaganites divide our time between explaining that there is a fundamental difference between us and the true Wilsonians—between, that is, the muscular patriotism of Teddy Roosevelt and Ronald Reagan and the Utopian multilateralism of Woodrow Wilson and Bill Clinton—and battling the realists over the future of the Republican party.
The two fights tend to merge. While in theory one might think the realists and the Clintonians far apart, in practice both tend to default to a foreign policy dominated by commercial interests. The realists can never really persuade Americans to follow the lead of Metternich, and the Clintonians can never really persuade us to defer to Kofi Annan. So they agree that we should trade with everyone and hope (in either a realistic or an idealistic way) that commerce will lead to peace and will minimize the challenges we face around the world.
Against this inclination to reduce the business of America to business, we neo-Reaganites try to make the case for freedom and greatness. We could use a few more bedfellows in that endeavor.
It is America’s inescapable mission to fight for the spread of democracy, and therefore against tyranny. This is so because of who we are: the most successful democratic society in history. Our specific policies are usually beside the point; tyrants hate us because their legitimacy is undermined by our very existence. Hitler, Lenin, Stalin, and Stalin’s heirs knew this, and so did the Japanese warlords and Saddam Hussein. The Chinese tyrants and Slobodan Milosevic know it today. Oppressed people everywhere care so deeply about the fortunes of the United States because they know that if we fall, they are doomed.
Whenever I hear policy-makers talk about the wonders of “stability,” I get the heebie-jeebies. That is for tired old Europeans and nervous Asians, not for us. In just about everything we do, from business and technology to cinema and waging war, we are the most revolutionary force on earth. We are not going to fight foreign wars or send our money overseas merely to defend the status quo; we must have a suitably glorious objective. We are therefore not going to stick by a government that conducts foreign policy on the basis of Realpolitik. Without a mission, it is only a matter of time before public opinion will turn against any American administration that acts like an old-fashioned European nation-state. Just ask Henry Kissinger.
That is why I find the realist position highly unrealistic. The only truly realistic American foreign policy is an ideological one that seeks to advance the democratic revolution wherever and whenever possible. I was sickened by the Bush administration’s failure to celebrate our victory in the cold war, and by the grotesque spectacle of President Bush asking Russians and Ukrainians to support a Communist regime instead of seeking freedom. I was discouraged when we failed to pursue the Gulf war to the logical and necessary conclusion of removing Saddam Hussein’s murderous regime, just as in the 1970’s I had been dismayed at our policy of détente with the Soviet Union. In like manner, I have been appalled by the Clinton administration’s appeasement of China’s Communist dictators, and by its refusal to use military power to destroy the Milosevic regime in Serbia.
That said, I often agree with the realists on questions of tactics. I was opposed to the bombing in Kosovo because I did not believe the Clinton administration was capable of using military power correctly (that is, to bring down the Milosevic regime and replace it with something more civilized). I was in favor of using military force early in the Bosnian conflict, and I found the various agreements and the ultimate deployment of “peacekeepers” to be unworthy of the world’s lone superpower. These were stopgap measures, not serious policies. Add to them Clinton’s shameful betrayal of the democratic forces in Iraq, and you have a dangerous message being sent to our enemies: America can be challenged, and you will live to fight another day.
China is the true litmus test for policy-makers and thinkers, and I was therefore disappointed that Norman Podhoretz dealt with it only briefly. None of us is smart enough to know with certainty what China is going to look like ten or even five years from now. Yes, it could effect a transition to democratic capitalism and become our greatest Asian ally. But it could also become a very nasty tyrannical enemy. And of course it could explode or implode; surely China’s problems are grave enough to produce a massive convulsion. In any case, we must do everything in our power to ensure that China does not have enough military power to destroy us.
But that is precisely what Bill Clinton and his henchmen have not done. With a singlemindedness that will astonish future historians, this administration has sold the Chinese our finest military technology, often at bargain-basement prices. We have sold them more supercomputers—the central nervous system of modern warfare—than we possess in our own military and intelligence agencies combined. We have sold them nuclear-modeling software, missile-guidance systems, MIRV technology, underwater sensors, fiber-optic cable, the technology for global positioning systems, and multi-axis machine tools to produce the special “skin” for advanced aircraft and cruise missiles.
This is the greatest crime committed by the Clinton administration, in cahoots with Congress, with a flock of distinguished realists who assure us there is nothing to worry about, and with a popular press that can only focus on one scandal at a time. If we get a real change in the executive branch next year, the new President will have to order a detailed damage assessment to figure out what sort of Chinese threat we may conceivably face, and how much time we have to field an army that can beat it.
It is hard to be optimistic, because the pattern of American interventionism in this century is remarkably consistent: we have not initiated action against major powers, but rather waited for them to attack us. We are never ready for the next war, and believing, against all the historical evidence, that peace is the normal condition of mankind, we demobilize after every victory.
If we were serious about foreign policy, we would never have dismantled the export-control system that deprived the Soviet Union of advanced military technology. Instead, we have unilaterally junked the system and armed China. If we were serious, and true to our national mission, we would encourage the democratic forces in China, and we would clearly support the democratic government of Taiwan. Instead, we find excuses for Chinese domestic repression, and waffle on Taiwan. In short, although we certainly have the wherewithal to advance the cause of the democratic revolution, it remains to be seen whether we have the wisdom, and the courage, to act.
Edward N. Luttwak
Norman Podhoretz is a good and wise man, whose well-known pugnacity derives not from mere temperament but rather from the intensity of his beliefs, most of which I share. Always guided by solid common sense in addition to intellect, he has long been the straight arrow in America’s discourse on foreign affairs, never captive to the modish temptations that many other intellectuals have failed to resist.
My business, however, is strategy—the realm of paradox, irony, and contradiction, in which nothing is solid and nothing is straight, because the presence of reacting adversaries confounds every straightforward proposition. Because Norman Podhoretz always made the right choices during the cold-war years in which I knew him (after 1972), I assumed that he understood strategy as well as so much else. For example, he was unimpressed by the widely influential contention that nuclear superiority was useless merely because none of its possible manifestations (multiplicity of warheads, expected accuracies, energy yields) could be exploited operationally in realistic scenarios. That the measures in question were purely theoretical meant nothing: any balance of power is a matter of perceptions unless and until war breaks out, when quite other capabilities come into effect. Podhoretz understood that, just as he understood that the "arms race" was not to be deplored or actually limited, for it served to keep the peace, by nullifying optimistic war plans and by venting pressures that might otherwise induce far more dangerous land-grabs.
But as soon as the cold war was over, Podhoretz reverted to commonsense logic, i.e., unstrategic thinking, while I remained wedded to strategy, and thus we parted company. When Kuwait was invaded, he saw the opportunity to destroy Saddam Hussein’s vicious regime, and apparently did not ask himself what would happen in the aftermath. I instead saw the opportunity to use Saddam Hussein’s highly circumscribed power to good effect, as well as the disadvantages of defeating him too well. If you have an especially horrible enemy, destroy him by all means if you can, but only if you have no other enemies and rivals in the region and the world. But if you do have other acknowledged or insidious enemies and rivals (the normal predicament), it is usually advantageous to do no more than contain the horrible enemy and leave him strong, for he is horrible to others as well—who are therefore forced into cooperation with you, or even subservience—whereas destroying his power leaves others only slightly less dangerous free to inflict harm and releases the arrogance of former dependents, who may make all sorts of demands instead of praying for the continuation of their own protection.
While employed in the Pentagon as a target-selection consultant before and during the Gulf war—an essentially operational-level function at best—I did not stop thinking about the direct, indirect, and second-order repercussions of different outcomes at the level of grand strategy. For example, if the supply of Gulf oil as a whole should become nullified by blockade and reciprocal attacks, the United States, with oil now at $100 a barrel, would have had to become self-sufficient in energy, incidentally generating much high-wage employment to produce the needed coal, shale oil, etc.; Mexico, Venezuela, and indeed much of Latin America would have gained much in oil revenues, and so would the Russian Federation, whose democratic evolution would have been greatly favored thereby. It was chiefly the commercial rivals of the United States in Europe and the Far East that would have been the losers—all very prosperous countries with elastic economies that could easily afford the costs of oil substitution.
There were many other second-order repercussions to be calculated as well, including the end of Saudi oil revenues and with it the flow of money to Islamists everywhere. In the Arab-Israeli context, Yasir Arafat and King Hussein of Jordan would have been locked into place as Saddam Hussein’s allies instead of becoming claimants of American benevolence. Nor were my speculations dominated by the fear that Iraq might unveil competent strategic forces; the chaotic profusion of its efforts that seemed sinister to others I interpreted as evidence of massive incompetence. Just before the war, I wrote in the New York Times that even if the Iraqis launched every ballistic missile in their arsenal at Israel, there would be fewer than 50 fatalities at most, and not much destruction; in the event, the Iraqis employed only a part of their inventory against Israel, and the only fatalities were caused by the decision to distribute gas masks to the entire population, including people with impaired breathing, while most of the damage to housing was caused by the debris of Patriot missiles falling back to earth. (That, incidentally, reflects one more paradoxical rule of strategy: no defense can be cost-effective against an ineffectual threat.)
Today, at the level of grand strategy, Norman Podhoretz wants to make the United States stronger by achieving a greater degree of coherence in its foreign policy, and by spending more on the armed forces. I fully agree that it is chiefly the incoherence of its foreign policy that diminishes the perceived power of the United States. But I believe that American interests are well served thereby.
It was an accident of history that left the United States as the world’s only superpower. If its potential economic leverage (now mostly hijacked by commercial interests), cultural influence, and military superiority were employed coherently, in a disciplined, power-maximizing way, its weight in world affairs would be such that all political entities desirous of retaining their independence would coalesce against it, to oppose, resist, sabotage, and undermine all its initiatives (as the French already do), finally diminishing net U.S. power below present levels.
Since I am not a public official, it is not my duty to explain my particular applications of the paradoxical logic of strategy to all and sundry, but I had entertained the hope that Norman Podhoretz would understand, just as during the cold war he understood the usefulness of unused weapons and the beneficial role of the arms race.
Walter A. McDougall
Although I, too, cherish the memory of those heady days when we all fought the good fight together, Norman Podhoretz’s reading of the recent past cannot go unchallenged.
He begins by assuming that “historians have never stopped quarreling” over whether the United States has been or is by nature “isolationist.” In fact, few serious diplomatic historians would describe America’s relationship to the world as isolationist in any era, and none would do so without carefully defining the term. Unfortunately, Podhoretz forces the reader to tease out what he means by this dirty word, and ends up committing the same misdeed for which he faults Bill Clinton: reckless and promiscuous use of the term in such varied contexts that it ends up meaning nothing, or whatever his polemics need it to mean.
Thus, Podhoretz dismisses the Eisenhower administration—which made alliances all over the world, threatened aggressors with “massive retaliation,” lost no ground to known Communists (Castro hid his allegiance), and kept the peace—as all talk and no action, while implicitly praising the Democrats of that era for their bellicosity. Then, when Democrats turn dovish because of the Vietnam war, it comes as a “shock.”
What Podhoretz does not grasp is that the crusading ethos that intoxicated Democrats from Woodrow Wilson to Lyndon Johnson is the flip side of isolationist moral disarmament, as opposed to the businesslike internationalism of Republicans from Theodore Roosevelt and Charles Evans Hughes to Dwight D. Eisenhower and Richard Nixon. Thus, as soon as the Democrats lost faith in their own moral superiority, they lurched to the other, “Come Home, America,” extreme.
But instead of concluding that the real lesson of 1968 is that isolationism is a close cousin of self-righteous moralism, Podhoretz links it instead to pacifism. This false linkage allows him not only to condemn the McGovernites but to describe the efforts of Nixon and Kissinger to extricate America from Vietnam as reluctant bows to the new isolationism, culminating in their belief “that military intervention in analogous conflicts was now politically out of the question.” That would have come as news to the North Vietnamese under American B-52’s in 1972, not to mention the Israelis in 1973, for whom Nixon was willing to go on nuclear alert. For that matter, Jimmy Carter’s emphasis on human rights and third-world problems was no ostrich strategy, either.
It was Reagan, of course, who turned us around, but Podhoretz does not understand how. He calls it a “great irony” that Reagan wanted to abolish nuclear weapons, and an “even greater irony” that his “nuclear pacifism” contributed to the fall of the USSR. But it was not ironic at all: the Reagan arms buildup, climaxed by SDI, targeted the expensive arsenal of fear and extortion to which the Soviets had mortgaged their economy. Otherwise, Reagan, like Eisenhower, was extremely cautious, waging no wars and employing force only in Grenada and Libya—operations Podhoretz himself calls “mere pinpricks.” What helped to bring down the Soviet empire was Reagan’s brilliant combination of Wilsonian rhetoric and hard-headed geopolitics, with an assist from the bust-up of OPEC.
After the cold war, when “everyone seemed at sea where foreign policy was concerned,” Podhoretz predicted that the absence of a clear enemy would “give rise to a more blatant and widespread pacifism than we had seen since the 1930’s.” Hence it was to his “amazement” that George Bush waged the Gulf war, and even “more amazing” that Bush won public support for it. He would have been less amazed if he had questioned his working assumption about Americans’ innate isolationism and instead appreciated that their willingness to lead abroad is a function of leadership at home.
Having been amazed by Bush’s success, Podhoretz then failed to anticipate the Clinton phenomenon, which initially seemed to him to reflect “a full-fledged resurgence of isolationism and pacifism” after all. He is now forced to confess that he got Clinton wrong, too. Both Clinton and the liberals have emerged as “positively bloodthirsty” in their zeal to use the military abroad, a phenomenon Podhoretz found “incredible” until he figured out that they just favor multilateral, humanitarian wars in which, presumably, no real American interests are at stake. That, in turn, is what has sparked an apparent flip-flop on the part of some conservatives who are now opposed to crusading abroad.
But who are the floppers? Not Henry Kissinger, whom Podhoretz credits with geopolitical consistency. And not Charles Krauthammer, Peter W. Rodman, or Fareed Zakaria, whom he describes as realistic internationalists trying to adjust to the end of the ideological cold war. So the "strange bedfellows," the new isolationists of the Right, the avatars of this terrible trend, are reduced to just two, one of whom, Edward N. Luttwak, is dismissed as not influential. That leaves Patrick J. Buchanan, a minority of one.
Podhoretz concludes by restating his claim that “many liberals and conservatives alike” have made significant, even 180-degree, shifts. But his own evidence disproves this: the realists have been consistent, the only rightists to flip-flop are the inconsequential Luttwak and Buchanan, while leftists of the Anthony Lewis persuasion have been dogged in their opposition to the U.S. national interest. The purpose of invoking the myth of a “180-degree shift” seems to be to praise the sole group—the neoconservatives—that, in Podhoretz’s estimation, “has held steadfastly” to its perspective. That said, he proceeds to list all the issues on which neoconservatives disagree, coming around in the end to embrace the muscular Wilsonianism of William Kristol and Robert Kagan.
I have argued elsewhere that the Kristol-Kagan case for a “neo-Reaganite” foreign policy aimed at imposing a “benevolent American hegemony” is wrongheaded from the standpoint of history, strategy, and even domestic politics (Orbis, Winter 1998). Here let me say that the crusade preached by Kristol and Kagan is not the only alternative to isolationism, and that it does not amount to “making the world safe for democracy.” If it did, I would be on board myself. It amounts, rather, to “making the world democratic,” and differs from Clintonism only in the correct supposition that a crusade of such magnitude requires rather more weapons than we now possess.
The cogent taxonomy offered by Norman Podhoretz sent me back to his “Neoconservatism: A Eulogy” (COMMENTARY, March 1996). There he argued that on most issues, foreign policy included, neoconservatism was no longer distinguishable from conservatism. “[O]nly a tiny handful,” he wrote, “still advocate the expansive Wilsonian interventionism that grew out of the anti-Communist passions of the . . . cold war.”
Today, Podhoretz places himself and a number of others, including me, in a group that believes that “the proper strategic objective of the United States is ‘to make the world safe for democracy.’ ” My own impression is that most of those who once made up the distinctive neoconservative camp now identify with this position, albeit with notable exceptions like Charles Krauthammer. Perhaps, then, the eulogy was premature. Now that the smoke from the implosion of Communism has cleared, a peculiarly neoconservative foreign policy has indeed reemerged.
Podhoretz calls this position “post-Reagan Reaganism,” but we might also call it Wilsonian Reaganism or Reaganite Wilsonianism. The linking of the names of these two Presidents—one of them, as Podhoretz says, “a pillar of the liberal tradition, the other of the conservative”—underscores his point that what is distinctive about our position is that it aims to assert both American interests and American values. (In contrast, conservative realists would give more emphasis to the former, liberals to the latter.) This dual emphasis reflects not merely an effort to find a middle position or to avoid choices but rather the view that America’s interests and values are much easier to separate in theory than in practice.
The extension of American power and influence has been the most important engine of the advance of American values. This is hardly to deny that American power has been sometimes misused. Rather, it is to assert that American power has been the linchpin of the remarkable global spread of freedom and prosperity and the prevention of a third world war. As a corollary, the perdurance of these welcome circumstances is far likelier in an atmosphere of continued American power and influence.
Conversely, America has done well by doing good. Old World diplomats and their “realist” cousins on this side of the Atlantic often look upon America’s approach, with its strong admixture of idealism, as naive. But how do they account for America’s remarkable success? In truth, the higher content of idealism in U.S. policy is a source of strength. It has helped to evoke great sacrifice from the American people and to build a body of sympathetic opinion in other lands, and it has made the presence of American power more often welcomed than feared by other governments.
This is the answer to Owen Harries’s warnings about “the historical tendency . . . for other states to gang up on and challenge the No. 1 state.” He is thinking, I suppose, of Napoleon or Hitler, or perhaps of the USSR. These were conquerors posing a very real danger to all. But America poses no threat to the freedom or independence of others, only to their pride or their own imperial ambitions.
Still, there is a germ of truth in Harries’s argument: America’s vast supremacy surely does breed resentments that were suppressed as long as the USSR presented a repulsive counterweight. Such feelings may not lead others to fight America, but they can make them truculent. One need entertain no starry-eyed images of UN-based multilateralism to appreciate that effective U.S. policy requires cooperation from other states.
How can we accomplish this? The key is to devote our power to purposes that embody our interests and values and that can appeal to others as well. The essence of our platform must be peace and democracy.
That the spread of democracy serves American interests is proved by a wealth of experience; that it is also good for other nations is all but self-evident. One does, of course, hear that people in this or that society “don’t want” democracy; but such arguments, appealing as they must to the purported wishes of the people, are hopelessly self-contradictory. Promoting democracy is a goal that emphasizes our commonalities with other democracies and also brings us friends among those striving for democracy in places lacking it.
Peace is a trickier desideratum. Although it is universally proclaimed, there are conflicting ideas about how to pursue it. Rare are the cases where one must repress for the sake of democracy; many are the cases where one must “kill for peace,” as mocking anti-Vietnam protesters used to think themselves clever for saying. Thus, if we are to be taken seriously, we must spell out what we mean.
The cornerstone of our peace policy ought to be international law. Article 2.4, the heart of the UN Charter, defines and forbids aggression. Short of committing ourselves to going to war reflexively, we ought to strive to see this law upheld. That is what President Bush did in Kuwait, then failed to do in Bosnia.
The discouragement of aggression, while entailing risks, will serve our self-interest by deterring the emergence of regional hegemons able to threaten our most tangible interests. To be sure, we will incur the wrath of those whose malign ambitions we thwart, but the majority of other nations and people will honor such a use of our power.
In Kosovo, it was we who violated Article 2.4. That is why, as Podhoretz notes, I dissented from the prevailing pro-interventionist view within the Wilsonian Reaganite camp, much of which was and is (understandably) dismissive of international law. But, Kosovo notwithstanding, my view is that international law offers us more advantages than disadvantages. The UN Charter itself acknowledges an “inherent right of individual or collective self-defense,” and, in addition, most authorities recognize the legitimacy of “humanitarian intervention” in extreme cases of human-rights violation.
These tenets give us adequate latitude to justify resorting to force, with or without the approval of the UN Security Council, in almost all cases in which we would want or need to do so. By resting our actions on a legal basis (and accepting the correlative constraints), we can make the continued exercise of our disproportionate power easier for others to accept.
Joseph S. Nye, Jr.
Norman Podhoretz is right: despite the rhetoric, the current foreign-policy debates are not about isolationism. Polls show that a large majority of the public favors active American involvement in the world. The real debate is over what principles should guide our use of power. To my recent effort in Foreign Affairs (July/August 1999) to spell out these principles, Podhoretz gives an honorable mention: “valiant but not altogether successful.” Fair enough, I thought. I looked forward to what he would offer as an alternative. To my surprise, it is the “elevated patriotism” and “global hegemony” advocated by William Kristol and Robert Kagan of the Weekly Standard.
Now, I like Kristol and Kagan’s commitment to an active American role abroad. A decade ago, when predictions of American decline were the intellectual fashion, I demurred; my 1990 book was entitled Bound to Lead. I also reject the realist dichotomy of interests- versus values-oriented foreign policy. Our values are part of our interests; the prudent promotion of democracy and human rights serves American interests.
But the problem with Kristol and Kagan is that their vehicle has a powerful accelerator and weak brakes. It is unlikely to stay on the road. And when it crashes, it may lead to a public overreaction that can damage our long-term involvement and interests. What the American public is seeking are principles that will keep us involved in the world without our becoming the global cop or hegemon. Here Kristol and Kagan offer precious little help.
Crusaders in foreign policy are stronger on good intentions than on good consequences. Podhoretz is wise enough to recognize this. Though he breaks with the prudence of Owen Harries, he also admits to doubts about Kristol and Kagan’s “unabashed enthusiasm.” He asks the right questions: “What are the limits that should be set for American intervention? Where, if anywhere, are the lines to be drawn?” Unfortunately, he does not answer them.
Foreign policy involves trying to accomplish varied objectives in a complex and recalcitrant world. This means trade-offs among objectives. A human-rights or promotion-of-democracy policy is not a foreign policy; it is an important part of a foreign policy. What must command priority in the balancing act are threats to survival. Without survival, all other values vanish.
Preventing attacks on the United States by countries or terrorists; preventing the emergence of hostile hegemons in Asia or Europe; preventing the emergence of dangerous situations on our borders—these deserve priority because they can threaten our survival. To be sure, differences can arise as to how much insurance to buy against the threats. And, in addition to threats, there are also opportunities to consider in shaping foreign policy.
How should we set priorities in such a world? Classical realists like Hans Morgenthau taught us to start with understanding our power. Well into the next century, the United States is likely to remain the preponderant but not the dominant nation. We will want to influence distant governments and organizations on a variety of issues—the proliferation of weapons of mass destruction, terrorism, drugs, resources, and the environment—while also promoting values congenial to our interests.
A basic proposition of public-goods theory is that if the largest beneficiary of a good (such as order) does not take the lead in devoting disproportionate resources to providing and maintaining it, smaller beneficiaries are unlikely to muster the wherewithal to do so. This means that, more than other countries, we have systemic interests to protect. Here we can learn something from Great Britain in the age when it was the preponderant but not the dominant power. Three public goods advanced by Britain in the 19th century were: (1) maintaining the balance of power among the major states; (2) promoting an open international economic system; and (3) maintaining open international commons like freedom of the seas. All three translate relatively well to the current American case.
In terms, first, of the distribution of power, we need to continue to shape the international environment. That is why we keep 100,000 troops forward-based in Europe, another 100,000 in Asia, and some 20,000 near the Persian Gulf. Our role as a stabilizer, insuring against the rise of hostile hegemons in important regions, has to remain a top priority.
Second, promoting an open international economic system is good for America and good for other countries as well. In the long term, economic growth is also more likely to foster stable, democratic, middle-class societies in other countries, though the time scale may be a lengthy one.
Third, the U.S., like 19th-century Britain, has an interest in freedom of the seas. But the international commons now also include the global climate, the uses of outer space, and, increasingly, cyberspace.
I would add to this list our systemic interest in developing and maintaining international regimes of norms, laws, and institutions to organize actions relating to trade and the environment, weapons proliferation, peacekeeping, and human rights. Those, like Charles Krauthammer, who denigrate the importance of such regimes ignore soft power and the extent to which legitimacy, too, is a reality of power. Hans Morgenthau would not have made such a mistake. Finally, as a preponderant power, the United States can provide an important public good by acting as a mediator and convenor.
To repeat: values like human rights and the promotion of democracy have to be melded and traded off with insurance of our survival and our systemic interests. Otherwise, the so-called CNN effect is likely to support waves of humanitarian sentiment a mile wide and an inch deep—enough to get us into conflicts, as in Somalia, but not to sustain our involvement. This means that the foreign-policy debate we need is over how to integrate our values with our other interests. Who knows? Such a debate might produce more strange bedfellows—and that would be a good thing.
The sense of incongruity Norman Podhoretz experienced in finding his own name alongside that of Anthony Lewis on a petition calling for U.S. military action in the Balkans is entirely understandable, and would, I imagine, be shared by Lewis. The debates Podhoretz charts, beginning in the early days of the cold war, reaching critical mass during the Vietnam war, and continuing even after the collapse of the Soviet empire, were so bitter and unforgiving that both liberals and conservatives still tend, mistakenly, to construe the current U.S. role in international affairs through the prism of those times.
For my own part, I remain unpersuaded by much of Podhoretz’s account of the earlier controversies. He writes as if the fact that the struggle against Communism was fundamentally a just and moral crusade manumits him from considering whether its ends justified its means. But it was this concern, far more than the isolationist currents that he anatomizes in such detail, let alone what neoconservatives call anti-anti-Communism, that, beginning in the 1960’s, caused so many liberals to reconsider the legitimacy of the cold war. The American officer who during an engagement in Vietnam told a reporter, “we had to destroy the village in order to save it,” epitomized, for many, the dangers of an interventionism that seemed contemptuous of the moral restraints that just-war theory and modern international humanitarian law, not to mention commonsense morality, should have imposed.
And yet, having registered my demurrers, I wonder, after reading Podhoretz’s piece, how relevant these disputes over the past remain. Perhaps he does as well. To be sure, he endeavors to establish a fundamental continuity between positions taken by neoconservatives like himself during the cold war and what he calls the “post-Reagan Reaganism” of writers and activists like William Kristol and Robert Kagan. But here again, such a continuity is no more intellectually or politically sustainable in the year 2000, already a half-generation removed from the fall of the Berlin Wall, than the old antinomies of political Right and Left that Podhoretz himself concedes now make a poor guide for determining where people stand on any particular issue.
In effect, the Kristol-Kagan prescription demands that the U.S. remain in a crusading mode after Jerusalem has been liberated, or, in this case, the Soviet empire has not only been defeated but ceased to exist. But a crusade cannot be based on historical analogies, like Podhoretz’s likening of the period we are entering to the one that began in 1919. Nor does a slogan like “making the world safe for democracy,” which Podhoretz deploys toward the end of his essay, have any specific gravity in the post-cold-war context, or approach the coherence of the old American ambition to prevent more of the world from becoming Communist.
When Podhoretz speaks of democracy or of the American mission in the world, what does he actually have in mind? Is he, for instance, talking of free elections, like the ones that, had they been allowed to proceed, would have brought Islamic fundamentalism to power in Algeria? Since he argues that the principle of sovereignty should “no longer embrace the right of political leaders to butcher their own people,” is he advocating an international human-rights regime, up to and including military intervention to enforce it? On a practical level, this would inevitably involve collaboration with allies, some subordination of U.S. law to international law, and an increased role for multilateral institutions like the United Nations.
The problem is not so much that Podhoretz is a Wilsonian, an epithet against which he indignantly defends himself. The problem is that the perfect coherence he yearns for is simply unattainable except in the special circumstances of a crusade. Politicians understand this instinctively, which is presumably why President Bush had to compare Saddam Hussein to Hitler when he obviously knew better, or why Clinton had to trumpet the intervention in Kosovo as the first of many wars of values, not interests, when the U.S. has neither the intention nor the capability of acting consistently elsewhere as it did in the Balkans.
Neither Podhoretz nor Kagan and Kristol seem willing to confront how anomalous were the conditions—including the ideological conditions—of the second half of the century we have just left. In my own view, the era we have entered is, both for better and for worse, much closer to the historical norm. As such, it calls for much more modest goals. Even if the U.S. is unlikely ever again to be the hegemon it was during the cold war, it will always be a great power. Sometimes it will use that power purely in defense of interests, as in the Gulf war (although the Gulf war was also an exercise in defense of international law). Sometimes that power can be used, proportionally and with good effect, to right a wrong. But just because the humpty-dumpty of American hegemony cannot or will not ever be put back together again need not be a cause for despair.
The issue, in other words, is not American hegemony but rather the attempt to craft a foreign policy that will coherently articulate both American interests and American values. That is no easy task. The very fact that our foreign policy is subject to so many new pressures—from the false promises held out by a consumer society, to the superficial immediacy of the “live-from-the-battlefield” reporting of CNN and the Internet, to the fracturing of American society along fault lines of both ethnic and elective affinity—is an enormous challenge in itself, for how is one to make foreign policy intelligible not only to our allies and our adversaries but to the American public itself? Certainly, none of the major candidates for the presidency is up to the task. All the more reason, then, to believe that if we can achieve such coherence, we will have achieved a great deal, and certainly more than we will achieve by trying to fan the embers of a hegemony that is anyway rapidly waning.
Peter W. Rodman
Norman Podhoretz’s brilliant survey of the intellectual terrain of American foreign policy leads me to two principal reflections. One has to do with the important degree of political harmony that I now see on the Right. The second has to do with the state of the debate between neo-Reaganites and realists.
Both in content and tone, Podhoretz’s essay pays a kind of tribute to the easing of many previous intramural disputes on the Right. This is one of the positive developments of the present period. Differences there certainly are (as over China and humanitarian intervention), and they are not trivial. But Podhoretz’s historical survey also reminds us of the much more glaring gulf between Right and Left. The cliché that the end of the cold war has abolished this distinction is mistaken.
As the Clinton administration has brought home to us with a vengeance for seven years, Right and Left in America (even the internationalists among them) still have very different worldviews. The Left retains a certain liberal guilt about American power, which it assuages by an assertive faith in multilateral institutions (the UN, international law), in diplomatic nostrums like multilateral arms control, and in an agenda of New Age issues like environmentalism. The Right is more strategic-minded, less naive about our ability to abolish the factor of power from international politics, and unapologetic about American strength, American sovereignty, and American preeminence. The Right puts more faith in military defense (including missile defense) and greater stock in the geopolitical components of our security (facing down Saddam Hussein and North Korea; preserving our alliance system). Clinton has been reported to dismiss this kind of strategic analysis as “Old Think.”
Liberal guilt about American power also translates into a particular kind of scruple about the use of force. Here I have in mind not just the liberal assumption that the legitimacy of our use of force can come only from some international organization (as when the Clinton administration obtained UN Security Council authorization before occupying Haiti in 1994). Nor do I mean only that the Left sees U.S. intervention as tainted if any “selfish” strategic interest gets in the way of humanitarian goals. In addition to all that, there is a moral discomfort with the actual use of force that leads liberal Democratic Presidents always to cut corners, to do the minimum, and to yearn for “surgical” or “calibrated” ways to do it.
This is the Bay of Pigs syndrome, the albatross of LBJ’s “graduated escalation” in Vietnam, the bane of Jimmy Carter’s abortive helicopter raid on Iran, and the pattern of Clinton interventions from Somalia to Iraq to Kosovo. By contrast, conservative Presidents have always had a more clearheaded understanding that, once one has made the decision to commit American power, the categorical moral as well as strategic imperative is to prevail.
Despite policy differences among conservatives, this picture illuminates the significant philosophical consensus that today unites them. We can thank Bill Clinton for setting this in such sharp relief.
The second observation I would venture is that the intraconservative debate over humanitarian intervention is also becoming more moderate. Where Norman Podhoretz sides, on balance, with the more ideological enthusiasms of William Kristol and Robert Kagan, I see the tide shifting in the other direction. I think realism is making a comeback.
For better or worse, neither Congress nor the country exhibits an eagerness for humanitarian intervention. The more expansive Kristol-Kagan definition of America’s sense of mission has not taken root. The public seems quite hesitant about new ideological crusades (even against China). Clinton’s military interventions have prompted many Americans to ask: what is our national interest in this? (Sam Donaldson of ABC News asked this question repeatedly during both the Bosnia and the Kosovo crises. Vox populi.)
The American people seem to want reassurance that their leaders can tell the difference between what is important to us in the world and what is not. Indiscriminate humanitarianism seems unsustainable. If President Clinton was afraid to risk any American casualties in Kosovo, what more damning confession could there be of how thin even he knew public support to be?
The Kristol-Kagan view deserves the label Wilsonian because, even while it clearly represents a muscular, strategic-minded Reaganite rather than Clintonite variant, it is part of a 20th-century trend that has emphasized the moral/ideological well-springs of American international engagement. Indeed, it vigorously rebuked the Nixon-Kissinger brand of Realpolitik in precisely those terms. But just as Nixon and Kissinger’s attempt to win the country over to their realist philosophy failed during the Vietnam era (spawning the resurgence of Wilsonianism that was reflected in both Carter and Reagan), now Bill Clinton’s misadventures have triggered a reaction to Wilsonianism. With humanitarian intervention quite discredited in this country after Kosovo, it is intriguing to see Podhoretz toward the end of his article, and Kristol and Kagan in an October 25 New York Times op-ed piece, all running away briskly from the Wilsonian label.
It is the Republicans’ task (in the next presidency, one hopes) to rebuild the American people’s self-confidence about American international leadership and engagement out of the wreckage that Clinton has wrought. Probably this will require some selectivity rather than universalism, some re-emphasis of the national interest rather than moral and ideological enthusiasm. American predominance in the world is fine with me. But we need to sustain the domestic base for it.
Robert W. Tucker
Are we in a situation, as Norman Podhoretz believes, resembling the one that developed at the end of World War I? The problem of security in Europe after World War I was that of an imbalance of power. Only France was fully committed to enforcing the Treaty of Versailles, and its power was insufficient for the task. Great Britain had increasingly turned away from the continent. The United States, having rejected Versailles, had once again distanced itself from the politics of Europe. Thus the stage was set at an early date for World War II. All that was missing was the political movement and leader that would channel Germany’s defeat and humiliation into a war of conquest. From the war that followed, there emerged the cold war.
Today, a decade after the end of the cold war, the alliance that won the conflict remains intact. The United States shows no signs of an intention to abandon its commitment to Europe. Whatever the faults of America’s European policy, there has been no replay of the post-World War I period. The nation has rejected a return to isolation. Far from resembling the situation that developed after World War I, the present situation is seen by many as dominated by an intrusive America. American leaders, Samuel P. Huntington writes in a recent indictment of American foreign policy, “believe that the world’s business is their business.” The critics of American hegemony reject the assumption that, in William Pfaff’s words, “American responsibility for world order is the inevitable consequence of American power.”
These expressions register opposition to the equation of power and order in international society. While they cannot be dismissed, they do not and cannot rest on the reality of a multipolar world. We do not have a multipolar world, nor do we have the imminent prospect of one. Yet the need for order persists. How is it to be met? Globalization cannot provide the answer. Even if it is the case that everyone wants to preserve the global system of technology and trade, that system is not self-sustaining. It requires a security framework that only a political order can bring.
America’s hegemonic power does create a special responsibility for world order. That responsibility, it is true, cannot in some cases satisfy the multilateral condition. Nevertheless, it seems quite doubtful that, as Robert Kagan argues, “multilateralism must be preceded by unilateralism” if American interests and world order are to be preserved. That unilateralism will beget multilateralism is surely not a self-evident proposition, although it may seem such if one assumes a near-identity of interest between leader and led. This is, of course, what today’s unilateralists usually assume. Given that assumption, it is but a short step to the conclusion that what we do for ourselves we not only do for others as well but what others would also do were they to bear an equal responsibility for order.
This is a dangerous argument, even if it sometimes happens to be true. In acting alone, it is much safer to assume that we are acting in pursuit of our own interests and not—at any rate, not necessarily—in the interests of others. For if we were truly intent on acting in the interests of others as well as our own, we would presumably accord to others a substantive role and, by doing so, end up embracing some form of multilateralism. Others, after all, must be supposed to know their interests better than we can know them.
During the long period of the cold war, the justification of American power was the defense of the independence of states from the threat posed by a hostile and expansionist Soviet Union. The policy of containment responded by and large to the time-honored compulsions of the balance of power. The order defended by American power was inseparable from containment.
It is the case that the identification of threats to this order provoked periodic disputes with allies. Unilateral action taken by the principal guarantor of containment did not go without criticism, at times even harsh criticism. On balance, though, disaffection was limited by the visible threat of Soviet power.
The understandings of this earlier period no longer hold. Although the United States remains the principal guarantor of the post-cold-war order, this order, save for its economic dimension, no longer has the compelling character that the Soviet threat gave to the cold-war order. Our difficulties in obtaining support for more effective sanctions against Iraq testify to this.
Unless we are very lucky, a sustainable foreign policy in the years ahead will require either increasing the means of policy or invoking the greater cooperation of others. And since there is little reason for believing that the means of policy will be increased, we are left to rely on the greater cooperation of others. But the greater cooperation of others will mean that our freedom of action is narrowed. This would already appear to be the price in Europe of greater mutuality, as the Balkan wars have shown. In turn, European cooperation has been a necessary condition of American domestic support. Unilateralism would forfeit this cooperation.
Invoking the cooperation of others for the maintenance of a liberal order is not exactly an exciting task. Nevertheless, it is the task to which we are apparently fated. Are the American people ready to support such a policy with its inevitable compromises? According to the Kagan-Kristol-Podhoretz outlook, the answer must be no. Lacking a grander purpose, the public will seek (in Kristol and Kagan’s words) “deeper and deeper cuts in the defense and foreign-affairs budgets and gradually decimate the tools of U.S. hegemony.”
The past decade, however, has already cast doubt on this prophecy. The public has not insisted upon deeper and deeper cuts in the defense and foreign-affairs budgets. On the contrary, it has shown a remarkable steadiness, reflecting a national consensus on the desirability of our remaining the premier global power. The day is long past when a return to isolation is a meaningful prospect.
Norman Podhoretz’s perplexity over the “new foreign-policy debates” is characteristically sincere and honest. In no way should it be confused with President Clinton’s phony nostalgia for the allegedly simpler days of the cold war or with presidential candidate Bill Bradley’s confidence that “Until the fall of the Berlin Wall in 1989, . . . we knew where we stood on foreign policy.”
Such statements are truly astonishing from the leaders of a Democratic party which, for most of the last 25 years, had largely turned its back on the robust internationalism of Harry S. Truman, John F. Kennedy, and Senator Henry M. Jackson. Theirs was the party that voted in overwhelming numbers for the Mansfield amendment to withdraw U.S. troops from Europe, and against the Gulf war; the party whose leaders attacked Ronald Reagan for declaring that the Soviet Union was an “evil empire,” and (in an episode of lesser historical significance in which I was involved) the Bush Pentagon for suggesting that we should seek to prevent any hostile power from dominating those regions whose resources could become a source of global power.
Ironically, there is far broader consensus today on issues like the U.S. presence in Korea and Europe than ever during the cold war. Even the notion of American military superiority is taken for granted and seemingly welcomed by people who not many years ago regarded it as dangerous. This has happened partly because the Democratic party, under Clinton’s leadership, has tried to contest the foreign-policy mantle won by the Republicans through the successes of Presidents Reagan and Bush, and thereby to reclaim the center of American politics. For opportunistically leading his party away from some of its previous stances, we should perhaps be grateful to Clinton.
While it is surprising that this consensus about American military power has developed at a time when the need for it has become less evident, perhaps the explanation is that these commitments now involve less risk and demand less courage. When Reagan denounced the Soviet Union as an “evil empire,” not only did he cause outrage among those on the Left addicted to moral equivalence, but he was attacked as a warmonger: offending the Soviet Union was a dangerous business. Confronting Saddam Hussein took leadership and great courage from President Bush because no one knew that victory would come at such a low cost. It is only recently, when confronting Iraq seems relatively easy, that everyone has become a “hawk.” The debate over Kosovo was mild compared to what it would have been had the U.S. been suffering serious losses or even facing that possibility.
Many conservatives are now troubled by the concern that the U.S. may be undertaking commitments whose importance to the national interest is unclear and which we may abandon if they prove too costly to sustain, as Clinton did in Somalia and as even Reagan did in Lebanon. Or, if we persist, we may find ourselves confronting horrendous costs that we failed to anticipate, as happened in Vietnam. In this connection, it is surprising and a bit unsettling to observe the ease with which Democrats who once embraced George McGovern now speak in a pale echo of President Kennedy’s call to “pay any price, bear any burden” in behalf of freedom. Military forces are spoken of as instruments for diplomatic signaling, and even for nation-building. Such talk should make any sensible conservative nervous, and even more so when force is actually used with the gradualism that characterized the war in Vietnam and without any sense of how to “win.”
To this I would add the qualifier, however, that the dangers of American overextension do not seem to me comparable to what they were in Vietnam, and I would agree with Podhoretz that it is far more dangerous to underestimate than to overestimate the risks of a major war in the future. Still, in order to complete his very useful guide for the perplexed, one would need to specify more precisely the mission he sets forth—protecting and preserving freedom, and spreading its blessings—even if doing so may create new fault lines among conservatives.
In particular, when it comes to putting American soldiers in harm’s way, there is a big difference between protecting freedom where it exists and spreading it. There are no less important differences between places like the Persian Gulf that could be the sources of major threats to U.S. security and places like Haiti that are not. When it comes to armed intervention, similarly, there is a difference between giving others the means to fight for themselves, as we should have done in Bosnia, and fighting for them. And when it comes to promoting democracy, there is a difference between defending it where it is established, as on Taiwan, and promoting it where it has not yet taken root. In the case of China, our limited influence on that country is more likely to be effective if we take the milder course that President Reagan followed in dealing with authoritarian regimes like the Philippines and South Korea than the approach he took toward our ideological rival in the cold war, the Soviet Union.
Finally, if pressed, I would be more inclined to analogize our own time to 1899 than, as Podhoretz does, to 1919—in the sense that the looming danger over the next twenty years is more likely to be a resurgence of great-power conflict than the ideological crusades of Nazism and Communism that produced World War II and the cold war. But while we cannot be certain what the greatest dangers confronting us will be, the worst imaginable indictment would be if future generations, looking back, were to conclude that our generation could have prevented a global conflict, but failed. It may be hard to measure our actions by so severe a standard at a time when dangers of a global magnitude seem remote. Nonetheless, it is the right standard, and the task of leadership should be to remind the American people that these are indeed the stakes of American preeminence in the world.
American Power-For What?
While the precise nature of David’s sins is debated in the Talmud, there is no question that they are profound. Yet it is precisely when David is set beside other faltering figures—in the Bible or today—that the comparison falls flat. This point is stressed by the very Jewish tradition in whose name Prager claimed to speak.
It is the rabbis who note that David’s predecessor, Saul, lost the kingship when he failed to fulfill God’s command to destroy the egregiously evil nation of Amalek, whereas David commits more severe sins and yet remains king. The answer, the rabbis suggest, lies not in the sin itself but in the response. Saul, when confronted by the prophet Samuel, offers obfuscations and defensiveness. David, meanwhile, is similarly confronted by the prophet Nathan: “Thou hast killed Uriah the Hittite with the sword, and hast taken his wife to be thy wife, and hast slain him with the sword of the children of Ammon.” David’s immediate response is clear and complete contrition: “I have sinned against the Lord.” David’s penitence, Jewish tradition suggests, sets him apart from Saul. Soon after, David gave voice to what was in his heart at the moment, and gave the world one of the most stirring of the Psalms:
Have mercy upon me, O God, according to thy lovingkindness: according unto the multitude of thy tender mercies blot out my transgressions.
Wash me thoroughly from mine iniquity, and cleanse me from my sin. For I acknowledge my transgressions: and my sin is ever before me.
. . . Deliver me from bloodguiltiness, O God, thou God of my salvation: and my tongue shall sing aloud of thy righteousness.
O Lord, open thou my lips; and my mouth shall shew forth thy praise.
For thou desirest not sacrifice; else would I give it: thou delightest not in burnt offering.
The sacrifices of God are a broken spirit: a broken and a contrite heart, O God, thou wilt not despise.
The tendency to link David to our current age stems from the fact that we know more about David than about any other biblical figure. The author Thomas Cahill has noted that in a certain literary sense, David is the only biblical figure that is like us at all. Prior to the humanist autobiographies of the Renaissance, he notes, “we can count only a few isolated instances of this use of ‘I’ to mean the interior self. But David’s psalms are full of I’s.” In David’s Psalms, Cahill writes, we “find a unique early roadmap to the inner spirit—previously mute—of ancient humanity.”
At the same time, a study of the Book of Samuel and of the Psalms reveals how utterly incomparable David is to anyone alive today. Haym Soloveitchik has noted that even the most observant of Jews today fail to feel the constant intimacy with God that the simplest Jew of the premodern age might have felt, that “while there are always those whose spirituality is one apart from that of their time, nevertheless I think it safe to say that the perception of God as a daily, natural force is no longer present to a significant degree in any sector of modern Jewry, even the most religious.” Yet for David, such intimacy with the divine was central to his existence, and the Book of Samuel and the Psalms are an eternal testament to this fact. This is why simple comparisons between David and ourselves, as tempting as they are, must be resisted. David Wolpe, in his book about David, attempts to make the case that King David’s life speaks to us today: “So versatile and enduring is David in our culture that rare is the week that passes without some public allusion to his life…We need to understand David better because we use his life to comprehend our own.”
The truth may be the opposite. We need to understand David better because we can use his life to comprehend what we are missing, and how utterly unlike his own our lives are. For even the most religious among us have lost the profound faith and intimacy with God that David had. It is therefore incorrect to assume that because of David’s flaws it would have been, as Amos Oz has written, “fitting for him to reign in Tel Aviv.” The modern State of Israel was blessed with brilliant leaders, but to which of its modern warriors or statesmen should David be compared? To Ben Gurion, who stripped any explicit invocation of the Divine from Israel’s Declaration of Independence? To Moshe Dayan, who oversaw the reconquest of Jerusalem, and then immediately handed back the Temple Mount, the locus of King David’s dreams and desires, to the administration of the enemies of Israel? David’s complex humanity inspires comparison to modern figures, but his faith, contrition, and repentance—which lie at the heart of his story and success—defy any such engagement.
And so, to those who seek comparisons to modern leaders from the Bible, the best rule may be: Leave King David out of it.
Three attacks in Britain highlight the West’s inability to see the threat clearly
This lack of seriousness manifests itself in several ways. It’s perhaps most obvious in the failure to reform Britain’s chaotic immigration and dysfunctional asylum systems. But it’s also abundantly clear from the grotesque underfunding and under-resourcing of domestic intelligence. In MI5, Britain has an internal security service that is simply too small to do its job effectively, even if it were not handicapped by an institutional culture that can seem willfully blind to the ideological roots of the current terrorism problem.
In 2009, Jonathan Evans, then head of MI5, confessed at a parliamentary hearing about the London bus and subway attacks of 2005 that his organization only had sufficient resources to “hit the crocodiles close to the boat.” It was an extraordinary metaphor to use, not least because of the impression of relative impotence that it conveys. MI5 had by then doubled in size since 2001, but it still boasted a staff of only 3,500. Today it’s said to employ between 4,000 and 5,000, an astonishingly, even laughably, small number given a UK population of 65 million and the scale of the security challenges Britain now faces. (To be fair, the major British police forces all have intelligence units devoted to terrorism, and the UK government’s overall counterterrorism strategy involves a great many people, including social workers and schoolteachers.)
You can also see that unseriousness at work in the abject failure to coerce Britain’s often remarkably sedentary police officers out of their cars and stations and back onto the streets. Most of Britain’s big-city police forces have adopted a reactive model of policing (consciously rejecting both the New York Compstat model and British “bobby on the beat” traditions) that cripples intelligence-gathering and frustrates good community relations.
If that weren’t bad enough, Britain’s judiciary is led by jurists who came of age in the 1960s, and who have been inclined since 2001 to treat terrorism as an ordinary criminal problem being exploited by malign officials and politicians to make assaults on individual rights and to take part in “illegal” foreign wars. It has long been almost impossible to extradite ISIS or al-Qaeda–linked Islamists from the UK. This is partly because today’s English judges believe that few if any foreign countries—apart from perhaps Sweden and Norway—are likely to give terrorist suspects a fair trial, or able to guarantee that such suspects will be spared torture and abuse.
We have a progressive metropolitan media elite whose primary, reflexive response to every terrorist attack, even before the blood on the pavement is dry, is to express worry about an imminent violent anti-Muslim “backlash” on the part of a presumptively bigoted and ignorant indigenous working class. Never mind that no such “backlash” has yet occurred, not even when the young off-duty soldier Lee Rigby was hacked to death in broad daylight on a South London street in 2013.
Another sign of this lack of seriousness is the choice by successive British governments to deal with the problem of internal terrorism with marketing and “branding.” You can see this in the catchy consultant-created acronyms and pseudo-strategies that are deployed in place of considered thought and action. After every atrocity, the prime minister calls a meeting of the COBRA unit—an acronym that merely stands for Cabinet Office Briefing Room A but sounds like a secret organization of government superheroes. The government’s counterterrorism strategy is called CONTEST, which has four “work streams”: “Prevent,” “Pursue,” “Protect,” and “Prepare.”
Perhaps the ultimate sign of unseriousness is the fact that police, politicians, and government officials have all displayed more fear of being seen as “Islamophobic” than of any carnage that actual terror attacks might cause. Few are aware that this short-term, cowardly, and trivial tendency may ultimately foment genuine, dangerous popular Islamophobia, especially if attacks continue.
Recently, three murderous Islamist terror attacks in the UK took place in less than a month. The first and third were relatively primitive improvised attacks using vehicles and/or knives. The second was a suicide bombing that probably required relatively sophisticated planning, technological know-how, and the assistance of a terrorist infrastructure. As they were the first such attacks in the UK, the vehicle and knife killings came as a particular shock to the British press, public, and political class, despite the fact that non-explosive and non-firearm terror attacks have become common in Europe and are almost routine in Israel.
The success of all three plots indicates troubling problems in British law-enforcement practice and culture, quite apart from failings on the part of those arms of the state charged with intelligence, border control, and the prevention of radicalization. At the time of writing, the British media have been full of encomia to police courage and skill, not least because it took “only” eight minutes for an armed Metropolitan Police team to respond to and confront the bloody mayhem being wrought by the three Islamist terrorists (who had ploughed their rented van into people on London Bridge before jumping out to attack passersby with knives). But the difficult truth is that all three attacks would have been much harder to pull off in Manhattan, not just because all NYPD cops are armed, but also because there are always police officers visibly on patrol at the New York equivalents of London’s Borough Market on a Saturday night. By contrast, London’s Metropolitan Police is a largely vehicle-borne, reactive force; rather than use a physical presence to deter crime and terrorism, it chooses to monitor closed-circuit street cameras and social-media postings.
Since the attacks in London and Manchester, we have learned that several of the perpetrators were “known” to the police and security agencies that are tasked with monitoring potential terror threats. That these individuals were nevertheless able to carry out their atrocities is evidence that the monitoring regime is insufficient.
It also seems clear that there were failures on the part of those institutions that come under the leadership of the Home Office and are supposed to be in charge of the UK’s border, migration, and asylum systems. Journalists and think tanks like Policy Exchange and Migration Watch have for years pointed out that these systems are “unfit for purpose,” but successive governments have done little to take responsible control of Britain’s borders. When she was home secretary, Prime Minister Theresa May did little more than jazz up the name, logo, and uniforms of what is now called the “Border Force,” and she notably failed to put in place long-promised passport checks for people flying out of the country. This dereliction means that it is impossible for the British authorities to know who has overstayed a visa or whether individuals who have been denied asylum have actually left the country.
It seems astonishing that Youssef Zaghba, one of the three London Bridge attackers, was allowed back into the country. The Moroccan-born Italian citizen (his mother is Italian) had been arrested by Italian police in Bologna, apparently on his way to Syria via Istanbul to join ISIS. When questioned by the Italians about the ISIS decapitation videos on his mobile phone, he declared that he was “going to be a terrorist.” The Italians lacked sufficient evidence to charge him with a crime but put him under 24-hour surveillance, and when he traveled to London, they passed on information about him to MI5. Nevertheless, he was not stopped or questioned on arrival and had not become one of the 3,000 official terrorism “subjects of interest” for MI5 or the police when he carried out his attack. One reason Zaghba was not questioned on arrival may have been that he used one of the new self-service passport machines installed in UK airports in place of human staff after May’s cuts to the border force. Apparently, the machines are not yet linked to any government watch lists, thanks to the general chaos and ineptitude of the Home Office’s efforts to use information technology.
The presence in the country of Zaghba’s accomplice Rachid Redouane is also an indictment of the incompetence and disorganization of the UK’s border and migration authorities. He had been refused asylum in 2009, but as is so often the case, Britain’s Home Office never got around to removing him. Three years later, he married a British woman and was therefore able to stay in the UK.
But it is the failure of the authorities to monitor ringleader Khuram Butt that is the most baffling. He was a known and open associate of Anjem Choudary, Britain’s most notorious terrorist supporter, ideologue, and recruiter (he was finally imprisoned in 2016 after 15 years of campaigning on behalf of al-Qaeda and ISIS). Butt even appeared in a 2016 TV documentary about ISIS supporters called The Jihadist Next Door. In the same year, he assaulted a moderate imam at a public festival, after calling him a “murtad” or apostate. The imam reported the incident to the police—who took six months to track Butt down and then let him off with a caution. It is not clear if Butt was one of the 3,000 “subjects of interest” or the additional 20,000 former subjects of interest who continue to be the subject of limited monitoring. If he was not, it raises the question of what a person has to do to get British security services to take him seriously as a terrorist threat; if he was in fact on the list of “subjects of interest,” one has to wonder if being so designated is any barrier at all to carrying out terrorist atrocities. It’s worth remembering, as few do here in the UK, that terrorists who carried out previous attacks were also known to the police and security services and nevertheless enjoyed sufficient liberty to go at it again.
But the most important reason for the British state’s ineffectiveness in monitoring terror threats, which May addressed immediately after the London Bridge attack, is a deeply rooted institutional refusal to deal with or accept the key role played by Islamist ideology. For more than 15 years, the security services and police have chosen to take note only of people and bodies that explicitly espouse terrorist violence or have contacts with known terrorist groups. The fact that a person, school, imam, or mosque endorses the establishment of a caliphate, the stoning of adulterers, or the murder of apostates has not been considered a reason to monitor them.
This seems to be why Salman Abedi, the Manchester Arena suicide bomber, was not being watched by the authorities as a terror risk, even though he had punched a girl in the face for wearing a short skirt while at university, had attended the Muslim Brotherhood-controlled Didsbury Mosque, was the son of a Libyan man whose militia is banned in the UK, had himself fought against the Qaddafi regime in Libya, had adopted the Islamist clothing style (trousers worn above the ankle, beard but no moustache), was part of a druggy gang subculture that often feeds individuals into Islamist terrorism, and had been banned from a mosque after confronting an imam who had criticized ISIS.
It was telling that the day after the Manchester Arena suicide-bomb attack, you could hear a security official informing the audience of the BBC’s flagship morning-radio news show that it’s almost impossible to predict and stop such attacks because the perpetrators “don’t care who they kill.” They just want to kill as many people as possible, he said.
Surely, anyone with even a basic familiarity with Islamist terror attacks over the last 15 or so years and a nodding acquaintance with Islamist ideology could see that the terrorist hadn’t just chosen the Ariana Grande concert in Manchester Arena because a lot of random people would be crowded into a conveniently small area. Since the Bali bombings of 2002, nightclubs, discotheques, and pop concerts attended by shameless unveiled women and girls have been routinely targeted by fundamentalist terrorists, including in Britain. Among the worrying things about the opinion offered on the radio show was that it suggests that even in the wake of the horrific Bataclan attack in Paris during a November 2015 concert, British authorities may not have been keeping an appropriately protective eye on music venues and other places where our young people hang out in their decadent Western way. Such dereliction would make perfect sense given the resistance on the part of the British security establishment to examining, confronting, or extrapolating from Islamist ideology.
The same phenomenon may explain why authorities did not follow up on community complaints about Abedi. All too often when people living in Britain’s many and diverse Muslim communities want to report suspicious behavior, they have to do so through offices and organizations set up and paid for by the authorities as part of the overall “Prevent” strategy. Although criticized by the left as “Islamophobic” and inherently stigmatizing, Prevent has often brought the government into cooperative relationships with organizations even further to the Islamic right than the Muslim Brotherhood. This means that if you are a relatively secular Libyan émigré who wants to report an Abedi and you go to your local police station, you are likely to find yourself speaking to a bearded Islamist.
From its outset in 2003, the Prevent strategy was flawed. Its practitioners, in their zeal to find and fund key allies in “the Muslim community” (as if there were just one), routinely made alliances with self-appointed community leaders who represented the most extreme and intolerant tendencies in British Islam. Both the Home Office and MI5 seemed to believe that only radical Muslims were “authentic” and would therefore be able to influence young potential terrorists. Moderate, modern, liberal Muslims who are arguably more representative of British Islam as a whole (not to mention sundry Shiites, Sufis, Ahmadis, and Ismailis) have too often found it hard to get a hearing.
Sunni organizations that openly supported suicide-bomb attacks in Israel and India and that justified attacks on British troops in Iraq and Afghanistan nevertheless received government subsidies as part of Prevent. The hope was that in return, they would alert the authorities if they knew of individuals planning attacks in the UK itself.
It was a gamble reminiscent of British colonial practice in India’s northwest frontier and elsewhere. Not only were there financial inducements in return for grudging cooperation; the British state offered other, symbolically powerful concessions. These included turning a blind eye to certain crimes and antisocial practices such as female genital mutilation (there have been no successful prosecutions relating to the practice, though thousands of cases are reported every year), forced marriage, child marriage, polygamy, the mass removal of girls from school soon after they reach puberty, and the epidemic of racially and religiously motivated “grooming” rapes in cities like Rotherham. (At the same time, foreign jihadists—including men wanted for crimes in Algeria and France—were allowed to remain in the UK as long as their plots did not include British targets.)
This approach, simultaneously cynical and naive, was never as successful as its proponents hoped. Again and again, Muslim chaplains who were approved to work in prisons and other institutions have turned out to be Islamist extremists whose words have inspired inmates to join terrorist organizations.
Much to his credit, former Prime Minister David Cameron fought hard to change this approach, even though it meant difficult confrontations with his home secretary (Theresa May), as well as police and the intelligence agencies. However, Cameron’s efforts had little effect on the permanent personnel carrying out the Prevent strategy, and cooperation with Islamist but currently nonviolent organizations remains the default setting within the institutions on which the United Kingdom depends for security.
The failure to understand the role of ideology is one of imagination as well as education. Very few of those who make government policy or write about home-grown terrorism seem able to escape the limitations of what used to be called “bourgeois” experience. They assume that anyone willing to become an Islamist terrorist must perforce be materially deprived, or traumatized by the experience of prejudice, or provoked to murderous fury by oppression abroad. They have no sense of the emotional and psychic benefits of joining a secret terror outfit: the excitement and glamor of becoming a kind of Islamic James Bond, bravely defying the forces of an entire modern state. They don’t get how satisfying or empowering the vengeful misogyny of ISIS-style fundamentalism might seem for geeky, frustrated young men. Nor can they appreciate the appeal to the adolescent mind of apocalyptic fantasies of power and sacrifice (mainstream British society does not have much room for warrior dreams, given that its tone is set by liberal pacifists). Finally, they have no sense of why the discipline and self-discipline of fundamentalist Islam might appeal so strongly to incarcerated lumpen youth who have never experienced boundaries or real belonging. Their understanding is an understanding only of themselves, not of the people who want to kill them.
Review of 'White Working Class' By Joan C. Williams
Williams is a prominent feminist legal scholar with degrees from Yale, MIT, and Harvard. Unbending Gender, her best-known book, is the sort of tract you’d expect to find at an intersectionality conference or a Portlandia bookstore. This is why her insightful, empathic book comes as such a surprise.
Books and essays on the topic have accumulated into a highly visible genre since Donald Trump came on the American political scene; J.D. Vance’s Hillbilly Elegy planted itself at the top of bestseller lists almost a year ago and still isn’t budging. As with Vance, Williams’s interest in the topic is personal. She fell “madly in love with” and eventually married a Harvard Law School graduate who had grown up in an Italian neighborhood in pre-gentrification Brooklyn. Williams, on the other hand, is a “silver-spoon girl.” Her father’s family was moneyed, and her maternal grandfather was a prominent Reform rabbi.
The author’s affection for her “class-migrant” spouse and respect for his family’s hardships—“My father-in-law grew up on blood soup,” she announces in her opening sentence—add considerable warmth to what is at bottom a political pamphlet. Williams believes that elite condescension and “cluelessness” played a big role in Trump’s unexpected and dreaded victory. Enlightening her fellow elites is essential to the task of returning Trump voters to the progressive fold where, she is sure, they rightfully belong.
Liberals were not always so dense about the working class, Williams observes. WPA murals and movies like On the Waterfront showed genuine fellow feeling for the proletariat. In the 1970s, however, the liberal mood changed. Educated boomers shifted their attention to “issues of peace, equal rights, and environmentalism.” Instead of feeling the pain of Arthur Miller and John Steinbeck characters, they began sneering at the less enlightened. These days, she notes, elite sympathies are limited to the poor, people of color (POC), and the LGBTQ population. Despite clear evidence of suffering—stagnant wages, disappearing manufacturing jobs, declining health and well-being—the working class gets only fly-over snobbery at best and, more often, outright loathing.
Williams divides her chapters into a series of answers to questions she has heard from her clueless friends and colleagues: “Why Does the Working Class Resent the Poor?” “Why Does the Working Class Resent Professionals but Admire the Rich?” “Why Doesn’t the Working Class Just Move to Where the Jobs Are?” “Is the Working Class Just Racist?” She weaves her answers into a compelling picture of a way of life and worldview foreign to her intended readers. Working-class Americans have had to struggle for whatever stability and comfort they have, she explains. Clocking in for midnight shifts year after year, enduring capricious bosses, plant closures, and layoffs, they’re reliant on tag-team parenting and stressed-out relatives for child care. The campus go-to word “privileged” seems exactly wrong.
Proud of their own self-sufficiency and success, however modest, they don’t begrudge the self-made rich. It’s snooty professionals and the dysfunctional poor who get their goat. From their vantage point, subsidizing the day care for a welfare mother when they themselves struggle to manage care on their own dime mocks both their hard work and their beliefs. And since, unlike most professors, they shop in the same stores as the dependent poor, they’ve seen that some of them game the system. Of course that stings.
White Working Class is especially good at evoking the alternate economic and mental universe experienced by Professional and Managerial Elites, or “PMEs.” PMEs see their non-judgment of the poor, especially those who are “POC,” as a mark of their mature understanding that we live in an unjust, racist system whose victims require compassion regardless of whether they have committed any crime. At any rate, their passions lie elsewhere. They define themselves through their jobs and professional achievements, hence their obsession with glass ceilings.
Williams tells the story of her husband’s faux pas at a high-school reunion. Forgetting his roots for a moment, the Ivy League–educated lawyer asked one of his Brooklyn classmates a question that is the go-to opener in elite social settings: “What do you do?” Angered by what must have seemed like deliberate humiliation by this prodigal son, the man hissed: “I sell toilets.”
Instead of stability and backyard barbecues with family and long-time neighbors and maybe the occasional Olive Garden celebration, PMEs are enamored of novelty: new foods, new restaurants, new friends, new experiences. The working class chooses to spend its leisure in comfortable familiarity; for the elite, social life is a lot like networking. Members of the professional class may view themselves as sophisticated or cosmopolitan, but, Williams shows, to the blue-collar worker their glad-handing is closer to phony social climbing and their abstract, knowledge-economy jobs more like self-important pencil-pushing.
White Working Class has a number of proposals for creating the progressive future Williams would like to see. She wants to get rid of college-for-all dogma and improve training for middle-skill jobs. She envisions a working-class coalition of all races and ethnicities bolstered by civics education with a “distinctly celebratory view of American institutions.” In a saner political environment, some of this would make sense; indeed, she echoes some of Marco Rubio’s 2016 campaign themes. It’s little wonder White Working Class has already gotten the stink eye from liberal reviewers for its purported sympathies for racists.
Alas, impressive as Williams’s insights are, they do not always allow her to transcend her own class loyalties. Unsurprisingly, her own PME biases mostly come to light in her chapters on race and gender. She reduces immigration concerns to “fear of brown people,” even as she notes elsewhere that a quarter of Latinos also favor a wall at the southern border. This contrasts startlingly with her succinct observation that “if you don’t want to drive working-class whites to be attracted to the likes of Limbaugh, stop insulting them.” In one particularly obtuse moment, she asserts: “Because I study social inequality, I know that even Malia and Sasha Obama will be disadvantaged by race, advantaged as they are by class.” She relies on dubious gender theories to explain why the majority of white women voted for Trump rather than for his unfairly maligned opponent. That Hillary Clinton epitomized every elite quality Williams has just spent more than a hundred pages explicating escapes her notice. Williams’s own reflexive retreat into identity politics is itself emblematic of our toxic divisions, but it does not invalidate the power of this astute book.
When music could not transcend evil
The story of European classical music under the Third Reich is one of the most squalid chapters in the annals of Western culture, a chronicle of collective complaisance that all but beggars belief. Without exception, all of the well-known musicians who left Germany and Austria in protest when Hitler came to power in 1933 were either Jewish or, like the violinist Adolf Busch, Rudolf Serkin’s father-in-law, had close family ties to Jews. Moreover, most of the small number of non-Jewish musicians who emigrated later on, such as Paul Hindemith and Lotte Lehmann, are now known to have done so not out of principle but because they were unable to make satisfactory accommodations with the Nazis. Everyone else—including Karl Böhm, Wilhelm Furtwängler, Walter Gieseking, Herbert von Karajan, and Richard Strauss—stayed behind and served the Reich.
The Berlin and Vienna Philharmonics, then as now Europe’s two greatest orchestras, were just as willing to do business with Hitler and his henchmen, firing their Jewish members and ceasing to perform the music of Jewish composers. Even after the war, the Vienna Philharmonic was notorious for being the most anti-Semitic orchestra in Europe, and it was well known in the music business (though never publicly discussed) that Helmut Wobisch, the orchestra’s principal trumpeter and its executive director from 1953 to 1968, had been both a member of the SS and a Gestapo spy.
The management of the Berlin Philharmonic made no attempt to cover up the orchestra’s close relationship with the Third Reich, no doubt because the Nazi ties of Karajan, who was its music director from 1956 until shortly before his death in 1989, were a matter of public record. Yet it was not until 2007 that a full-length study of its wartime activities, Misha Aster’s The Reich’s Orchestra: The Berlin Philharmonic 1933–1945, was finally published. As for the Vienna Philharmonic, its managers long sought to quash all discussion of the orchestra’s Nazi past, steadfastly refusing to open its institutional archives to scholars until 2008, when Fritz Trümpi, an Austrian scholar, was given access to its records. Five years later, the Viennese, belatedly following the precedent of the Berlin Philharmonic, added a lengthy section to their website called “The Vienna Philharmonic Under National Socialism (1938–1945),” in which the damning findings of Trümpi and two other independent scholars were made available to the public.
Now Trümpi has published The Political Orchestra: The Vienna and Berlin Philharmonics During the Third Reich, in which he tells how they came to terms with Nazism, supplying pre- and postwar historical context for their transgressions.1 Written in a stiff mixture of academic jargon and translatorese, The Political Orchestra is ungratifying to read. Even so, the tale that it tells is both compelling and disturbing, especially to anyone who clings to the belief that high art is ennobling to the spirit.
Unlike the Vienna Philharmonic, which has always doubled as the pit orchestra for the Vienna State Opera, the Berlin Philharmonic started life in 1882 as a fully independent, self-governing entity. Initially unsubsidized by the state, it kept itself afloat by playing a grueling schedule of performances, including “popular” non-subscription concerts for which modest ticket prices were charged. In addition, the orchestra made records and toured internationally at a time when neither was common.
These activities made it possible for the Berlin Philharmonic to develop into an internationally renowned ensemble whose fabled collective virtuosity was widely seen as a symbol of German musical distinction. Furtwängler, the orchestra’s principal conductor, declared in 1932 that the German music in which it specialized was “one of the very few things that actually contribute to elevating [German] prestige.” Hence, he explained, the need for state subsidy, which he saw as “a matter of [national] prestige, that is, to some extent a requirement of national prudence.” By then, though, the orchestra was already heavily subsidized by the city of Berlin, thus paving the way for its takeover by the Nazis.
The Vienna Philharmonic, by contrast, had always been subsidized. Founded in 1842 when the orchestra of what was then the Vienna Court Opera decided to give symphonic concerts on its own, it performed the Austro-German classics for an elite cadre of longtime subscribers. By restricting membership to local players and their pupils, the orchestra cultivated what Furtwängler, who spent as much time conducting in Vienna as in Berlin, described as a “homogeneous and distinct tone quality.” At once dark and sweet, it was as instantly identifiable—and as characteristically Viennese—as the strong, spicy bouquet of a Gewürztraminer wine.
Unlike the Berlin Philharmonic, which played for whoever would pay the tab and programmed new music as a matter of policy, the Vienna Philharmonic chose not to diversify either its haute-bourgeois audience or its conservative repertoire. Instead, it played Beethoven, Brahms, Haydn, Mozart, and Schubert (and, later, Bruckner and Richard Strauss) in Vienna for the Viennese. Starting in the ’20s, the orchestra’s recordings consolidated its reputation as one of the world’s foremost instrumental ensembles, but its internal culture remained proudly insular.
What the two orchestras had in common was a nationalistic ethos, a belief in the superiority of Austro-German musical culture that approached triumphalism. One of the darkest manifestations of this ethos was their shared reluctance to hire Jews. The Berlin Philharmonic employed only four Jewish players in 1933, while the Vienna Philharmonic contained only 11 Jews at the time of the Anschluss, none of whom was hired after 1920. To be sure, such popular Jewish conductors as Otto Klemperer and Bruno Walter continued to work in Vienna for as long as they could. Two months before the Anschluss, Walter led and recorded a performance of the Ninth Symphony of Gustav Mahler, his musical mentor and fellow Jew, who from 1897 to 1907 had been the director of the Vienna Court Opera and one of the Philharmonic’s most admired conductors. But many members of both orchestras were open supporters of fascism, and not a few were anti-Semites who ardently backed Hitler. By 1942, 62 of the 123 active members of the Vienna Philharmonic were Nazi party members.
The admiration that Austro-German classical musicians had for Hitler is not entirely surprising since he was a well-informed music lover who declared in 1938 that “Germany has become the guardian of European culture and civilization.” He made the support of German art, music very much included, a key part of his political program. Accordingly, the Berlin Philharmonic was placed under the direct supervision of Joseph Goebbels, who ensured the cooperation of its members by repeatedly raising their salaries, exempting them from military service, and guaranteeing their old-age pensions. But there had never been any serious question of protest, any more than there would be among the members of the Vienna Philharmonic when the Nazis gobbled up Austria. Save for the Jews and one or two non-Jewish players who were fired for reasons of internal politics, the musicians went along unhesitatingly with Hitler’s desires.
With what did they go along? Above all, they agreed to the scrubbing of Jewish music from their programs and the dismissal of their Jewish colleagues. Some Jewish players managed to escape with their lives, but seven of the Vienna Philharmonic’s 11 Jews were either murdered by the Nazis or died as a direct result of official persecution. In addition, both orchestras performed regularly at official government functions and made tours and other public appearances for propaganda purposes, and both were treated as gems in the diadem of Nazi culture.
As for Furtwängler, the most prominent of the Austro-German orchestral conductors who served the Reich, his relationship to Nazism continues to be debated to this day. He had initially resisted the firing of the Berlin Philharmonic’s Jewish members and protected them for as long as he could. But he was also a committed (if woolly-minded) nationalist who believed that German music had “a different meaning for us Germans than for other nations” and notoriously declared in an open letter to Goebbels that “we all welcome with great joy and gratitude . . . the restoration of our national honor.” Thereafter he cooperated with the Nazis, by all accounts uncomfortably but—it must be said—willingly. A monster of egotism, he saw himself as the greatest living exponent of German music and believed it to be his duty to stay behind and serve a cause higher than what he took to be mere party politics. “Human beings are free wherever Wagner and Beethoven are played, and if they are not free at first, they are freed while listening to these works,” he naively assured a horrified Arturo Toscanini in 1937. “Music transports them to regions where the Gestapo can do them no harm.”
Once the war was over, the U.S. occupation forces decided to enlist the Berlin Philharmonic in the service of a democratic, anti-Soviet Germany. Furtwängler and Herbert von Karajan, who succeeded him as principal conductor, were officially “de-Nazified” and their orchestra allowed to function largely undisturbed, though six Nazi Party members were fired. The Vienna Philharmonic received similarly privileged treatment.
Needless to say, there was more to this decision than Cold War politics. No one questioned the unique artistic stature of either orchestra. Moreover, the Vienna Philharmonic, precisely because of its insularity, was now seen as a living museum piece, a priceless repository of 19th-century musical tradition. Still, many musicians and listeners, Jews above all, looked askance at both orchestras for years to come, believing them to be tainted by Nazism.
Indeed they were, so much so that they treated many of their surviving Jewish ex-members in a way that can only be described as vicious. In the most blatant individual case, the violinist Szymon Goldberg, who had served as the Berlin Philharmonic’s concertmaster under Furtwängler, was not allowed to reassume his post in 1945 and was subsequently denied a pension. As for the Vienna Philharmonic, the fact that it made Helmut Wobisch its executive director says everything about its deep-seated unwillingness to face up to its collective sins.
Be that as it may, scarcely any prominent musicians chose to boycott either orchestra. Leonard Bernstein went so far as to affect a flippant attitude toward the morally equivocal conduct of the Austro-German artists whom he encountered in Europe after the war. Upon meeting Herbert von Karajan in 1954, he actually told his wife Felicia that he had become “real good friends with von Karajan, whom you would (and will) adore. My first Nazi.”
At the same time, though, Bernstein understood what he was choosing to overlook. When he conducted the Vienna Philharmonic for the first time in 1966, he wrote to his parents:
I am enjoying Vienna enormously—as much as a Jew can. There are so many sad memories here; one deals with so many ex-Nazis (and maybe still Nazis); and you never know if the public that is screaming bravo for you might contain someone who 25 years ago might have shot me dead. But it’s better to forgive, and if possible, forget. The city is so beautiful, and so full of tradition. Everyone here lives for music, especially opera, and I seem to be the new hero.
Did Bernstein sell his soul for the opportunity to work with so justly renowned an orchestra—and did he get his price by insisting that its members perform the symphonies of Mahler, with which he was by then closely identified? It is a fair question, one that does not lend itself to easy answers.
Even more revealing is the case of Bruno Walter, who never forgave Furtwängler for staying behind in Germany, informing him in an angry letter that “your art was used as a conspicuously effective means of propaganda for the regime of the Devil.” Yet Walter’s righteous anger did not stop him from conducting in Vienna after the war. Born in Berlin, he had come to identify with the Philharmonic so closely that it was impossible for him to seriously consider quitting its podium permanently. “Spiritually, I was a Viennese,” he wrote in Theme and Variations, his 1946 autobiography. In 1952, he made a second recording with the Vienna Philharmonic of Mahler’s Das Lied von der Erde, whose premiere he had conducted in 1911 and which he had recorded in Vienna 15 years earlier. One wonders what Walter, who had converted to Christianity but had been driven out of both his native lands for the crime of being Jewish, made of the text of the last movement: “My friend, / On this earth, fortune has not been kind to me! / Where do I go?”
As for the two great orchestras of the Third Reich, both have finally acknowledged their guilt and been forgiven, at least by those who know little of their past. It would occur to no one to decline on principle to perform with either group today. Such a gesture would surely be condemned as morally ostentatious, an exercise in what we now call virtue-signaling. Yet it is impossible to forget what Samuel Lipman wrote in 1993 in Commentary apropos the wartime conduct of Furtwängler: “The ultimate triumph of totalitarianism, I suppose it can be said, is that under its sway only a martyred death can be truly moral.” For the only martyrs of the Berlin and Vienna Philharmonics were their Jews. The orchestras themselves live on, tainted and beloved.
He knows what to reveal and what to conceal, understands the importance of keeping the semblance of distance between oneself and the story of the day, and comprehends the ins and outs of anonymous sourcing. Within days of his being fired by President Trump on May 9, for example, little green men and women, known only as his “associates,” began appearing in the pages of the New York Times and Washington Post to dispute key points of the president’s account of his dismissal and to promote Comey’s theory of the case.
“In a Private Dinner, Trump Demanded Loyalty,” the New York Times reported on May 11. “Comey Demurred.” The story was a straightforward narrative of events from Comey’s perspective, capped with an obligatory denial from the White House. The next day, the Washington Post reported, “Comey associates dispute Trump’s account of conversations.” The Post did not identify Comey’s associates, other than saying that they were “people who have worked with him.”
Maybe they were the same associates who had gabbed to the Times. Or maybe they were different ones. Who can tell? Regardless, the story these particular associates gave to the Post was readable and gripping. Comey, the Post reported, “was wary of private meetings and discussions with the president and did not offer the assurance, as Trump has claimed, that Trump was not under investigation as part of the probe into Russian interference in last year’s election.”
On May 16, Michael S. Schmidt of the Times published his scoop, “Comey Memo Says Trump Asked Him to End Flynn Investigation.” Schmidt didn’t see the memo for himself. Parts of it were read to him by—you guessed it—“one of Mr. Comey’s associates.” The following day, Robert Mueller was appointed special counsel to oversee the Russia investigation. On May 18, the Times, citing “two people briefed” on a call between Comey and the president, reported, “Comey, Unsettled by Trump, Is Said to Have Wanted Him Kept at a Distance.” And by the end of that week, Comey had agreed to testify before the Senate Intelligence Committee.
As his testimony approached, Comey’s people became more aggressive in their criticisms of the president. “Trump Should Be Scared, Comey Friend Says,” read the headline of a CNN interview with Brookings Institution fellow Benjamin Wittes. This “Comey friend” said he was “very shocked” when he learned that President Trump had asked Comey for loyalty. “I have no doubt that he regarded the group of people around the president as dishonorable,” Wittes said.
Comey, Wittes added, was so uncomfortable at the White House reception in January honoring law enforcement—the one where Comey lumbered across the room and Trump whispered something in his ear—that, as CNN paraphrased it, he “stood in a position so that his blue blazer would blend in with the room’s blue drapes in an effort for Trump to not notice him.” The integrity, the courage—can you feel it?
On June 6, the day before Comey’s prepared testimony was released, more “associates” told ABC that the director would “not corroborate Trump’s claim that on three separate occasions Comey told the president he was not under investigation.” And a “source with knowledge of Comey’s testimony” told CNN the same thing. In addition, ABC reported that, according to “a source familiar with Comey’s thinking,” the former director would say that Trump’s actions stopped short of obstruction of justice.
Maybe those sources weren’t as “familiar with Comey’s thinking” as they thought or hoped? To maximize the press coverage he already dominated, Comey had authorized the Senate Intelligence Committee to release his testimony ahead of his personal interview. That testimony told a different story than what had been reported by CNN and ABC (and by the Post on May 12). Comey had in fact told Trump the president was not under investigation—on January 6, January 27, and March 30. Moreover, the word “obstruction” did not appear at all in his written text. The senators asked Comey if he felt Trump obstructed justice. He declined to answer either way.
My guess is that Comey’s associates lacked Comey’s scalpel-like, almost Jesuitical ability to make distinctions, and therefore misunderstood what he was telling them to say to the press. Because it’s obvious Comey was the one behind the stories of Trump’s dishonesty and bad behavior. He admitted as much in front of the cameras in a remarkable exchange with Senator Susan Collins of Maine.
Comey said that, after Trump tweeted on May 12 that he’d better hope there aren’t “tapes” of their conversations, “I asked a friend of mine to share the content of the memo with a reporter. Didn’t do it myself, for a variety of reasons. But I asked him to, because I thought that might prompt the appointment of a special counsel. And so I asked a close friend of mine to do it.”
Collins asked whether that friend had been Wittes, known to cable news junkies as Comey’s bestie. Comey said no. The source for the New York Times article was “a good friend of mine who’s a professor at Columbia Law School,” Daniel Richman.
Every time I watch or read that exchange, I am amazed. Here is the former director of the FBI just flat-out admitting that, for months, he wrote down every interaction he had with the president of the United States because he wanted a written record in case the president ever fired or lied about him. And when the president did fire and lie about him, that director set in motion a series of public disclosures with the intent of not only embarrassing the president, but also forcing the appointment of a special counsel who might end up investigating the president for who knows what. And none of this would have happened if the president had not fired Comey or tweeted about him. He told the Senate that if Trump hadn’t dismissed him, he most likely would still be on the job.
Rarely, in my view, are high officials so transparent in describing how Washington works. Comey revealed to the world that he was keeping a file on his boss, that he used go-betweens to get his story into the press, that “investigative journalism” is often just powerful people handing documents to reporters to further their careers or agendas or even to get revenge. And as long as you maintain some distance from the fallout, and stick to the absolute letter of the law, you will come out on top, so long as you have a small army of nightingales singing to reporters on your behalf.
“It’s the end of the Comey era,” A.B. Stoddard said on Special Report with Bret Baier the other day. On the contrary: I have a feeling that, as the Russia investigation proceeds, we will be hearing much more from Comey. And from his “associates.” And his “friends.” And persons “familiar with his thinking.”
In April, COMMENTARY asked a wide variety of writers, thinkers, and broadcasters to respond to this question: Is free speech under threat in the United States? We received twenty-seven responses. We publish them here in alphabetical order.
Floyd Abrams
Free expression threatened? By Donald Trump? I guess you could say so.
When a president engages in daily denigration of the press, when he characterizes it as the enemy of the people, when he repeatedly says that the libel laws should be “loosened” so he can personally commence more litigation, when he says that journalists shouldn’t be allowed to use confidential sources, it is difficult even to suggest that he has not threatened free speech. And when he says to the head of the FBI (as former FBI director James Comey has said that he did) that Comey should consider “putting reporters in jail for publishing classified information,” it is difficult not to take those threats seriously.
The harder question, though, is this: How real are the threats? Or, as Michael Gerson put it in the Washington Post: Will Trump “go beyond mere Twitter abuse and move against institutions that limit his power?” Some of the president’s threats against the institution of the press, wittingly or not, have been simply preposterous. Surely someone has told him by now that neither he nor Congress can “loosen” libel laws; while each state has its own libel law, there is no federal libel law and thus nothing for him to loosen. What he obviously takes issue with is the impact that the Supreme Court’s 1964 First Amendment opinion in New York Times v. Sullivan has had on state libel laws. The case determined that public officials who sue for libel may not prevail unless they demonstrate that the statements made about them were false and were made with actual knowledge or suspicion of that falsity. So his objection to the rules governing libel law is to nothing less than the application of the First Amendment itself.
In other areas, however, the Trump administration has far more power to imperil free speech. We live under an Espionage Act, adopted a century ago, which is both broad in its language and uncommonly vague in its meaning. As such, it remains a half-open door through which an administration that is hostile to free speech might walk. Such an administration could initiate criminal proceedings against journalists who write about defense- or intelligence-related topics on the basis that classified information was leaked to them by present or former government employees. No such action has ever been commenced against a journalist. Press lawyers and civil-liberties advocates have strong arguments that the law may not be read so broadly and still be consistent with the First Amendment. But the scope of the Espionage Act and the impact of the First Amendment upon its interpretation remain unknown.
A related area in which the attitude of an administration toward the press may affect the latter’s ability to function as a check on government relates to the ability of journalists to protect the identity of their confidential sources. The Obama administration prosecuted more Espionage Act cases against sources of information to journalists than all prior administrations combined. After a good deal of deserved press criticism, it agreed to expand the internal guidelines of the Department of Justice designed to limit the circumstances under which such source revelation is demanded. But the guidelines are none too protective and are, after all, simply guidelines. A new administration is free to change or limit them or, in fact, abandon them altogether. In this area, as in so many others, it is too early to judge the ultimate treatment of free expression by the Trump administration. But the threats are real, and there is good reason to be wary.
Floyd Abrams is the author of The Soul of the First Amendment (Yale University Press, 2017).
Ayaan Hirsi Ali
Freedom of speech is being threatened in the United States by a nascent culture of hostility to different points of view. As political divisions in America have deepened, a conformist mentality of “right thinking” has spread across the country. Increasingly, American universities, where no intellectual doctrine ought to escape critical scrutiny, are some of the most restrictive domains when it comes to asking open-ended questions on subjects such as Islam.
Legally, speech in the United States is protected to a degree unmatched in almost any industrialized country. The U.S. has avoided unpredictable Canadian-style restrictions on speech, for example. I remain optimistic that as long as we have the First Amendment in the U.S., any attempt at formal legal censorship will be vigorously challenged.
Culturally, however, matters are very different in America. The regressive left is the forerunner threatening free speech on any issue that is important to progressives. The current pressure coming from those who call themselves “social-justice warriors” is unlikely to lead to successful legislation to curb the First Amendment. Instead, censorship is spreading in the cultural realm, particularly at institutions of higher learning.
The way activists of the regressive left achieve silence or censorship is by creating a taboo, and one of the most pernicious taboos in operation today is the word “Islamophobia.” Islamists are similarly motivated to rule any critical scrutiny of Islamic doctrine out of order. There is now a university center (funded by Saudi money) in the U.S. dedicated to monitoring and denouncing incidences of “Islamophobia.”
The term “Islamophobia” is used against critics of political Islam, but also against progressive reformers within Islam. The term implies an irrational fear that is tainted by hatred, and it has had a chilling effect on free speech. In fact, “Islamophobia” is a poorly defined term. Islam is not a race, and it is very often perfectly rational to fear some expressions of Islam. No set of ideas should be beyond critical scrutiny.
To push back in this cultural realm—in our universities, in public discourse—those favoring free speech should focus more on the message of dawa, the set of ideas that the Islamists want to promote. If the aims of dawa are sufficiently exposed, ordinary Americans and Muslim Americans will reject it. The Islamist message is a message of divisiveness, misogyny, and hatred. It’s anachronistic and wants people to live by tribal norms dating from the seventh century. The best antidote to Islamic extremism is the revelation of what its primary objective is: a society governed by Sharia. This is the opposite of censorship: It is documenting reality. What is life like in Saudi Arabia, Iran, the Northern Nigerian States? What is the true nature of Sharia law?
Islamists want to hide the true meaning of Sharia, Jihad, and the implications for women, gays, religious minorities, and infidels under the veil of “Islamophobia.” Islamists use “Islamophobia” to obfuscate their vision and imply that any scrutiny of political Islam is hatred and bigotry. The antidote to this is more exposure and more speech.
As pressure on freedom of speech increases from the regressive left, we must reject the notions that only Muslims can speak about Islam, and that any critical examination of Islamic doctrines is inherently “racist.”
Instead of contorting Western intellectual traditions so as not to offend our Muslim fellow citizens, we need to defend the Muslim dissidents who are risking their lives to promote the human rights we take for granted: equality for women, tolerance of all religions and orientations, our hard-won freedoms of speech and thought.
It is by nurturing and protecting such speech that progressive reforms can emerge within Islam. By accepting the increasingly narrow confines of acceptable discourse on issues such as Islam, we do dissidents and progressive reformers within Islam a grave disservice. For truly progressive reforms within Islam to be possible, full freedom of speech will be required.
Ayaan Hirsi Ali is a research fellow at the Hoover Institution, Stanford University, and the founder of the AHA Foundation.
Lee C. Bollinger
I know it is too much to expect that political discourse mimic the measured, self-questioning, rational, footnoting standards of the academy, but there is a difference between robust political debate and political debate infected with fear or panic. The latter introduces a state of mind that is visceral and irrational. In the realm of fear, we move beyond the reach of reason and a sense of proportionality. When we fear, we lose the capacity to listen and can become insensitive and mean.
Our Constitution is well aware of this fact about the human mind and of its negative political consequences. In the First Amendment jurisprudence established over the past century, we find many expressions of the problematic state of mind that is produced by fear. Among the most famous and potent is that of Justice Brandeis in Whitney v. California in 1927, one of the many cases involving aggravated fears of subversive threats from abroad. “It is the function of (free) speech,” he said, “to free men from the bondage of irrational fears.” “Men feared witches,” Brandeis continued, “and burned women.”
Today, our “witches” are terrorists, and Brandeis’s metaphorical “women” include the refugees (mostly children) and displaced persons, immigrants, and foreigners whose lives have been thrown into suspension and doubt by policies of exclusion.
The same fears of the foreign that take hold of a population inevitably infect our internal interactions and institutions, yielding suppression of unpopular and dissenting voices, victimization of vulnerable groups, attacks on the media, and the rise of demagoguery, with its disdain for facts, reason, expertise, and tolerance.
All of this imposes a very special obligation on those of us within universities. Not only must we make the case in every venue for the values that form the core of who we are and what we do, but we must also live up to our own principles of free inquiry and fearless engagement with all ideas. This is why recent incidents on a handful of college campuses, in which speakers were disrupted and effectively censored, are so alarming. Such acts not only betray a basic principle but also inflame a rising prejudice against the academic community, and they feed efforts to delegitimize our work, at the very moment when it’s most needed.
I do not for a second support the view that this generation has an unhealthy aversion to engaging differences of opinion. That is a modern trope of polarization, as is the portrayal of universities as hypocritical about academic freedom and political correctness. But now, in this environment especially, universities must be at the forefront of defending the rights of all students and faculty to listen to controversial voices, to engage disagreeable viewpoints, and to make every effort to demonstrate our commitment to the sort of fearless and spirited debate that we are simultaneously asking of the larger society. Anyone with a voice can shout over a speaker; but being able to listen to and then effectively rebut those with whom we disagree—particularly those who themselves peddle intolerance—is one of the greatest skills our education can bestow. And it is something our democracy desperately needs more of. That is why, I say to you now, if speakers who are being denied access to other campuses come here, I will personally volunteer to introduce them, and listen to them, however much I may disagree with them. But I will also never hesitate to make clear why I disagree with them.
Lee C. Bollinger is the 19th president of Columbia University and the author of Uninhibited, Robust, and Wide-Open: A Free Press for a New Century. This piece has been excerpted from President Bollinger’s May 17 commencement address.
Richard A. Epstein
Today, the greatest threat to the constitutional protection of freedom of speech comes from campus rabble-rousers who invoke this very protection. In their book, the speech of people like Charles Murray and Heather Mac Donald constitutes a form of violence, bordering on genocide, that receives no First Amendment protection. Enlightened protestors are both bound and entitled to shout them down, by force or other disruptive actions, if their universities are so foolish as to extend them an invitation to speak. Any indignant minority may take the law into its own hands to eradicate the intellectual cancer before it spreads on their own campus.
By such tortured logic, a new generation of vigilantes distorts First Amendment doctrine: Speech becomes violence, and violence becomes a heroic act of self-defense. The standard First Amendment interpretation emphatically rejects that view. Of course, the First Amendment doesn’t let you say whatever you want whenever and wherever you want. Your freedom of speech is subject to the same limitations as your freedom of action. So you have no constitutional license to assault other people, to lie to them, or to form cartels to bilk them in the marketplace. But folks such as Murray, Mac Donald, and even Yiannopoulos do not come close to crossing into that forbidden territory. They are not using, for example, “fighting words,” rightly limited to words or actions calculated to provoke immediate aggression against a known target. Fighting words are worlds apart from speech that provokes a negative reaction in those who find it offensive solely because of the content of its message.
This distinction is central to the First Amendment. Fighting words have to be blocked by well-tailored criminal and civil sanctions lest some people gain license to intimidate others from speaking or peaceably assembling. The remedy for mere offense is to speak one’s mind in response. But offense never gives anyone the right to block the speech of others, lest everyone be able to unilaterally increase his sphere of action by getting really angry about the beliefs of others. No one has the right to silence others by working himself into a fit of rage.
Obviously, it is intolerable to let mutual animosity generate factional warfare, whereby everyone can use force to silence rivals. To avoid this war of all against all, each side claims that only its actions are privileged. These selective claims quickly degenerate into a form of viewpoint discrimination, which undermines one of the central protections that traditional First Amendment law erects: a wall against each and every group out to destroy the level playing field on which robust political debate rests. Every group should be at risk for having its message fall flat. The new campus radicals want to upend that understanding by shutting down their adversaries if their universities do not. Their aggression must be met, if necessary, by counterforce. Silence in the face of aggression is not an acceptable alternative.
Richard A. Epstein is the Laurence A. Tisch Professor of Law at the New York University School of Law.
David French
We’re living in the midst of a troubling paradox. At the exact same time that First Amendment jurisprudence has arguably never been stronger and more protective of free expression, millions of Americans feel they simply can’t speak freely. Indeed, talk to Americans living and working in the deep-blue confines of the academy, Hollywood, and the tech sector, and you’ll get a sense of palpable fear. They’ll explain that they can’t say what they think and keep their jobs, their friends, and sometimes even their families.
The government isn’t cracking down or censoring; instead, Americans are using free speech to destroy free speech. For example, a social-media shaming campaign is an act of free speech. So is an economic boycott. So is turning one’s back on a public speaker. So is a private corporation firing a dissenting employee for purely political reasons. Each of these actions is largely protected from government interference, and each one represents an expression of the speaker’s ideas and values.
The problem, however, is obvious. The goal of each of these kinds of actions isn’t to persuade; it’s to intimidate. The goal isn’t to foster dialogue but to coerce conformity. The result is a marketplace of ideas that has been emptied of all but the approved ideological vendors—at least in those communities that are dominated by online thugs and corporate bullies. Indeed, this mindset has become so prevalent that in places such as Portland, Berkeley, Middlebury, and elsewhere, the bullies and thugs have crossed the line from protected—albeit abusive—speech into outright shout-downs and mob violence.
But there’s something else going on, something that’s insidious in its own way. While politically correct shaming still has great power in deep-blue America, its effect in the rest of the country is to trigger a furious backlash, one characterized less by a desire for dialogue and discourse than by its own rage and scorn. So we’re moving toward two Americas—one that ruthlessly (and occasionally illegally) suppresses dissenting speech and the other that is dangerously close to believing that the opposite of political correctness isn’t a fearless expression of truth but rather the fearless expression of ideas best calculated to enrage your opponents.
The result is a partisan feedback loop where right-wing rage spurs left-wing censorship, which spurs even more right-wing rage. For one side, a true free-speech culture is a threat to feelings, sensitivities, and social justice. The other side waves high the banner of “free speech” to sometimes elevate the worst voices to the highest platforms—not so much to protect the First Amendment as to infuriate the hated “snowflakes” and trigger the most hysterical overreactions.
The culturally sustainable argument for free speech is something else entirely. It reminds the cultural left of its own debt to free speech while reminding the political right that a movement allegedly centered around constitutional values can’t abandon the concept of ordered liberty. The culture of free speech thrives when all sides remember their moral responsibilities—to both protect the right of dissent and to engage in ideological combat with a measure of grace and humility.
David French is a senior writer at National Review.
Pamela Geller
The real question isn’t whether free speech is under threat in the United States, but rather, whether it’s irretrievably lost. Can we get it back? Not without war, I suspect, as is evidenced by the violence at colleges whenever there’s the shamefully rare event of a conservative speaker on campus.
Free speech is the soul of our nation and the foundation of all our other freedoms. If we can’t speak out against injustice and evil, those forces will prevail. Freedom of speech is the foundation of a free society. Without it, a tyrant can wreak havoc unopposed, while his opponents are silenced.
With that principle in mind, I organized a free-speech event in Garland, Texas. The world had recently been rocked by the murder of the Charlie Hebdo cartoonists. My version of “Je Suis Charlie” was an event here in America to show that we can still speak freely and draw whatever we like in the Land of the Free. Yet even after jihadists attacked our event, I was blamed—by Donald Trump among others—for provoking Muslims. And if I tried to hold a similar event now, no arena in the country would allow me to do so—not just because of the security risk, but because of the moral cowardice of all intellectual appeasers.
Under what law is it wrong to depict Muhammad? Under Islamic law. But I am not a Muslim, I don’t live under Sharia. America isn’t under Islamic law, yet for standing for free speech, I’ve been:
- Prevented from running our advertisements in every major city in this country. We have won free-speech lawsuits all over the country, which officials circumvent by prohibiting all political ads (while making exceptions for ads from Muslim advocacy groups);
- Shunned by the right, shut out of the Conservative Political Action Conference;
- Shunned by Jewish groups at the behest of terror-linked groups such as the Council on American-Islamic Relations;
- Blacklisted from speaking at universities;
- Prevented from publishing books, for security reasons and because publishers fear shaming from the left;
- Banned from Britain.
A Seattle court accused me of trying to shut down free speech after we merely tried to run an FBI poster on global terrorism, because authorities had banned all political ads in other cities to avoid running ours. Seattle blamed us for that, which was like blaming a woman for being raped because she was wearing a short skirt.
This kind of vilification and shunning is key to the left’s plan to shut down all dissent from its agenda—they make legislation restricting speech unnecessary.
The same refusal to allow our point of view to be heard has manifested itself elsewhere. The foundation of my work is individual rights and equality for all before the law. These are the foundational principles of our constitutional republic. That is now considered controversial. Truth is the new hate speech. Truth is going to be criminalized.
The First Amendment doesn’t only protect ideas that are sanctioned by the cultural and political elites. If “hate speech” laws are enacted, who would decide what’s permissible and what’s forbidden? The government? The gunmen in Garland?
There has been an inversion of the founding premise of this nation. No longer is it the subordination of might to right, but of right to might. History is repeatedly deformed by the bloody consequences of this inversion.
Pamela Geller is the editor in chief of the Geller Report and president of the American Freedom Defense Initiative.
Jonah Goldberg
Of course free speech is under threat in America. Frankly, it’s always under threat in America because it’s always under threat everywhere. Ronald Reagan was right when he said in 1961, “Freedom is never more than one generation away from extinction. We didn’t pass it on to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same.”
This is more than political boilerplate. Reagan identified the source of the threat: human nature. God may have endowed us with a right to liberty, but he didn’t give us all a taste for it. As with most finer things, we must work to acquire a taste for it. That is what civilization—or at least our civilization—is supposed to do: cultivate attachments to certain ideals. “Cultivate” shares the same Latin root as “culture,” cultus, and properly understood they mean the same thing: to grow, nurture, and sustain through labor.
In the past, threats to free speech have taken many forms—nationalist passion, Comstockery (both good and bad), political suppression, etc.—but the threat to free speech today is different. It is less top-down and more bottom-up. We are cultivating a generation of young people to reject free speech as an important value.
One could mark the beginning of the self-esteem movement with Nathaniel Branden’s 1969 paper, “The Psychology of Self-Esteem,” which claimed that “feelings of self-esteem were the key to success in life.” This understandable idea ran amok in our schools and in our culture. When I was a kid, Saturday-morning cartoons were punctuated with public-service announcements telling kids: “The most important person in the whole wide world is you, and you hardly even know you!”
The self-esteem craze was just part of the cocktail of educational fads. Other ingredients included multiculturalism, the anti-bullying crusade, and, of course, that broad phenomenon known as “political correctness.” Combined, they’ve produced a generation that rejects the old adage “sticks and stones can break my bones but words can never harm me” in favor of the notion that “words hurt.” What we call political correctness has been on college campuses for decades. But it lacked a critical mass of young people who were sufficiently receptive to it to make it a fully successful ideology. The campus commissars welcomed the new “snowflakes” with open arms; truly, these are the ones we’ve been waiting for.
“Words hurt” is a fashionable concept in psychology today. (See Psychology Today: “Why Words Can Hurt at Least as Much as Sticks and Stones.”) But it’s actually a much older idea than the “sticks and stones” aphorism. For most of human history, it was a crime to say insulting or “injurious” things about aristocrats, rulers, the Church, etc. That tendency didn’t evaporate with the Divine Right of Kings. Jonathan Haidt has written at book length about our natural capacity to create zones of sanctity, immune from reason.
And that is the threat free speech faces today. Those who inveigh against “hate speech” are in reality fighting “heresy speech”—ideas that do “violence” to sacred notions of self-esteem, racial or gender equality, climate change, and so on. Put whatever label you want on it, contemporary “social justice” progressivism acts as a religion, and it has no patience for blasphemy.
When Napoleon’s forces converted churches into stables, the clergy did not object on the grounds that regulations regarding the proper care and feeding of animals had been violated. They complained of sacrilege and blasphemy. When Charles Murray or Christina Hoff Sommers visits college campuses, the protestors are behaving like the zealous acolytes of St. Jerome. Appeals to the First Amendment have as much power over the “antifa” fanatics as appeals to Odin did to champions of the New Faith.
That is the real threat to free speech today.
Jonah Goldberg is a senior editor at National Review and a fellow at the American Enterprise Institute.
KC Johnson
In early May, the Washington Post urged universities to make clear that “racist signs, symbols, and speech are off-limits.” Given the extraordinarily broad definition of what constitutes “racist” speech at most institutions of higher education, this demand would single out most right-of-center (and, in some cases, even centrist and liberal) discourse on issues of race or ethnicity. The editorial provided the highest-profile example of how hostility to free speech, once confined to the ideological fringe on campus, has migrated to the liberal mainstream.
The last few years have seen periodic college protests—featuring claims that significant amounts of political speech constitute “violence,” thereby justifying censorship—followed by even more troubling attempts to appease the protesters. After the mob scene that greeted Charles Murray upon his visit to Middlebury College, for instance, the student government criticized any punishment for the protesters, and several student leaders wanted to require that future speakers conform to the college’s “community standard” on issues of race, gender, and ethnicity. In the last few months, similar attempts to stifle the free exchange of ideas in the name of promoting diversity occurred at Wesleyan, Claremont McKenna, and Duke. Offering an extreme interpretation of this point of view, one CUNY professor recently dismissed dialogue as “inherently conservative,” since it reinforced the “relations of power that presently exist.”
It’s easy, of course, to dismiss campus hostility to free speech as affecting only a small segment of American public life—albeit one that trains the next generation of judges, legislators, and voters. But, as Jonathan Chait observed in 2015, denying “the legitimacy of political pluralism on issues of race and gender” has broad appeal on the left. It is only most apparent on campus because “the academy is one of the few bastions of American life where the political left can muster the strength to impose its political hegemony upon others.” During his time in office, Barack Obama generally urged fellow liberals to support open intellectual debate. But the current campus environment previews the position of free speech in a post-Obama Democratic Party, increasingly oriented around identity politics.
Waning support on one end of the ideological spectrum for this bedrock American principle should provide a political opening for the other side. The Trump administration, however, seems poorly suited to make the case. Throughout his public career, Trump has rarely supported free speech, even in the abstract, and has periodically embraced legal changes to facilitate libel lawsuits. Moreover, the right-wing populism that motivates Trump’s base has a long tradition of ideological hostility to civil liberties of all types. Even in campus contexts, conservatives have defended free speech inconsistently, as seen in recent calls that CUNY disinvite anti-Zionist fanatic Linda Sarsour as a commencement speaker.
In a sharply polarized political environment, awash in dubiously-sourced information, free speech is all the more important. Yet this same environment has seen both sides, most blatantly elements of the left on campuses, demand restrictions on their ideological foes’ free speech in the name of promoting a greater good.
KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.
Laura Kipnis
I find myself with a strange-bedfellows problem lately. Here I am, a left-wing feminist professor invited onto the pages of Commentary—though I’d be thrilled if it were still 1959—while fielding speaking requests from right-wing think tanks and libertarians who oppose child-labor laws.
Somehow I’ve ended up in the middle of the free-speech-on-campus debate. My initial crime was publishing a somewhat contentious essay about campus sexual paranoia that put me on the receiving end of Title IX complaints. Apparently I’d created a “hostile environment” at my university. I was investigated (for 72 days). Then I wrote up what I’d learned about these campus inquisitions in a second essay. Then I wrote about it all some more, in a book exposing the kangaroo-court elements of the Title IX process—and the extra-legal gag orders imposed on everyone caught in its widening snare.
I can’t really comment on whether more charges have been filed against me over the book. I’ll just say that writing about being a Title IX respondent could easily become a life’s work. I learned, shortly after writing this piece, that I and my publisher were being sued for defamation, among other things.
Is free speech under threat on American campuses? Yes. We know all about student activists who wish to shut down talks by people with opposing views. I got smeared with a bit of that myself, after a speaking invitation at Wellesley—some students made a video protesting my visit before I arrived. The talk went fine, though a group of concerned faculty circulated an open letter afterward also protesting the invitation: My views on sexual politics were too heretical, and might have offended students.
I didn’t take any of this too seriously, even as right-wing pundits crowed, with Wellesley as their latest outrage bait. It was another opportunity to mock student activists, and the fact that I was myself a feminist rather than a Charles Murray or a Milo Yiannopoulos made them positively gleeful.
I do find myself wondering where all my new free-speech pals were when another left-wing professor, Steven Salaita, was fired (or, if you prefer the euphemism, “his job offer was withdrawn”) from the University of Illinois after he tweeted criticism of Israel’s Gaza policy. Sure, the tweets were hyperbolic, but hyperbole and strong opinions are protected speech, too.
I guess free speech is easy to celebrate until it actually challenges something. Funny, I haven’t seen Milo around lately—so beloved by my new friends when he was bashing minorities and transgender kids. Then he mistakenly said something authentic (who knew he was capable of it!), reminiscing about an experience a lot of gay men have shared: teenage sex with older men. He tried walking it back—no, no, he’d been a victim, not a participant—but his fan base was shrieking about pedophilia and fleeing in droves. Gee, they were all so against “political correctness” a few minutes before.
It’s easy to be a free-speech fan when your feathers aren’t being ruffled. No doubt what makes me palatable to the anti-PC crowd is having thus far failed to ruffle them enough. I’m just going to have to work harder.
Laura Kipnis’s latest book is Unwanted Advances: Sexual Paranoia Comes to Campus.
Eugene Kontorovich
The free and open exchange of views—especially politically conservative or traditionally religious ones—is being challenged. This is taking place not just at college campuses but throughout our public spaces and cultural institutions. James Watson was fired from the lab he had led since 1968 and could not speak at New York University because of petty, censorious students who would not know DNA from LSD. Our nation’s founders and heroes are being “disappeared” from public commemoration, like Trotsky from a photograph of Soviet rulers.
These attacks on “free speech” are not the result of government action. They are not what the First Amendment protects against. The current methods—professional and social shaming, exclusion, and employment termination—are more inchoate, and their effects are multiplied by self-censorship. A young conservative legal scholar might find himself thinking: “If the late Justice Antonin Scalia can posthumously be deemed a ‘bigot’ by many academics, what chance have I?”
Ironically, artists and intellectuals have long prided themselves on being the first defenders of free speech. Today, it is the institutions of both popular and high culture that are the censors. Is there one poet in the country who would speak out for Ann Coulter?
The inhibition of speech at universities is part of a broader social phenomenon of making longstanding, traditional views and practices sinful overnight. Conservatives have not put up much resistance to this. To paraphrase Martin Niemöller’s famous dictum: “First they came for Robert E. Lee, and I said nothing, because Robert E. Lee meant nothing to me.”
The situation with respect to Israel and expressions of support for it deserves separate discussion. Even as university administrators give political power to favored ideologies by letting them create “safe spaces” (safe from opposing views), Jews find themselves and their state at the receiving end of claims of apartheid—modern-day blood libels. It is not surprising if Jewish students react by demanding that they get a safe space of their own. It is even less surprising if their parents, paying $65,000 a year, want their children to have a nicer time of it. One hears Jewish groups frequently express concern about Jewish students feeling increasingly isolated and uncomfortable on campus.
But demanding selective protection from the new ideological commissars is unlikely to bring the desired results. First, this new ideology, even if it can be harnessed momentarily to give respite to harassed Jews on campus, is ultimately illiberal and will be controlled by “progressive” forces. Second, it is not so terrible for Jews in the Diaspora to feel a bit uncomfortable. It has been the common condition of Jews throughout the millennia. The social awkwardness that Jews at liberal arts schools might feel in being associated with Israel is of course one of the primary justifications for the Jewish State. Facing the snowflakes incapable of hearing a dissonant view—but who nonetheless, in the grip of intersectional ecstasy, revile Jewish self-determination—Jewish students should toughen up.
Eugene Kontorovich teaches constitutional law at Northwestern University and heads the international law department of the Kohelet Policy Forum in Jerusalem.
Nicholas Lemann
There’s an old Tom Wolfe essay in which he describes being on a panel discussion at Princeton in 1965 and provoking the other panelists by announcing that America, rather than being in crisis, was in the middle of a “happiness explosion.” He was arguing that the mass effects of 20 years of post–World War II prosperity made for a larger phenomenon than the Vietnam War, the racial crisis, and the other primary concerns of intellectuals at the time.
In the same spirit, I’d say that we are in the middle of a free-speech explosion, because of 20-plus years of the Internet and 10-plus years of social media. If one understands speech as disseminated individual opinion, then surely we live in the free-speech-est society in the history of the world. Anybody with access to the unimpeded World Wide Web can say anything to a global audience, and anybody can hear anything, too. All threats to free speech should be understood in the context of this overwhelming reality.
It is a comforting fantasy that a genuine free-speech regime will empower mainly “good,” but previously repressed, speech. Conversely, repressive regimes that are candid enough to explain their anti-free-speech policies usually say that they’re not against free speech, just “bad” speech. We have to accept that more free speech probably means, in the aggregate, more bad speech, and also a weakening of the power, authority, and economic support for information professionals such as journalists. Welcome to the United States in 2017.
I am lucky enough to live and work on the campus of a university, Columbia, that has been blessedly free of successful attempts to repress free speech. Just in the last few weeks, Charles Murray and Dinesh D’Souza have spoken here without incident. But, yes, the evidently growing popularity of the idea that “hate speech” shouldn’t be permitted on campuses is a problem, especially, it seems, at small private liberal-arts colleges. We should all do our part, and I do, by frequently and publicly endorsing free-speech principles. Opposing the BDS movement falls squarely into that category.
It’s not just on campuses that free-speech vigilance is needed, though. The number-one threat to free speech, to my mind, is that the wide-open Web has been replaced by privately owned platforms such as Facebook and Google as the way most people experience the public life of the Internet. These companies are committed to banning “hate speech,” and they are eager to operate freely in countries, like China, that don’t permit free political speech. That makes for a far more consequential constrained environment than any campus’s speech code.
Also, Donald Trump regularly engages in presidentially unprecedented rhetoric demonizing people who disagree with him. He seems to think this is all in good fun, but, as we have already seen at his rallies, not everybody hears it that way. The place where Trumpism will endanger free speech isn’t in the center—the White House press room—but at the periphery, for example in the way that local police handle bumptious protestors and the journalists covering them. This is already happening around the country. If Trump were as disciplined and knowledgeable as Vladimir Putin or Recep Tayyip Erdogan, which so far he seems not to be, then free speech could be in even more serious danger from government, which in most places is its usual main enemy.
Nicholas Lemann is a professor at Columbia Journalism School and a staff writer for the New Yorker.
Michael J. Lewis
Free speech is a right but it is also a habit, and where the habit shrivels so will the right. If free speech today is in headlong retreat—everywhere threatened by regulation, organized harassment, and even violence—it is in part because our political culture allowed the practice of persuasive oratory to atrophy. The process began in 1973, an unforeseen side effect of Roe v. Wade. Legislators were delighted to learn that by relegating this divisive matter of public policy to the Supreme Court and adopting a merely symbolic position, they could sit all the more safely in their safe seats.
Since then, one crucial question of public policy after another has been punted out of the realm of politics and into the judicial realm. Issues that might have been debated with all the rhetorical agility of a Lincoln and a Douglas, and then subjected to a process of negotiation, compromise, and voting, have instead been settled by decree: e.g., Chevron, Kelo, Obergefell. The consequences for speech have been pernicious. Since the time of Pericles, deliberative democracy has been predicated on the art of persuasion, which demands the forceful clarity of thought and expression without which no one has ever been persuaded. But a legislature that cedes its authority to judges and regulators will awaken to discover its oratorical culture has been stunted. When politicians, rather than seeking to convince and win over, prefer to project a studied and pleasant vagueness, debate withers into tedious defensive performance. It has been decades since any presidential debate has seen sustained give-and-take over a matter of policy. If there is any suspense at all, it is only the possibility that a fatigued or peeved candidate might blurt out that tactless shard of truth known as a gaffe.
A generation accustomed to hearing platitudes smoothly dispensed from behind a teleprompter will find the speech of a fearless extemporaneous speaker to be startling, even disquieting; unfamiliar ideas always are. Unhappily, they have been taught to interpret that disquiet as an injury done to them, rather than as a premise offered to them to consider. All this would not have happened—certainly not to this extent—had not our deliberative democracy decided a generation ago that it preferred the security of incumbency to the risks of unshackled debate. The compulsory contraction of free speech on college campuses is but the logical extension of the voluntary contraction of free speech in our political culture.
Michael J. Lewis’s new book is City of Refuge: Separatists and Utopian Town Planning (Princeton University Press).
Heather Mac Donald
The answer to the symposium question depends on how powerful the transmission belt is between academia and the rest of the country. On college campuses, violence and brute force are silencing speakers who challenge left-wing campus orthodoxies. These totalitarian outbreaks have been met with listless denunciations by college presidents, followed by . . . virtually nothing. As of mid-May, the only discipline imposed for 2017’s mass attacks on free speech at UC Berkeley, Middlebury, and Claremont McKenna College was a letter of reprimand inserted—sometimes only temporarily—into the files of several dozen Middlebury students, accompanied by a brief period of probation. Previous outbreaks of narcissistic incivility, such as the screaming-girl fit at Yale and the assaults on attendees of Yale’s Buckley program, were discreetly ignored by college administrators.
Meanwhile, the professoriate unapologetically defends censorship and violence. After the February 1 riot in Berkeley to prevent Milo Yiannopoulos from speaking, Déborah Blocker, associate professor of French at UC Berkeley, praised the rioters. They were “very well-organized and very efficient,” Blocker reported admiringly to her fellow professors. “They attacked property but they attacked it very sparingly, destroying just enough University property to obtain the cancellation order for the MY event and making sure no one in the crowd got hurt” (emphasis in original). (In fact, perceived Milo and Donald Trump supporters were sucker-punched and maced; businesses downtown were torched and vandalized.) New York University’s vice provost for faculty, arts, humanities, and diversity, Ulrich Baer, displayed Orwellian logic by claiming in a New York Times op-ed that shutting down speech “should be understood as an attempt to ensure the conditions of free speech for a greater group of people.”
Will non-academic institutions take up this zeal for outright censorship? Other ideological products of the left-wing academy have been fully absorbed and operationalized. Racial victimology, which drives much of the campus censorship, is now standard in government and business. Corporate diversity trainers counsel that bias is responsible for any lack of proportional racial representation in the corporate ranks. Racial disparities in school discipline and incarceration are universally attributed to racism rather than to behavior. Public figures have lost jobs for violating politically correct taboos.
Yet Americans possess an instinctive commitment to the First Amendment. Federal judges, hardly an extension of the Federalist Society, have overwhelmingly struck down campus speech codes. It is hard to imagine that they would be any more tolerant of the hate-speech legislation so prevalent in Europe. So the question becomes: At what point does the pressure to conform to the elite worldview curtail freedom of thought and expression, even without explicit bans on speech?
Social stigma against conservative viewpoints is not the same as actual censorship. But the line can blur. The Obama administration used regulatory power to impose a behavioral conformity on public and private entities. School administrators may have technically still possessed the right to dissent from novel theories of gender, but they had to behave as if they were fully on board with the transgender revolution when it came to allowing boys to use girls’ bathrooms and locker rooms.
Had Hillary Clinton been elected president, the federal bureaucracy would have mimicked campus diversocrats with even greater zeal. That threat, at least, has been avoided. Heresies against left-wing dogma may still enter the public arena, if only by the back door. The mainstream media have lurched even further left in the Trump era, but the conservative media, however mocked and marginalized, are expanding (though Twitter and Facebook’s censorship of conservative speakers could be a harbinger of more official silencing).
Outside the academy, free speech is still legally protected, but its exercise requires ever greater determination.
Heather Mac Donald is a fellow at the Manhattan Institute and the author of The War on Cops.
John McWhorter

There is a certain mendacity, as Brick put it in Cat on a Hot Tin Roof, in our discussion of free speech on college campuses. Namely, none of us genuinely wish that absolutely all issues be aired in the name of education and open-mindedness. To insist so is to pretend that civilized humanity makes nothing we could call advancement in philosophical consensus.
I doubt we need “free speech” on issues such as whether slavery and genocide are okay, whether it has been a mistake to view women as men’s equals, or whether to banish as antique the idea that whites are a master race while other peoples represent a lower rung on the Darwinian scale. With all due reverence for John Stuart Mill’s advocacy of the regular airing of even noxious views in order to reinforce clarity on why they were rejected, we are also human beings with limited time. A commitment to the Enlightenment justifiably will decree that certain views are, indeed, no longer in need of discussion.
However, our modern social-justice warriors are claiming that this no-fly zone of discussion is vaster than any conception of logic or morality justifies. We are being told that questions regarding the modern proposals about cultural appropriation, about whether even passing infelicitous statements constitute racism in the way that formalized segregation and racist disparagement did, or about whether social disparities can be due to cultural legacies rather than structural impediments, are as indisputably egregious, backwards, and abusive as the benighted views of the increasingly distant past.
That is, the new idea is not only that discrimination and inequality still exist, but that even to question the left’s utopian expectations on such matters justifies the same furious, sloganistic, and even physically violent resistance that was once leveled against those designated heretics by a Christian hegemony.
Of course the protesters in question do not recognize themselves in a portrait as opponents of something called heresy. They suppose that Galileo’s opponents were clearly wrong but that they, today, are actually correct in a way that no intellectual or moral argument could coherently deny.
As such, we have students allowed to decree college campuses “racist” when they are the least racist spaces on the planet—because they are, predictably given the imperfection of humans, not perfectly free of passingly unsavory interactions. Thinkers from the right rather than the left, invited to talk for a portion of an hour, have dinner with a few people, and fly home, are treated as if they were reanimated Hitlers. The student of color who hears a few white students venturing polite questions about the leftist orthodoxy is supported in casting these questions as “racist” rhetoric.
The people on college campuses who openly and aggressively spout this new version of Christian (or even Islamist) crusading—ironically justifying it as a barricade against “fascist” muzzling of freedom when the term applies ominously well to the regime they are fostering—are a minority. However, the spinning sawmill blade of their rhetoric has succeeded in rendering opposition as risky as espousing pedophilia, such that only those natively open to violent criticism dare speak out. The latter group is small. The campus consensus thereby becomes, if only at moralistic gunpoint à la the ISIS victim video, a strangled hard-leftism.
Hence freedom of speech is indeed threatened on today’s college campuses. I have lost count of how many of my students, despite being liberal Democrats (many of whom sobbed at Hillary Clinton’s loss last November), have told me that they are afraid to express their opinions about issues that matter, despite the fact that their opinions are ones that any liberal or even leftist person circa 1960 would have considered perfectly acceptable.
Something has shifted of late, and not in a direction we can legitimately consider forwards.
John McWhorter teaches linguistics, philosophy, and music history at Columbia University and is the author of The Language Hoax, Words on the Move, and Talking Back, Talking Black.
Kate Bachelder Odell

It’s 2017, and Harvard Square has devolved into riots: Some 120 people are injured in protests, and the carnage includes fire-consumed cop cars and smashed-in windows. The police discharge canisters of tear gas and, after apprehending dozens of protesters, enforce a 1:45 A.M. curfew. Anyone roaming the streets after hours is subject to arrest. About 2,000 National Guardsmen are prepared to intervene. Such violence and disorder are also roiling Berkeley and other elite and educated areas.
Oh, that’s 1970. The details are from the Harvard Crimson’s account of “anti-war” riots that spring. The episode is instructive in considering whether free speech is under threat in the United States. Almost daily, there’s a new YouTube installment of students melting down over viewpoints of speakers invited to one campus or another. Even amid speech threats from government—for example, the IRS’s targeting of political opponents—nothing has captured the public’s attention like the end of free expression at America’s institutions of higher learning.
Yet disruption, confusion, and even violence are not new campus phenomena. And it’s hard to imagine that young adults who deployed brute force in the 1960s and ’70s were deeply committed to the open and peaceful exchange of ideas.
There may also be reason for optimism. The rough and tumble on campus in the 1960s and ’70s produced a more even-tempered ’80s and ’90s, and colleges are probably heading for another course correction. In covering the ruckuses at Yale, Missouri, and elsewhere, I’ve talked to professors and students who are figuring out how to respond to the illiberalism, even if the reaction is delayed. The University of Chicago put out a set of free-speech principles last year, and other schools such as Princeton and Purdue have endorsed them.
The NARPs—Non-Athletic Regular People, as they are sometimes known on campus—still outnumber the social-justice warriors, who appear to be overplaying their hand. Case in point is the University of Missouri, which experienced a precipitous drop in enrollment after instructor Melissa Click and her ilk stoked racial tensions last spring. The college has closed dorms and trimmed budgets. Which brings us to another silver lining: The economic model of higher education (exorbitant tuition to pay ever more administrators) may blow up traditional college before the fascists can.
Note also that the anti-speech movement is run by rich kids. A Brookings Institution analysis from earlier this year discovered that “the average enrollee at a college where students have attempted to restrict free speech comes from a family with an annual income $32,000 higher than that of the average student in America.” Few rank higher in average income than those at Middlebury College, where students evicted scholar Charles Murray in a particularly ugly scene. (The report notes that Murray was received respectfully at Saint Louis University, “where the median income of students’ families is half Middlebury’s.”) The impulses of over-adulated 20-year-olds may soon be tempered by the tyranny of having to show up for work on a daily basis.
None of this is to suggest that free speech is enjoying some renaissance either on campus or in America. But perhaps as the late Wall Street Journal editorial-page editor Robert Bartley put it in his valedictory address: “Things could be worse. Indeed, they have been worse.”
Kate Bachelder Odell is an editorial writer for the Wall Street Journal.
Jonathan Rauch

Is free speech under threat? The one-syllable answer is “yes.” The three-syllable answer is: “Yes, of course.” Free speech is always under threat, because it is not only the single most successful social idea in all of human history, it is also the single most counterintuitive. “You mean to say that speech that is offensive, untruthful, malicious, seditious, antisocial, blasphemous, heretical, misguided, or all of the above deserves government protection?” That seemingly bizarre proposition is defensible only on the grounds that the marketplace of ideas turns out to be the most powerful engine of knowledge, prosperity, liberty, social peace, and moral advancement that our species has had the good fortune to discover.
Every new generation of free-speech advocates will need to get up every morning and re-explain the case for free speech and open inquiry—today, tomorrow, and forever. That is our lot in life, and we just need to be cheerful about it. At discouraging moments, it is helpful to remember that the country has made great strides toward free speech since 1798, when the Adams administration arrested and jailed its political critics; and since the 1920s, when the U.S. government banned and burned James Joyce’s great novel Ulysses; and since 1954, when the government banned ONE, a pioneering gay journal. (The cover article was a critique of the government’s indecency censors, who censored it.) None of those things could happen today.
I suppose, then, the interesting question is: What kind of threat is free speech under today? In the present age, direct censorship by government bodies is rare. Instead, two more subtle challenges hold sway, especially, although not only, on college campuses. The first is a version of what I called, in my book Kindly Inquisitors, the humanitarian challenge: the idea that speech that is hateful or hurtful (in someone’s estimation) causes pain and thus violates others’ rights, much as physical violence does. The other is a version of what I called the egalitarian challenge: the idea that speech that denigrates minorities (again, in someone’s estimation) perpetuates social inequality and oppression and thus also is a rights violation. Both arguments call upon administrators and other bureaucrats to defend human rights by regulating speech rights.
Both doctrines are flawed to the core. Censorship harms minorities by enforcing conformity and entrenching majority power, and it no more ameliorates hatred and injustice than smashing thermometers ameliorates global warming. If unwelcome words are the equivalent of bludgeons or bullets, then the free exchange of criticism—science, in other words—is a crime. I could go on, but suffice it to say that the current challenges are new variations on ancient themes—and they will be followed, in decades and centuries to come, by many, many other variations. Memo to free-speech advocates: Our work is never done, but the really amazing thing, given the proposition we are tasked to defend, is how well we are doing.
Jonathan Rauch is a senior fellow at the Brookings Institution and the author of Kindly Inquisitors: The New Attacks on Free Thought.
Nicholas Quinn Rosenkranz

Speech is under threat on American campuses as never before. Censorship in various forms is on the rise. And this year, the threat to free speech on campus took an even darker turn, toward actual violence. The prospect of Milo Yiannopoulos speaking at Berkeley provoked riots that caused more than $100,000 worth of property damage on the campus. The prospect of Charles Murray speaking at Middlebury led to a riot that put a liberal professor in the hospital with a concussion. Ann Coulter’s speech at Berkeley was cancelled after the university determined that none of the appropriate venues could be protected from “known security threats” on the date in question.
The free-speech crisis on campus is caused, at least in part, by a more insidious campus pathology: the almost complete lack of intellectual diversity on elite university faculties. At Yale, for example, the number of registered Republicans in the economics department is zero; in the psychology department, there is one. Overall, there are 4,410 faculty members at Yale, and the total number of those who donated to a Republican candidate during the 2016 primaries was three.
So when today’s students purport to feel “unsafe” at the mere prospect of a conservative speaker on campus, it may be easy to mock them as “delicate snowflakes,” but in one sense, their reaction is understandable: If students are shocked at the prospect of a Republican behind a university podium, perhaps it is because many of them have never before laid eyes on one.
To see the connection between free speech and intellectual diversity, consider the recent commencement speech of Harvard President Drew Gilpin Faust:
Universities must be places open to the kind of debate that can change ideas. . . . Silencing ideas or basking in intellectual orthodoxy independent of facts and evidence impedes our access to new and better ideas, and it inhibits a full and considered rejection of bad ones. . . . We must work to ensure that universities do not become bubbles isolated from the concerns and discourse of the society that surrounds them. Universities must model a commitment to the notion that truth cannot simply be claimed, but must be established—established through reasoned argument, assessment, and even sometimes uncomfortable challenges that provide the foundation for truth.
Faust is exactly right. But, alas, her commencement audience might be forgiven a certain skepticism. After all, the number of registered Republicans in several departments at Harvard—e.g., history and psychology—is exactly zero. In those departments, the professors themselves may be “basking in intellectual orthodoxy” without ever facing “uncomfortable challenges.” This may help explain why some students will do everything in their power to keep conservative speakers off campus: They notice that faculty hiring committees seem to do exactly the same thing.
In short, it is a promising sign that true liberal academics like Faust have started speaking eloquently about the crucial importance of civil, reasoned disagreement. But they will be more convincing on this point when they hire a few colleagues with whom they actually disagree.
Nicholas Quinn Rosenkranz is a professor of law at Georgetown. He serves on the executive committee of Heterodox Academy, which he co-founded, on the board of directors of the Federalist Society, and on the board of directors of the Foundation for Individual Rights in Education (FIRE).
Ben Shapiro

In February, I spoke at California State University, Los Angeles. Before my arrival, professors informed students that a white supremacist would be descending on the school to preach hate; threats of violence soon prompted the administration to cancel the event. I vowed to show up anyway. One hour before the event, the administration backed down and promised to guarantee that the event could go forward, but police officers were told not to stop the 300 students, faculty, and outside protesters who blocked and assaulted those who attempted to attend the lecture. We ended up trapped in the auditorium, with the authorities telling students not to leave for fear of physical violence. I was rushed from campus under armed police guard.
Is free speech under assault?
Of course it is.
On campus, free speech is under assault thanks to a perverse ideology of intersectionality that claims victim identity is of primary value and that views are merely a secondary concern. As a corollary, if your views offend someone who outranks you on the intersectional hierarchy, your views are treated as violence—threats to identity itself. On campus, statements that offend an individual’s identity have been treated as “microaggressions”: actual aggressions against another, ostensibly worthy of violence. Words, students have been told, may not break bones, but they will prompt sticks and stones, and rightly so.
Thus, protesters around the country—leftists who see verbiage as violence—have, in turn, used violence in response to ideas they hate. Leftist local authorities then use the threat of violence as an excuse to ideologically discriminate against conservatives. This means public intellectuals like Charles Murray being run off campus and his leftist professorial cohort viciously assaulted; it means Ann Coulter being targeted for violence at Berkeley; it means universities preemptively banning me and Ayaan Hirsi Ali and Condoleezza Rice and even Jason Riley.
The campus attacks on free speech are merely the most extreme iteration of an ideology that spans from left to right: the notion that your right to free speech ends where my feelings begin. Even Democrats who say that Ann Coulter should be allowed to speak at Berkeley say that nobody should be allowed to contribute to a super PAC (unless you’re a union member, naturally).
Meanwhile, on the right, the president’s attacks on the press have convinced many Republicans that restrictions on the press wouldn’t be altogether bad. A Vanity Fair/60 Minutes poll in late April found that 36 percent of Americans thought freedom of the press “does more harm than good.” Undoubtedly, some of that is due to the media’s obvious bias. CNN’s Jeff Zucker has targeted the Trump administration for supposedly quashing journalism, but he was silent when the Obama administration’s Department of Justice cracked down on reporters from the Associated Press and Fox News, and when hacks like Deputy National Security Adviser Ben Rhodes openly sold lies regarding Iran. But for some on the right, the response to press falsities hasn’t been to call for truth, but to instead echo Trumpian falsehoods in the hopes of damaging the media. Free speech is only important when people seek the truth. Leftists traded truth for tribalism long ago; in response, many on the right seem willing to do the same. Until we return to a common standard under which facts matter, free speech will continue to rest on tenuous grounds.
Ben Shapiro is the editor in chief of The Daily Wire and the host of The Ben Shapiro Show.
Judith Shulevitz

It’s tempting to blame college and university administrators for the decline of free speech in America, and for years I did just that. If the guardians of higher education won’t inculcate the habits of mind required for serious thinking, I thought, who will? The unfettered but civil exchange of ideas is the basic operation of education, just as addition is the basic operation of arithmetic. And universities have to teach both the unfettered part and the civil part, because arguing in a respectful manner isn’t something anyone does instinctively.
So why change my mind now? Schools still cling to speech codes, and there still aren’t enough deans like the one at the University of Chicago who declared his school a safe-space-free zone. My alma mater just handed out prizes for “enhancing race and/or ethnic relations” to two students caught on video harassing the dean of their residential college, one screaming at him that he’d created “a space for violence to happen,” the other placing his face inches away from the dean’s and demanding, “Look at me.” All this because they deemed a thoughtful if ill-timed letter about Halloween costumes written by the dean’s wife to be an act of racist aggression. Yale should discipline students who behave like that, even if they’re right on the merits (I don’t think they were, but that’s not the point). They certainly don’t deserve awards. I can’t believe I had to write that sentence.
But in abdicating their responsibilities, the universities have enabled something even worse than an attack on free speech. They’ve unleashed an assault on themselves. There’s plenty of free speech around; we know that because so much bad speech—low-minded nonsense—tests our constitutional tolerance daily, and that’s holding up pretty well. (As Nicholas Lemann observes elsewhere in this symposium, Facebook and Google represent bigger threats to free speech than students and administrators.) What’s endangered is good speech.
Universities were setting themselves up to be used. Provocateurs exploit the atmosphere on campus to goad overwrought students, then gleefully trash the most important bastion of our crumbling civil society. Higher education and everything it stands for—logical argument, the scientific method, epistemological rigor—start to look illegitimate. Voters perceive tenure and research and higher education itself as hopelessly partisan and unworthy of taxpayers’ money.
The press is a secondary victim of this process of delegitimization. If serious inquiry can be waved off as ideology, then facts won’t be facts and reporting can’t be trusted. All journalism will be equal to all other journalism, and all journalists will be reduced to pests you can slam to the ground with near impunity. Politicians will be able to say anything and do just about anything and there will be no countervailing authority to challenge them. I’m pretty sure that that way lies Putinism and Erdoganism. And when we get to that point, I’m going to start worrying about free speech again.
Judith Shulevitz is a critic in New York.
Harvey Silverglate

Free speech is, and has always been, threatened. The title of Nat Hentoff’s 1993 book Free Speech for Me—But Not for Thee is no less true today than at any time, even as the Supreme Court has accorded free speech a more absolute degree of protection than in any previous era.
Since the 1980s, the high court has decided most major free-speech cases in favor of speech, with most of the major decisions being unanimous or nearly so.
Women’s-rights advocates were turned back by the high court in 1986 when they sought to ban the sale of printed materials that, because deemed pornographic by some, were alleged to promote violence against women. Censorship in the name of gender-based protection thus failed to gain traction.
Despite the demands of civil-rights activists, the Supreme Court in 1992 declared cross-burning to be a protected form of expression in R.A.V. v. City of St. Paul, a decision later refined to strengthen a narrow exception for when cross-burning occurs primarily as a physical threat rather than merely an expression of hatred.
Other attempts at First Amendment circumvention have been met with equally decisive rebuff. When the Reverend Jerry Falwell sued Hustler magazine publisher Larry Flynt for defamation growing out of a parody depicting Falwell’s first sexual encounter as a drunken tryst with his mother in an outhouse, a unanimous Supreme Court lectured on the history of parody as a constitutionally protected, even if cruel, form of social and political criticism.
When the South Boston Allied War Veterans, sponsor of Boston’s Saint Patrick’s Day parade, sought to exclude a gay veterans’ group from marching under its own banner, the high court unanimously held that as a private entity, even though marching in public streets, the Veterans could exclude any group marching under a banner conflicting with the parade’s socially conservative message, notwithstanding public-accommodations laws. The gay group could have its own parade but could not rain on that of the conservatives.
Despite such legal clarity, today’s most potent attacks on speech are coming, ironically, from liberal-arts colleges. Ubiquitous “speech codes” limit speech that might insult, embarrass, or “harass,” in particular, members of “historically disadvantaged” groups. “Safe spaces” and “trigger warnings” protect purportedly vulnerable students from hearing words and ideas they might find upsetting. Student demonstrators and threats of violence have forced the cancellation of controversial speakers, left and right.
It remains unclear how much campus censorship results from politically correct faculty, control-obsessed student-life administrators, or students socialized and indoctrinated into intolerance. My experience suggests that the bureaucrats are primarily, although not entirely, to blame. When sued, colleges either lose or settle, pay a modest amount, and then return to their censorious ways.
This trend threatens the heart and soul of liberal education. Eventually it could infect the entire society as these students graduate and assume influential positions. Whether a resulting flood of censorship ultimately overcomes legal protections and weakens democracy remains to be seen.
Harvey Silverglate, a Boston-based lawyer and writer, is the co-author of The Shadow University: The Betrayal of Liberty on America’s Campuses (Free Press, 1998). He co-founded the Foundation for Individual Rights in Education in 1999 and is on FIRE’s board of directors. He spent some three decades on the board of the ACLU of Massachusetts, two of those years as chairman. Silverglate taught at Harvard Law School for a semester during a sabbatical he took in the mid-1980s.
Christina Hoff Sommers

When Heather Mac Donald’s “blue lives matter” talk was shut down by a mob at Claremont McKenna College, the president of neighboring Pomona College sent out an email defending free speech. Twenty-five students shot back a response: “Heather Mac Donald is a fascist, a white supremacist . . . classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live.”
Some blame the new campus intolerance on hypersensitive, over-trophied millennials. But the students who signed that letter don’t appear to be fragile. Nor do those who recently shut down lectures at Berkeley, Middlebury, DePaul, and Cal State LA. What they are is impassioned. And their passion is driven by a theory known as intersectionality.
Intersectionality is the source of the new preoccupation with microaggressions, cultural appropriation, and privilege-checking. It’s the reason more than 200 colleges and universities have set up Bias Response Teams. Students who overhear potentially “otherizing” comments or jokes are encouraged to make anonymous reports to their campus BRTs. A growing number of professors and administrators have built their careers around intersectionality. What is it exactly?
Intersectionality is a neo-Marxist doctrine that views racism, sexism, ableism, heterosexism, and all forms of “oppression” as interconnected and mutually reinforcing. Together these “isms” form a complex arrangement of advantages and burdens. A white woman is disadvantaged by her gender but advantaged by her race. A Latino is burdened by his ethnicity but privileged by his gender. According to intersectionality, American society is a “matrix of domination,” with affluent white males in control. Not only do they enjoy most of the advantages, they also determine what counts as “truth” and “knowledge.”
But marginalized identities are not without resources. According to one of intersectionality’s leading theorists, Patricia Hill Collins (former president of the American Sociological Association), disadvantaged groups have access to deeper, more liberating truths. To find their voice, and to enlighten others to the true nature of reality, they require a safe space—free of microaggressive put-downs and imperious cultural appropriations. Here they may speak openly about their “lived experience.” Lived experience, according to intersectional theory, is a better guide to the truth than self-serving Western and masculine styles of thinking. So don’t try to refute intersectionality with logic or evidence: That only proves that you are part of the problem it seeks to overcome.
How could comfortably ensconced college students be open to a convoluted theory that describes their world as a matrix of misery? Don’t they flinch when they hear intersectional scholars like bell hooks refer to the U.S. as an “imperialist, white-supremacist, capitalist patriarchy”? Most take it in stride because such views are now commonplace in high-school history and social studies texts. And the idea that knowledge comes from lived experience rather than painstaking study and argument is catnip to many undergrads.
Silencing speech and forbidding debate is not an unfortunate by-product of intersectionality—it is a primary goal. How else do you dismantle a lethal system of oppression? As the protesting students at Claremont McKenna explained in their letter: “Free speech . . . has given those who seek to perpetuate systems of domination a platform to project their bigotry.” To the student activists, thinkers like Heather Mac Donald and Charles Murray are agents of the dominant narrative, and their speech is “a form of violence.”
It is hard to know how our institutions of higher learning will find their way back to academic freedom, open inquiry, and mutual understanding. But as long as intersectional theory goes unchallenged, campus fanaticism will intensify.
Christina Hoff Sommers is a resident scholar at the American Enterprise Institute. She is the author of several books, including Who Stole Feminism? and The War Against Boys. She also hosts The Factual Feminist, a video blog. @Chsommers
John Stossel

Yes, some college students do insane things. Some called police when they saw “Trump 2016” chalked on sidewalks. The vandals at Berkeley and the thugs who assaulted Charles Murray are disgusting. But they are a minority. And these days people fight back.
Someone usually videotapes the craziness. Yale’s “Halloween costume incident” drove away two sensible instructors, but videos mocking Yale’s snowflakes, like “Silence U,” make such abuse less likely. Groups like Young America’s Foundation (YAF) publicize censorship, and the Foundation for Individual Rights in Education (FIRE) sues schools that restrict speech.
Consciousness has been raised. On campus, the worst is over. Free speech has always been fragile. I once took cameras to Seton Hall Law School right after a professor gave a lecture on free speech. Students seemed to get the concept. Sean, now a lawyer, said, “Protect freedom for thought we hate; otherwise you never have a society where ideas clash, and we come up with the best idea.” So I asked, “Should there be any limits?” Students listed “fighting words,” “shouting fire in a theater,” malicious libel, etc.—reasonable court-approved exceptions. But then they went further. Several wanted bans on “hate” speech. “No value comes out of hate speech,” said Javier. “It inevitably leads to violence.”
No, it doesn’t, I argued. “Also, doesn’t hate speech bring ideas into the open, so you can better argue about them, bringing you to the truth?”
“No,” replied Floyd. “With hate speech, more speech is just violence.”
So I pulled out a big copy of the First Amendment and wrote, “exception: hate speech.”
Two students wanted a ban on flag desecration “to respect those who died to protect it.”
One wanted bans on blasphemy:
“Look at the gravity of the harm versus the value in blasphemy—the harm outweighs the value.”
Several wanted a ban on political speech by corporations because of “the potential for large corporations to improperly influence politicians.”
Finally, Jillian, also now a lawyer, wanted hunting videos banned.
“It encourages harm down the road.”
I asked her, incredulously, “You’re comfortable locking up people who make a hunting film?”
“Oh, yeah,” she said. “It’s unnecessary cruelty to feeling and sentient beings.”
So, I picked up my copy of the Bill of Rights again. After “no law . . . abridging freedom of speech,” I added: “Except hate speech, flag burning, blasphemy, corporate political speech, depictions of hunting . . . ”
That embarrassed them. “We may have gone too far,” said Sean. Others agreed. One said, “Cross out the exceptions.” Free speech survived, but it was a close call. Respect for unpleasant speech will always be thin. Then-Senator Hillary Clinton wanted violent video games banned. John McCain and Russ Feingold tried to ban political speech. Donald Trump wants new libel laws, and if you burn a flag, he tweeted, consequences might be “loss of citizenship or a year in jail!” Courts or popular opinion killed those bad ideas.
Free speech will survive, assuming those of us who appreciate it use it to fight those who would smother it.
John Stossel is a FOX News/FOX Business Network Contributor.
Warren Treadgold
Even citizens of dictatorships are free to praise the regime and to talk about the weather. The only speech likely to be threatened anywhere is the sort that offends an important and intolerant group. What is new in America today is a leftist ideology that threatens speech precisely because it offends certain important and intolerant groups: feminists and supposedly oppressed minorities.
So far this new ideology is clearly dominant only in colleges and universities, where it has become so strong that most controversies concern outside speakers invited by students, not faculty speakers or speakers invited by administrators. Most academic administrators and professors are either leftists or have learned not to oppose leftism; otherwise they would probably never have been hired. Administrators treat even violent leftist protestors with respect and are ready to prevent conservative and moderate outsiders from speaking rather than provoke protests. Most professors who defend conservative or moderate speakers argue that the speakers’ views are indeed noxious but say that students should be exposed to them to learn how to refute them. This is very different from encouraging a free exchange of ideas.
Although the new ideology began on campuses in the ’60s, it gained authority outside them largely by means of several majority decisions of the Supreme Court, from Roe (1973) to Obergefell (2015). The Supreme Court decisions that endanger free speech are based on a presumed consensus of enlightened opinion that certain rights favored by activists have the same legitimacy as rights explicitly guaranteed by the Constitution—or even more legitimacy, because the rights favored by activists are assumed to be so fundamental that they need no grounding in specific constitutional language. The Court majorities found restricting abortion rights or homosexual marriage, as large numbers of Americans wish to do, to be constitutionally equivalent to restricting black voting rights or interracial marriage. Any denial of such equivalence therefore opposes fundamental constitutional rights and can be considered hate speech, advocating psychological and possibly physical harm to groups like women seeking abortions or homosexuals seeking approval. Such speech may still be constitutionally protected, but acting upon it is not.
This ideology of forbidding allegedly offensive speech has spread to most of the Democratic Party and the progressive movement. Rather than seeing themselves as taking one side in a free debate, progressives increasingly argue (for example) that opposing abortion is offensive to women and supporting the police is offensive to blacks. Some politicians object so strongly to such speech that despite their interest in winning votes, they attack voters who disagree with them as racists or sexists. Expressing views that allegedly discriminate against women, blacks, homosexuals, and various other minorities can now be grounds for a lawsuit.
Speech that supposedly offends women or minorities has already cost some people their careers, their businesses, and their opportunities to deliver or hear speeches. Such intimidation is the intended result of an ideology that threatens free speech.
Warren Treadgold is a professor of history at Saint Louis University.
Matt Welch
Like a sullen zoo elephant rocking back and forth from leg to leg, there is an oversized paradox we’d prefer not to see standing smack in the sightlines of most of our policy debates. Day by day, even minute by minute, America simultaneously gets less free in the laboratory, but more free in the field. Individuals are constantly expanding the limits and applications of their own autonomy, even as government transcends prior restraints on how far it can reach into our intimate business.
So it is that the Internal Revenue Service can charge foreign banks with collecting taxes on U.S. citizens (thereby causing global financial institutions to shun many of the estimated 6 million-plus Americans who live abroad), even while blockchain virtuosos make illegal transactions wholly undetectable to authorities. It has never been easier for Americans to travel abroad, and it’s never been harder to enter the U.S. without showing passports, fingerprints, retinal scans, and even social-media passwords.
What’s true for banking and tourism is doubly true for free speech. Social media has given everyone not just a platform but a megaphone (as unreadable as our Facebook timelines have all become since last November). At the same time, the federal government during this unhappy 21st century has continuously ratcheted up prosecutorial pressure against leakers, whistleblowers, investigative reporters, and technology companies.
A hopeful bulwark against government encroachment unique to the free-speech field is the Supreme Court’s very strong First Amendment jurisprudence in the past decade or two. Donald Trump, like Hillary Clinton before him, may prattle on about locking up flag-burners, but Antonin Scalia and the rest of SCOTUS protected such expression back in 1990. Barack Obama and John McCain (and Hillary Clinton—she’s as bad as any recent national politician on free speech) may lament the Citizens United decision, but it’s now firmly legal to broadcast unfriendly documentaries about politicians without fear of punishment, no matter the electoral calendar.
But in this very strength lies what might be the First Amendment’s most worrying vulnerability. Barry Friedman, in his 2009 book The Will of the People, made the persuasive argument that the Supreme Court typically ratifies, post facto, where public opinion has already shifted. Today’s culture of free speech could be tomorrow’s legal framework. If so, we’re in trouble.
For evidence of free-speech slippage, just read around you. When both major-party presidential nominees react to terrorist attacks by calling to shut down corners of the Internet, and when their respective supporters are actually debating the propriety of sucker-punching protesters they disagree with, it’s hard to escape the conclusion that our increasingly shrill partisan sorting is turning the very foundation of post-1800 global prosperity into just another club to be swung in our national street fight.
In the eternal cat-and-mouse game between private initiative and government control, the former is always advantaged by the latter’s fundamental incompetence. But what if the public willingly hands government the power to muzzle? It may take a counter-cultural reformation to protect this most noble of American experiments.
Matt Welch is the editor at large of Reason.
Adam J. White
Free speech is indeed under threat on our university campuses, but the threat did not begin there and it will not end there. Rather, the campus free-speech crisis is a particularly visible symptom of a much more fundamental crisis in American culture.
The problem is not that some students, teachers, and administrators reject traditional American values and institutions, or even that they are willing to menace or censor others who defend those values and institutions. Such critics have always existed, and they can be expected to use the tools and weapons at their disposal. The problem is that our country seems to produce too few students, teachers, and administrators who are willing or able to respond to them.
American families produce children who arrive on campus unprepared for, or uninterested in, defending our values and institutions. For our students who are focused primarily on their career prospects (if on anything at all), “[c]ollege is just one step on the continual stairway of advancement,” as David Brooks observed 16 years ago. “They’re not trying to buck the system; they’re trying to climb it, and they are streamlined for ascent. Hence they are not a disputatious group.”
Meanwhile, parents bear incomprehensible financial burdens to get their kids through college, without a clear sense of precisely what their kids will get out of these institutions in terms of character formation or civic virtue. With so much money at stake, few can afford for their kids to pursue more than career prospects.
Those problems are not created on campus, but they are exacerbated there, as too few college professors and administrators see their institutions as cultivators of American culture and republicanism. Confronted with activists’ rage, they offer no competing vision of higher education—let alone a compelling one.
Ironically, we might borrow a solution from the Left. Where progressives would leverage state power in service of their health-care agenda, we could do the same for education. State legislatures and governors, recognizing the present crisis, should begin to reform and renegotiate the fundamental nature of state universities. By making state universities more affordable, more productive, and more reflective of mainstream American values, they will attract students—and create incentives for competing private universities to follow suit.
Let’s hope they do it soon, for what’s at stake is much more than just free speech on campus, or even free speech writ large. In our time, as in Tocqueville’s, “the instruction of the people powerfully contributes to the support of a democratic republic,” especially “where instruction which awakens the understanding is not separated from moral education which amends the heart.” We need our colleges to cultivate—not cut down—civic virtue and our capacity for self-government. “Republican government presupposes the existence of these qualities in a higher degree than any other form,” Madison wrote in Federalist 55. If “there is not sufficient virtue among men for self-government,” then “nothing less than the chains of despotism” can restrain us “from destroying and devouring one another.”
Adam J. White is a research fellow at the Hoover Institution.
Cathy Young
A writer gets expelled from the World Science Fiction Convention for criticizing the sci-fi community’s preoccupation with racial and gender “inclusivity” while moderating a panel. An assault on free speech, or an exercise of free association? How about when students demand the disinvitation of a speaker—or disrupt the speech? When a critic of feminism gets banned from a social-media platform for unspecified “abuse”?
Such questions are at the heart of many recent free-speech controversies. There is no censorship by government; but how concerned should we be when private actors effectively suppress unpopular speech? Even in the freest society, some speech will—and should—be considered odious and banished to unsavory fringes. No one weeps for ostracized Holocaust deniers or pedophilia apologists.
But shunned speech needs to remain a narrow exception—or acceptable speech will inexorably shrink. As current Federal Communications Commission chairman Ajit Pai cautioned last year, First Amendment protections will be hollowed out unless undergirded by cultural values that support a free marketplace of ideas.
Sometimes, attacks on speech come from the right. In 2003, an Iraq War critic, reporter Chris Hedges, was silenced at Rockford College in Illinois by hecklers who unplugged the microphone and rushed the stage; some conservative pundits defended this as robust protest. Yet the current climate on the left—in universities, on social media, in “progressive” journalism, in intellectual circles—is particularly hostile to free expression. The identity-politics left, fixated on subtle oppressions embedded in everyday attitudes and language, sees speech-policing as the solution.
Is hostility to free-speech values on the rise? New York magazine columnist Jesse Singal argues that support for restrictions on public speech offensive to minorities has remained steady, and fairly high, since the 1970s. Perhaps. But the range of what qualifies as offensive—and which groups are to be shielded—has expanded dramatically. In our time, a leading liberal magazine, the New Republic, can defend calls to destroy a painting of lynching victim Emmett Till because the artist is white and guilty of “cultural appropriation,” and a feminist academic journal can be bullied into apologizing for an article on transgender issues that dares to mention “male genitalia.”
There is also a distinct trend of “bad” speech being squelched by coercion, not just disapproval. That includes the incidents at Middlebury College in Vermont and at Claremont McKenna in California, where mobs not only prevented conservative speakers—Charles Murray and Heather Mac Donald—from addressing audiences but physically threatened them as well. It also includes the use of civil-rights legislation to enforce goodthink in the workplace: Businesses may face stiff fines if they don’t force employees to call a “non-binary” co-worker by the singular “they,” even when talking among themselves.
These trends make a mockery of liberalism and enable the kind of backlash we have seen with Donald Trump’s election. But the backlash can bring its own brand of authoritarianism. It’s time to start rebuilding the culture of free speech across political divisions—a project that demands, above all, genuine openness and intellectual consistency. Otherwise it will remain, as the late, great Nat Hentoff put it, a call for “free speech for me, but not for thee.”
Cathy Young is a contributing editor at Reason.
Robert J. Zimmer
Free speech is not a natural feature of human society. Many people are comfortable with free expression for views they agree with but would withhold this privilege from those they deem offensive. People justify such restrictions by various means: the appeal to moral certainty, political agendas, demand for change, opposing change, retaining power, resisting authority, or, more recently, not wanting to feel uncomfortable. Moral certainty about one’s views or a willingness to indulge one’s emotions makes it easy to assert that others are doing true damage or creating unacceptable offense simply by presenting a fundamentally different perspective.
The resulting challenges to free expression may come in the form of laws, threats, pressure (whether societal, group, or organizational), or self-censorship in the face of a prevailing consensus. Specific forms of challenge may be more or less pronounced as circumstances vary. But the widespread temptation to consider the silencing of “objectionable” viewpoints as acceptable implies that the challenge to free expression is always present.
The United States today is no exception. We benefit from the First Amendment, which asserts that the government shall make no law abridging the freedom of speech. However, fostering a society supporting free expression involves matters far beyond the law. The ongoing and increasing demonization of one group by another creates a political and social environment conducive to suppressing speech. Even violent acts opposing speech can become acceptable or encouraged. Such behavior is evident at both political rallies and university events. Our greatest current threat to free expression is the emergence of a national culture that accepts the legitimacy of suppression of speech deemed objectionable by a segment of the population.
University and college campuses present a particularly vivid instance of this cultural shift. There have been many well-publicized episodes of speakers being disinvited or prevented from speaking because of their views. However, the problem is much deeper, as there is significant self-censorship on many campuses. Both faculty and students sometimes find themselves silenced by social and institutional pressures to conform to “acceptable” views. Ironically, the very mission of universities and colleges to provide a powerful and deeply enriching education for their students demands that they embrace and protect free expression and open discourse. Failing to do so significantly diminishes the quality of the education they provide.
My own institution, the University of Chicago, through the words and actions of its faculty and leaders since its founding, has asserted the importance of free expression and its essential role in embracing intellectual challenge. We continue to do so today as articulated by the Chicago Principles, which strongly affirm that “the University’s fundamental commitment is to the principle that debate or deliberation may not be suppressed because the ideas put forth are thought by some or even by most members of the University community to be offensive, unwise, immoral, or wrong-headed.” It is only in such an environment that universities can fulfill their own highest aspirations and provide leadership by demonstrating the value of free speech within society more broadly. A number of universities have joined us in reinforcing these values. But it remains to be seen whether the faculty and leaders of many institutions will truly stand up for these values, and in doing so provide a model for society as a whole.
Robert J. Zimmer is the president of the University of Chicago.