Recently, the editors of COMMENTARY addressed the following questions to a group of American intellectuals of varying political views:
- What role, if any, should a concern for human rights play in American foreign policy? Is there a conflict between this concern and the American national interest?
- Does the distinction between authoritarianism and totalitarianism seem important to you? If so, what follows from it in practice? If not, what distinctions would you make in judging and dealing with non-democratic regimes?
- Does the approach of the Reagan administration, to the extent that it can be inferred from statements of the President and other high officials, compare favorably or unfavorably with the Carter administration’s human-rights policy?
The responses—eighteen in all—are printed below in alphabetical order.
Human rights would certainly seem to be an important part of foreign policy, since the present struggle for the world is about liberty, and indeed the survival of liberty for the conceivable future of our civilization. Yet a separate human-rights policy as such, the more we think about it, does not appear to be one from which we can expect to get much mileage.
In the first place, the violations of rights that will get into the news and attract attention are likely to be those by right-wing governments. They are cruder and more impulsive in visiting punishment and oppression upon their victims—and they make the news. The totalitarians of the Left are more systematic, deliberate, rational, “scientific”—and they escape notice. The Soviet secret police must have sneered contemptuously at the amateurishness with which Jacobo Timerman was manhandled. They know other and quieter ways of breaking a man down. After all, they have had more practice at it. The Soviet Union has been perfecting its techniques of oppression against its own people for the last sixty years. Go back to 1941 and Arthur Koestler’s Darkness at Noon to see how the inquisitor wears away and destroys his victim without laying a violent hand upon him. Indeed, the perfection of the totalitarian state will have arrived when its oppressiveness is so effectively managed that it becomes the universal and accepted tenor of life for its citizens, and there are no wayward incidents to be reported to any world body. In part, this has already come about for large portions of the Soviet population.
But the second and more formidable reason why we cannot expect much mileage from a separate human-rights program lies in the ideology of the Communist countries themselves and the fact that this ideology places the matter of human rights in a perspective altogether different from our own. The Communists are the practiced users of a double-speak that can take what we deem to be violations of personal liberty as steps toward a more ideal social system. Their violations of human rights are always redeemed in the ultimate vistas of history, the secret truth of which they are in sole possession. No case of ordinary human rights could be presented strongly enough to dent the self-righteous armor of their apocalyptic vision of the socialist future.
Obviously, in the above I accept the distinction between authoritarian and totalitarian regimes, as I think we should. But I am uncomfortable with the word “authoritarian” in this connection. “Authority” is a word we might hope would come back into less unfavorable use in our political vocabulary. It connotes, or should connote, legality, legitimacy, and stability—qualities of a society that are at the farthest remove from the capricious will of a dictator. Eighteenth-century England, for example, did not know certain democratic liberties that we do, and thus might be called authoritarian. Yet within the framework of its authority, free political institutions that developed and fostered the cultivation of individual freedom were possible. Why not, then, speak merely of right-wing dictatorships or simply non-democratic versus totalitarian societies?
The key point in the distinction is the dominance of ideology in the totalitarian regime. This ideology entails the complete and systematic management of human life by the state. Eventually this regimentation can leave no loopholes, and any serious crisis of power must lead the ruling bureaucracy to plug what loopholes of freedom remain. In practice, of course, some totalitarian countries are more thoroughgoing than others—which means, in effect, that the latter are in fact less far along the way of bureaucratic organization. Is Yugoslavia totalitarian or authoritarian? Because of the nature of its terrain and the heterogeneity of its peoples, the regimentation of life in that society cannot go as far as the totalitarian ideal would prescribe. Yet in any crunch, liberties are lopped off one by one. Yugoslav intellectuals extracted much réclame from their journal of liberal Marxist thought, but when the réclame became too noticeable, the magazine was shut down and further publication forbidden. The raison d’être, the animating drive, of the totalitarian country is toward systematic and complete regimentation; and if it forsakes this drive, it begins to come apart. Gaps may be permitted for a time in the level of organization achieved; but where the gaps threaten to become an issue of principle on the side of liberty, the gaps must be closed. (Hence the intolerable situation of Poland now for the Communist bureaucracy. A free labor movement, even though socialist in its leanings, is simply intolerable.)
It is this ideological aspect that Americans fail to grasp. Our intellectual habits and traditions do not incline us to see history as a struggle over ideas, which it really is, and never more so than at the present time. The traveling American Congressman or businessman meeting and drinking sociably with a Soviet commissar would like to believe the latter is a good fellow like himself, animated by the same practical drives of good sense and self-interest. Even so astute a student of international affairs as Henry Kissinger is perpetually beguiled into trying to fit the Soviets into the patterns of normal diplomacy until, after all the diplomatic jockeying, the bedrock primacy of the ideological always asserts itself. That difference is an impassable barrier to any meaningful program of human rights. A plea for individual rights and liberties, as we know them, simply falls on deaf ears with the totalitarians.
We have entered a new period in which this ideological factor is being pushed into new prominence after the years of détente. The Third World, with the possibilities of global instability it provides, is too rich a field for revolutionary exploitation to be ignored by the Soviets. That world is riddled by poverty, and by colonial resentments, and here the old Marxist slogans—they are champions of the poor and oppressed, of the forces of “progress” and of wars of “liberation,” etc.—go to work. In view of the Communist record, economic as well as political, these slogans have long been meaningless; but it is part of the tragedy of our time that, meaningless as they have become, they are nevertheless believed, and not only in the Third World. This revolutionary push now begins to converge on Latin America. The aim is to isolate the United States step by step. If we have not altogether forgotten the tradition of the Monroe Doctrine, then we should view with very deep apprehension the establishing of successive outposts of a foreign power, in this case the Soviet Union, in our hemisphere.
It becomes necessary for us then to maintain relations with any countries that are not definitely aligned against us. These countries may not always be to our liking; the Third World is not a fertile ground for breeding liberal democracy, as we know it—that requires time, stability, social habit. When the Timerman case exploded, some of the purists among us called for our breaking relations with Argentina. That is a curious demand since we have maintained relations with the Soviet Union since 1933—throughout the long period of Stalinist purges and terror. We weaken ourselves to the degree that we isolate ourselves within the community of nations. The future of human rights depends on the continued existence and power of the United States in the world today. It would be a supreme irony if the partisans of human rights, out of motives of purity, insisted on a policy that would only weaken us among the family of nations.
I think this last brings us now to a summary statement of what an effective human-rights policy must be. It cannot be a separate campaign or crusade as such; it has simply to be a part or a concomitant of a larger national policy. And the outlines of that larger policy are tolerably clear: we have to keep the United States a free country and also maintain an adequate defense posture so that our presence as a world power is felt among the other nations. It is this continuing and effective presence, after all, that will do more in the long run for the cause of human rights than any oratorical crusade.
I don’t know what the human-rights policy of the Reagan administration is, or even if it has one, but it would do well to avoid following Carter here. His policy was crusading and evangelical, as perhaps befitted a born-again President. But Carter was also a President who had strange illusions about the Soviet Union, and when these collapsed for him after the invasion of Afghanistan, his policy was shown to have been meaningless. It may gratify our sense of self-righteousness to preach, but the cause of human rights is hardly advanced if it is accompanied by a decline in American power.
Peter L. Berger:
Human rights will have to play a role in American foreign policy as long as the United States remains what it has been from its beginning—not a nation like any other, but one inextricably linked to a particular political creed. And as long as the United States remains a democracy, this linkage will always reassert itself, for the simple reason that the American people will insist on it even if an American administration should be tempted to set it aside. This political creed, which was the original raison d’être and continues to be the principle of legitimacy of the American nation-state, has as its very core a number of propositions about human rights. It follows that the idea that American foreign policy could be conducted in a Machiavellian spirit detached from any moral considerations is not only repugnant to American traditions but eminently impracticable. Put differently, in the case of the United States there is a necessary connection between national interest and national values, and even a Realpolitik worthy of that name will have to take this aspect of social reality into account.
To say this, however, is not to say that American foreign policy is or ever could be nothing but an exercise in morality. While the United States has specific characteristics that distinguish it from other nation-states, it is also a nation-state and as such shares the general conditions under which this socio-political species must exist in the real world. The real world, today as always, is a harsh place which only rarely allows the statesman the luxury of making morally pure decisions. The foreign policy of any state, and especially of a great power, will necessarily include actions that are morally ambiguous and sometimes even morally repugnant. Whoever does not recognize this engages in abstract morality without relevance to the real world (or worse, with a relevance that runs counter to his own best intentions). Compromise, the pursuit of the lesser evil, and even actions that (in Machiavelli’s words) endanger the eternal salvation of the statesman’s soul are part and parcel of the vocation of politics. The mainstream of the Jewish and Christian moral tradition has always known this, which is why utopianism has always been in tension with that mainstream. This is as true of the utopian manifestations of the American political creed (as, for instance, in Wilsonianism) as of the more repugnant utopias that plague the world today. This means that, inevitably, American foreign policy will have to deal with regimes and to accept situations where human rights are being violated. If nothing else, the limitations of American power dictate this eventuality: exactly to the degree that the United States cannot be the world’s policeman, it cannot be the world’s Sunday-school teacher.
As Friedrich Meinecke has argued in his classic history of the idea of raison d’état, the political life of every state is marked by an uneasy tension between kratos and ethos, power and values. This tension makes for the tragic pathos of all politics. In the American case the tension takes on a peculiar force, because of the aforementioned linkage of the American polity to a set of clearly defined, putatively binding values. There is no neat way of resolving the tension. The Machiavellian resolution would destroy the very legitimacy of the American state; the resolution of abstract idealism would likely destroy its empirical survival. It follows from this that no elegant formula can ever be devised by which American foreign policy could be conducted with regard to human rights. Inevitably, rather, the relation between American foreign policy and the concern for human rights will be messy, less than morally satisfying, sometimes deeply troubling. In religious terms, there is no magic by which American statesmen and the American people who elect them can be free of sin in the conduct of foreign policy.
The morally crucial question then becomes, quite simply, which sin American foreign policy will burden us with. But this question has an empirical component. It is linked to how we assess the empirical realities of the contemporary world. Again, one can make limited use of Machiavelli here, specifically of his concept of necessity (necessità) governing the actions of a statesman: for the statesman can decide what is and what is not “necessary” (including the necessary moral compromises he must tolerate) only if he has a clear picture of the realities of the world in which he must act. It is here that the distinction between totalitarianism and authoritarianism is morally relevant—not as an abstract exercise in political theorizing, but as an empirical measuring device for the assessment of contemporary political realities. A central reality of our time is the appearance of the totalitarian state, a novum in human history, while authoritarian regimes have been an issue at least as far back as Aristotle, the Hebrew prophets, and Confucius. (We cannot enter here into the absurd notion about this distinction which prevails in the liberal press today, to the effect that the distinction has just now been invented in order to legitimate certain policies of the Reagan administration. To the best of my knowledge, Carl Friedrich and Hannah Arendt, who developed the distinction a generation ago, did not have the Reagan administration in mind.)
Another reality of our time is that the foremost propagator of totalitarianism since World War II is the expansive Soviet superpower. Now, while the United States cannot be the world’s policeman, it happens to be the world’s only effective (or still effective) countervailing power to Soviet expansionism. Those of us, then, who understand the world in the light of these realities will conclude that the maintenance of this countervailing power must, of necessity (with the full weight of necessità), be a central principle of American foreign policy.
This assessment has important moral implications. First, insofar as totalitarian regimes in general are concerned, they must be seen as an assault on human rights in their very structure, over and beyond the particular outrages they habitually engage in. This has been Solzhenitsyn’s most important moral message, one he has tirelessly delivered (though many Western intellectuals, it seems, still have not heard it). Thus resistance to totalitarianism, between nations and within nations, is not a morally neutral policy, but one that touches on the very survival of human rights in the world. And second, the same moral quality pertains to resistance to the expansion of Soviet power, a resistance which, by the vicissitudes of history, has fallen mainly on the United States. Consequently, it is not some perverse political theory but moral considerations of great urgency which lead to the proposition that the assault on human rights by totalitarian regimes is more serious than violations by authoritarian regimes. The same moral considerations force a differentiation between regimes that foster the purposes of Soviet imperialism and those that, however odious they may be in their domestic policies, do not serve Soviet ambitions and perhaps even oppose them.
These considerations draw the broad outline of a policy; they cannot fill in the details. Essentially, this policy is that the foremost political as well as moral task of the United States in the contemporary world must be resistance to the expansion of the Soviet system. This in itself constitutes a crucially important service to the cause of human rights. The pursuit of this policy, in the real world, will inevitably bring American interests in line with the interests of governments whose record on human rights leaves much to be desired. Liberals, for convoluted ideological reasons, will usually place right-wing authoritarian regimes in this category. Even if one does not share their ideological proclivities, one must concede that there is indeed a moral problem when the United States allies itself with police states whose best or sometimes only excuse is their anti-Soviet stance. But the moral problem is just as grave when American interests happen to coincide with those of left-wing police states, some of which fall clearly on the totalitarian side of the conceptual divide (in which case the moral problem is even graver). The most important example of this, of course, is the growing American relationship with the People’s Republic of China (not to mention the bizarre role played by the United States in giving diplomatic support, however gingerly, to the Khmer Rouge representatives in the United Nations). Morally responsible individuals will differ in their view of the “necessity” of these relations in each case. It seems clear, however, that no one, right across the American political spectrum, is in a position to be both politically relevant and morally pure in these matters.
Recent public discussion of all this has been of such singular crudity that it is necessary to spell out some points which should be clear to anyone with a capacity for intellectual discrimination. The position taken here does not imply that torture by a government friendly to the United States is somehow more acceptable than the same practice undertaken by an anti-American government. Nor does it imply that all anti-Communists are morally superior to all pro-Communists. Nor is it implied that resistance to Soviet power is or should be the only purpose of American foreign policy. There may be some few individuals with such ideas, although they have scarcely been heard from in the current debate. In any event, such ideas deserve our contempt. The difficult questions all relate to the power available to the United States when it tries to act on behalf of human rights anywhere outside its borders and to the ironies of history by which, often, good intentions lead to nefarious results. The Iranian debacle should have been instructive to all Americans, be they liberals or not, as to the difficulty of a human-rights policy under conditions vastly different from those prevailing domestically.
There is a basic premise of the American political creed that, perhaps unexpectedly, may be helpful in this area—to wit, the premise that the government of the United States is a creation of the American people, representing the latter but in no way identical with it. Officials of the American government, especially if they are diplomats, cannot speak as ordinary citizens do. It is difficult, to say the least, for one arm of the State Department to work toward amity with a foreign government which is being morally denounced as a violator of human rights by another arm of the State Department. As for diplomats, it is an essential part of their vocation to have seemingly amicable relations with gangsters and murderers; they could not do their job otherwise. But the rest of us are under no such constraints. American democracy does not speak only through its public officials. While the latter may have to observe the niceties of intergovernmental etiquette, limiting their interventions in foreign police states to “quiet diplomacy” (or, for that matter, quiet subversion—at least in those parts of the world where the CIA has not yet been liquidated), other Americans, individually and through private institutions, can speak with a very different voice. This is both morally and politically important. There is also an argument to be made (I made it back in 1977) for taking even the governmental concern for human rights out of the State Department and putting it into an independent, though publicly funded, agency with no responsibility for the execution of foreign policy. In any case, we underestimate foreigners if we assume that they do not understand the constraints under which American public officials operate in their countries, or even if we assume that they are ignorant about the relation between government and private citizens in the United States. 
For this reason, the human-rights policies of American religious institutions, labor organizations, business enterprises, and academic bodies are as important as the policies of the American government in this area.
The Reagan administration, despite the intense agitation about it in the liberal press, has not yet run up enough of a record on human-rights issues to allow a considered judgment. It compares favorably with the Carter administration in that it apparently understands the realities of totalitarianism and of Soviet power—something that could not be said of its predecessor, except sporadically. The Reagan administration is unlikely to confuse the moral issues raised by a Central American dictatorship with those raised by the Gulag Archipelago, or to destroy a strategically important American ally as a by-product of its human-rights policy. That constitutes considerable progress. It remains to be seen to what degree the present administration will develop policies toward friendly nations, especially in the Third World, that achieve a sensible balance between hard-nosed national interests and moral imperatives. So far, there are good grounds for optimism.
1. Obviously, given the nature of the American tradition and particularly given the American commitment to the concept of freedom, the United States cannot be indifferent to the state of human rights abroad. After all, we live in an age of massive political awakening around the world. The idea of human rights, once so narrowly confined to a small portion of mankind, is becoming today the central and most compelling force for political change throughout the world. It is a fact that the awakening involves an effort to widen the scope of individual freedom. To the extent that such an effort is identified with America, it is in America’s interest to further such processes. This enhances the symbolic appeal of America, while gradually transforming the world into a genuinely more decent place for people to live. That is surely in our national interest.
Therefore, in some fashion and to some degree the issue of human rights has to affect relations between the United States and pertinent countries. The difficult question is to what degree and in what fashion. The answer to that question is certainly not to be found in some turgid academic distinctions between authoritarian and totalitarian governments.
2. What is the distinction between different types of dictatorial regimes meant to convey? Suppose an authoritarian regime (e.g., Amin’s Uganda) blatantly violates human rights while a totalitarian regime (e.g., Ceaușescu’s Romania) does so more selectively. Should we be less critical of Amin? Alternatively, suppose a totalitarian regime is relatively friendly to the United States while some authoritarian regime is intensely hostile. Should we ignore that consideration? Finally, suppose that one or the other of these types of regimes happens either to threaten us more directly or to have a variety of other important relationships with us. What, in brief, are we supposed to deduce from that rigid and abstract distinction?
A more meaningful way to approach the problem is to ask whether a given government, which grossly violates human rights (be it authoritarian or totalitarian), is also involved in relationships that are of importance to us, and whether such relationships would be adversely affected—in a manner damaging to us—if we pressed the human-rights issue beyond some reasonable point. For example, should we abandon the increasingly important strategic relationship with China, something which no doubt would please the Soviet Union, because of Chinese disregard of human rights? Should we refuse to help them become stronger in resisting Soviet threats because they do not meet our standards? Or, in the case of the Soviet Union, should we refuse to negotiate arms-control agreements with the Soviets because they violate the human rights of dissidents or Jewish would-be émigrés?
In brief, it is necessary to make relative judgments, but these judgments have nothing to do with governmental typologies. Rather, they involve the difficult question of balancing whatever good we actually can do on behalf of human rights with the negative costs of such efforts for other important relationships that we may have with different governments.
3. I believe the Reagan administration’s human-rights policy compares unfavorably with that of the Carter administration. When the Carter administration came to office, in the wake of Vietnam and Watergate, the American people wanted an approach to foreign policy that reaffirmed basic political values, values that were somewhat in eclipse during the preceding years. Jimmy Carter offered to restore the American people’s sense of moral worth and convey to the world a more authentic picture of the United States than one of a country following various policies of expedience. The Carter approach sought to reduce the identification of the United States with repugnant regimes as a means of reaffirming America’s leadership of the world and advancing our goals of opposing Soviet influence and power. The Carter policy was far from perfect. We did not succeed in overcoming the inherent contradictions that such a policy has to confront, and no doubt we made some mistakes. Nonetheless, we did make significant progress in enhancing the scope of individual freedom in a variety of places around the world. Thanks to our efforts, thousands of political prisoners were released in several countries in Latin America and in Asia. We enhanced the scope of personal freedom in South Korea. We obtained the release of five key dissidents from the Soviet Union (but, alas, the Soviets did crack down on the exit of Jews and on human-rights dissidents within the Soviet Union itself). We quietly obtained the release of some Jewish women from Syria, and so on, and so on. I know of no comparable actions by the Reagan administration, while some of its top officials have made awkward and embarrassing statements regarding the alleged excellent state of human rights in certain (dare I say it?) authoritarian regimes. In brief, under Carter other governments knew that the state of human rights was of genuine concern to the U.S. Do they now?
1. We take the trouble to consider the question because we believe that we can influence American foreign policy. The question then translates into a more realistic one: should we attempt to influence this policy in the direction of concern for human rights, or should we renounce any such concern, thus abandoning foreign policy to other concerns such as the climate for business operations and control over resources? Properly put, the question reduces to this: do we care about the human consequences of our actions (or inaction)?
Acting as individuals, most people are not gangsters. Matters are often different when they subordinate themselves to institutional structures of various sorts, such as corporations or the nation-state. We regard it as wrong, indeed pathological, to steal food from a starving child. But we engage in such behavior on a massive scale without second thought, when the act is disguised in terms of high policy: for example, when U.S. power is employed to overthrow a moderate regime in Guatemala that is attempting to improve the lot of miserable peasants, replacing it by a successor devoted to export-oriented agriculture while tens of thousands starve and most of the work force labors under conditions of semi-slavery (that is, those who survive the death squads run by the regimes placed and maintained in power by the United States). As individuals, we abhor terrorism and torture, but we participate in them—willingly or blindly, depending on our degree of sophistication—insofar as we tolerate the contribution of the United States to the plague of terror and torture that has spread over Latin America (and not there alone) in the past several decades. A substantial part of political discourse is devoted to obscuring such simple facts as these. If we care about the human consequences of what we do or fail to do, if we accept the most elementary moral principles, then we will attempt to influence state policy toward a concern for human rights.
One of the devices used to obscure plain facts is the concept of “national interest,” a mystification that serves to conceal the ways in which state policy is formed and executed. Within the nation, there are individuals and groups who have interests, often conflicting ones; furthermore, such groups do not observe national boundaries. Within a particular nation-state, some groups are sufficiently powerful to exert a major, perhaps dominant, influence over state policy and the ideological system. Their special interests then become, in effect, “the national interest.” To take again the case of Guatemala: in 1954 the United Fruit Company had an interest in blocking land reform; I did not. In subsequent years, other corporations and national-security managers had an interest in enhancing state terrorism: for example, by contributing to the violent repression of 1966-68; I did not. What was “the national interest”? In practice, it was the special interest of those with the power to influence and execute state policy and to shape the basic structure of the ideological system, including the flow of information.
Occasionally, one can discern something that might be called a “national interest”: for example, the shared interest in avoiding a nuclear holocaust. In general, however, the concept is invoked as one part of a system of disguising reality, an important task since, again, most people are not gangsters and will act to modify abhorrent policies if they come to understand the facts and their real significance.
Discussion at this level of abstraction provides only the most general guidelines for policy. In practice, there is generally a network of conflicting interests, values, and interpretations. Suppose that a concern for human rights does come into conflict with some shared interest of most Americans. The questions that then arise are in principle no different from those we face as individuals: how do we act, for example, when the interests of a starving child conflict with our desire for a scrap of food he possesses? Sometimes the questions are not so simple, and a concern for the human consequences of our actions does not dictate a straightforward conclusion. But it is impossible to deal with these real and substantive questions at the level of abstraction at which the question is posed.
2. The standard terminology of political discourse is nebulous and pliable, but not useless. This is true of the concepts “authoritarian” and “totalitarian.” The authoritarian regime has been defined as one whose “leaders do not rule by terror, but by that mixture of severity and benevolence that characterize the patriarchal family” (Ralf Dahrendorf, whose example is pre-World War I Imperial Germany); while the “decisive characteristic” of the totalitarian regime is “that a police power . . . enjoying unlimited discretionary powers could take ‘preventive’ action also against persons who at most were suspected of possible opposition or infractions” not only “with the external coercive measures of a dictatorship, but with the creation of ideological and racial policing powers that encroach on every aspect of human life” (Karl Dietrich Bracher, referring to Nazi Germany). In its extreme forms, as the Nazi ideologue Robert Ley stated, “There are no more private citizens. The time when anybody could do or not do what he pleased is past.”
But the real world fits poorly into such categories. The terror-and-torture states of Latin America, which are merely authoritarian according to contemporary theology, satisfy most of the standard criteria of totalitarianism—for example, those formulated by Carl Friedrich—though not all: they do not, for example, attempt to develop mass support but rather rely on violence to subjugate the mass of the population. Brazil is authoritarian, but labor is regulated by decrees that copy Mussolini’s fascist legislation, and those who sell their blood for export or who advertise their kidneys and eyes for sale in a desperate effort to survive, like the peasants and slum dwellers of authoritarian societies quite generally, are hardly more able to “do what they please” than citizens of totalitarian states. One can only regard with amazement Jeane Kirkpatrick’s solemn assurance that “because the miseries of traditional life are familiar, they are bearable to ordinary people. . . .” Similarly, South Africa is officially registered as authoritarian, but for the large majority of its population—for example, the millions now being forcibly removed to “native homelands” where they can starve without interference by the state—it is among the most vicious of existing regimes. Furthermore, the claim that totalitarian societies never develop democratizing tendencies (Kirkpatrick) is plainly false, as the history of the Soviet empire clearly shows. To take the most recent case, the Economist notes that in Poland “a totalitarian state has been turned into one that is now more pluralistic than most countries in the world.” In contrast, Lech Walesa would be lucky to survive for a day in Guatemala or Argentina; and, we may add, if he were to be treated in the manner of his counterpart Luis Ignacio da Silva (“Lula”) in currently more liberal Brazil, there would be an international outcry, notably lacking in the case of da Silva. 
Democratizing tendencies in the Soviet empire have regularly been aborted by external force. If that proves the Kirkpatrick thesis, then one can similarly “prove” that autocratic regimes of Central America are incapable of evolving democratic tendencies, as shown by the regular U.S. intervention to suppress them.
While the concepts “authoritarian” and “totalitarian” may have some analytic value in their earlier usage, the current revival is merely an attempt to provide a fig leaf for the traditional policy of supporting regimes that offer their human and material resources for foreign exploitation and plunder, as distinct from those that are less “free” in this crucial respect. This distinction is a real and significant one, but it plainly will not do to state it openly and honestly. Hence the need for some pretentious verbiage to obscure the facts.
Whatever validity the analytic categories may have, the policy implications are slight. Furthermore, there is no reason to limit ourselves to non-democratic regimes in considering policy implications. In the case of any regime, we should inquire into the consequences of our actions with regard to it. Israel is a democracy, but that fact does not remove the need to determine what the effect of the unprecedented U.S. military and economic support will be for those suffering mounting repression under the military occupation, or for the victims of Israeli attacks in Lebanon in the past years. Suppose that Argentina carries its mimicry of the Nazis to the point of sending Jews to death camps. It will still be authoritarian, but will it merit our support on these grounds? If support for totalitarian Hungary or Vietnam would reinforce democratic tendencies or reduce human misery (for which we bear overwhelming responsibility, in the latter case), then it is justified. The real policy questions escape the categories authoritarian and totalitarian, and are not restricted to non-democratic regimes.
3. One should be cautious in speaking of “the Carter administration’s human-rights policy.” As the record amply demonstrates, the Carter administration lent its full support to aggression and massacre (e.g., in Timor), and to brutal tyrants like the Shah and Somoza even well after the natural allies of the United States (in the latter case, the Nicaraguan business community) turned against them. There are numerous other examples. The motivation for the human-rights rhetoric was transparent: it was necessary to reconstruct a battered ideology, the traditional appeal to “American benevolence” and “Wilsonian ideals of self-determination” having lost its power to delude, apart from well-behaved segments of the intelligentsia. Nevertheless, this rhetoric had its value. It helped create a climate in which people concerned for human rights, including Latin American priests, members of Congress, and others, could act, sometimes effectively, in support of significant human values. It is therefore natural that the Reagan administration should have been greeted with enthusiasm by Latin American dictators, South African racists, and the like. They recognized that the limited barriers to state terrorism posed by U.S. policy in the early post-Vietnam period would be lifted as Reagan and his cohorts undertook their programs of alms for the wealthy, development of what is in effect the state sector of the economy under the euphemism of “defense,” international confrontation, and, in general, the pursuit of the “national interest” as defined by those with power and privilege. Space does not permit elaboration of the likely consequences, but perhaps they are obvious enough.
The problem with our human-rights policy—as with so many policies over the past several years—begins in language. Who in the Western world, or in the Eastern or Third worlds for that matter, is not prepared to profess his allegiance to the idea of “rights”? Call them human rights and the public piety grows deafening. Apart from the warm feeling of righteousness that such professions give to many of the people who make them, all this “right”-ful posture-taking is considerably aided, if not indeed made possible, by the fact that we scarcely know what we, or others, mean by the term.
I submit that in the United States of America, where people, in contradistinction to people in a large number of other places, make some effort to believe what they say, the word “rights” has induced a kind of moral idiocy. Let me give a by no means exceptional example. Recently I took part in a panel discussion on one of those topics having to do with The Way We Live Now. During this discussion, not surprisingly, the issue of rights very quickly came to the fore. One of the participants demanded her right as a woman to be intelligent; another demanded her right as a woman to be granted possession of her body; and another demanded his right as a man to be a loving father. Three specimens within about five minutes—and among people well regarded for their ability to articulate—of a truly significant form of spiritual and intellectual retardation: all attributable to the culture of “rights.”
But what does this have to do with American foreign policy? The answer is, everything. For if we assume the sincerity of the advocates of a U.S. human-rights policy—and we are probably safe in doing so in two cases out of three—we must also assume they are advocating that certain critical decisions be made by the leading power of the Western democracies based on something they have neither defined, described, nor above all delimited.
For some people, our human-rights policy is seen as the agency for bringing pressure to bear on the Soviet Union to release dissidents from prisons and psychiatric wards and to let those who wish to do so leave the country. For others, far more numerous these days, it is seen as the agency for securing United States support for leftist insurgencies against traditionalist or rightist governments. For still others, it is seen mainly as the agency for establishing a certain political agenda here at home, a political agenda that extends all the way from the improvement of prison facilities to the teaching of native Indian culture to the further institutionalization of preferential treatment for women.
In other words, in a climate in which everything, from the deepest personal yearning to the boldest bit of public presumption, has become a “right,” we can no longer without immense care and tedious argument discuss either morality or politics. We can no longer establish even so obvious a distinction as that between unfairness and genocide, let alone that between police brutality and government terror. Is it any wonder that men of supposed good will might claim some confusion about the difference, say, between Chile and Cuba? They have already been verbally beaten into passing over the difference between Harlem and Auschwitz.
COMMENTARY’s question about the role of concern for human rights in American foreign policy is asked, of course, in an entirely different context. The issue here is really whether the U.S. is to behave as great powers have always customarily behaved and pursue the most straightforward and “hard-headed” definition of its national interest or whether its conduct in foreign affairs should not be fueled by some higher ideological purpose. This issue has indeed been debated among us, to no clear resolution, since the earliest days of the so-called cold war: were we, in our massive efforts to contain Soviet power behind the Oder-Neisse line, merely attempting to establish a new world balance of power or were we attempting to stabilize and if possible expand that area of the globe’s surface where men might enjoy some genuine measure of liberty?
Though the debate itself continues to rage, it seems to me that the count on it has long since been in. The Soviet Union is not merely a competing and expanding great power, any more than Nazi Germany was, and therefore its successes and failures—and our successes and failures in responding to them—cannot be reckoned by any traditional calculus of statecraft. The idea that the U.S. must maturely recognize, and act to stabilize and deal with, an established new reality of power among nations brought us détente. And détente in turn brought us the greatest threat, not only to the survival of the value of liberty in the world but to our most narrowly defined national interest, that we have ever faced. Those of us who declared, in what was often deemed simple-minded or woolly-headed fashion, that the mission of the United States was to combat the spread of Communism wherever feasible have turned out to be far more realistic and hard-headed than our ostensibly more worldly-wise brothers.
The mission is, to be sure, an agonizingly difficult one. For the struggle between the sustaining of liberty and an expanding Communist totalitarianism is asymmetrical. Those who set out to do the former must build; those who oversee the latter must only interdict. From this point of view, the advocates of a human-rights policy have muddied beyond endurance our thinking about our first, prior and critical, task.
Take the case of Iran. Shall we not in another year or two or three be forced to recognize just who has truly benefited from the chaos created by the interdiction of American power there? Will we really still be able to tell ourselves that the overthrow of the Shah has served no more than to energize Muslim fundamentalism? I doubt it more with every passing day and each assassination and counter-assassination. And the Iranians . . . will they be looking back on the human-rights violations of Savak as a golden age compared with their present—as the South Vietnamese, including former members of the Vietcong, now have reason to look back on the widely advertised horrors of the Thieu regime?
The struggle between liberty and totalitarianism is indeed an asymmetrical one and always has been; and so far as I can see, our human-rights policy has only trivially added to the sum of justice in the world while at the same time adding incalculably to the already burdensome handicaps of the liberty party. What will a weakening of the U.S. position, so definitively important in the maintenance of liberty on earth, in the long run benefit anybody— except, of course, those who have for more than sixty years been spitting on the very notion that a citizen has any right at all against the state, spitting, in fact, on the very notion of a citizen?
Which brings me back to the original problem: what do we mean by human rights? If we mean the possibility for ordinary people to think and lead their daily lives, within reasonable bounds, as they choose, without being kicked around too much either by their neighbors or their rulers—in itself a dream still beyond the reach of billions of people—then insuring the strength and success of the United States of America is a human-rights policy. But if we mean all those other things that have attached themselves to the awesome combination of two awesome abstractions, “human” and “rights”—that we all be equally rich, happy, admired, respected, healthy, unobstructed in our desires, and unlimited in our actions—or even if we mean only some of them, we are in the soup. For there is no way that we can be guaranteed such things in advance by any social or political system either yet devised or devisable.
What we do know we can hope for from a system of political institutions in the real world is a certain, maybe even a very high, degree of political liberty. We in the United States know this because we already enjoy it. Others might one day come to know it if we keep the faith (and our powder dry). Engaged in a struggle to survive and prevail as the United States now is, what genuinely contributes to the struggle is for that very reason a genuine contribution to the survival—and it is to be hoped some day the prevalence—of those basic rights from which other bounties flow. To hamstring ourselves in doing what is necessary for that survival, albeit in the name of high principle, is only to bring closer the day when there will be no rights of any kind for anybody.
In this context, the argument about whether we can distinguish between authoritarianism and totalitarianism is little more than a moral smoke screen. Making such distinctions is not only necessary to the moral enterprise, it is the moral enterprise itself. Anyone who is not a moral cretin can and does make these distinctions whether he admits it or not. There is a distinction to be made between authoritarianism and totalitarianism; there is a distinction to be made between this authoritarian government and that; there is a distinction to be made between this democratic or republican government and that. There is even a distinction to be made between one kind of official brutality and another, and what is more, we, all of us, make it all the time.
The people who argue that there is no distinction between authoritarianism and totalitarianism, however, do not argue in good faith. Some of them have decided to deplore equally everything less than perfect and let others dirty themselves with reality. These are not interesting, not morally, not politically, and they do not count: they are the careless, indulged children of a free society, dependent on remaining a luxury the rest of us can afford. The others have simply made a political choice. They mean nothing more by their argument than that the friends of the United States are their enemies. For such as these the spectacle of people risking the high seas in leaky boats, or hanging from trees in the yard of a foreign embassy where it is rumored they will find safe passage out of their homeland, has no effect on their use of the term human rights. They have made the distinction between Communist totalitarianism and its alternatives, all right. It’s just that they have come down on the other side.
As for the Reagan administration, it has spoken but has yet to be tried in the crucible of fortitude. So are we all. The test is not in the knowing—for, as I said, everybody knows—but in the willing.
The defense of human rights is nothing less than the defense of minimal decency in the relations between state and citizenry. Unless one is totally indifferent to the fate of human beings beyond state boundaries, it is impossible not to feel some measure of identification with foreign victims of governmental abuse. In actuality, for reasons of race, religion, professional association, personal friendship, and media attention, many of us find ourselves often more attuned to foreign victims of human-rights abuses than to fellow citizens caught in virtually “invisible” pockets of oppression here at home.
Furthermore, internationalizing efforts to protect individuals and groups from the cruel excesses of state power seems like a necessary further stage in moral evolution. Indeed, human survival may turn out to depend on our individual and collective capacity over the next several decades to develop a sufficient sense of species identity to sustain the emergence of new types of national-security systems: in essence, making security less reliant on weapons of mass destruction and military establishments. An emphasis on human rights seems like a relevant vehicle for strengthening this awareness of a potential political community embracing the entire planet.
And yet, whether governments of sovereign states, especially of superstates, are in a good position to promote human rights is far from self-evident. Given the way the world works, human-rights considerations are generally subordinated to wider geopolitical concerns, even by the most enlightened of governments. As a result, to the extent that human-rights issues are introduced into formal statecraft, governments mostly dwell on the abuses of their adversaries and overlook those of allies. This apparent double standard confirms accusations of hypocrisy and cynicism leveled against any government that pretends to shape its foreign policy by reference to moral considerations. The United States is under particular suspicion on these grounds because it has periodically claimed to be a distinctively moral actor on the global stage. In recent decades, because American power has grown so great, there is the further question as to whether a dominant world state, given its role and pattern of interests, can ever genuinely incorporate human-rights considerations into its foreign policy. Finally, there is a concern in the nuclear age about whether an emphasis on the human-rights violations of adversaries does not have the detrimental effect, however unintended, of war-mongering.
I think these concerns are well-grounded. The logic of the state system does not give much scope to genuinely moral considerations. Moralizing by strong states is often perceived as either mere posturing or as a pretext for intervention. Arguably, any serious advocacy of human rights is by its nature interventionary, thereby undercutting respect for the norms of nonintervention which may be the most basic among the principles of order in a society of sovereign states. Samuel P. Huntington directly challenges this normative support for noninterventionism in his recent COMMENTARY article [“Human Rights & American Power,” September] where he contends that the main historical extensions of American military power have been virtually synonymous with the realization of human rights.
Although I find Huntington’s claims dangerously self-serving and empirically unconvincing, nevertheless I acknowledge some role for human-rights considerations in United States foreign policy. Even powerful governments can occasionally serve the wider global interests in upholding human rights. One way to reconcile the promotion of human rights with the maintenance of international order is by reference to the severity of the violation. If the pattern of violation is severe (massive, persisting, offensive to minimal ideas of morality, and officially condoned by the target government), then the reality of global awareness creates a mandate for all actors and overrides the normal inhibitions on intervention. Even in such circumstances, no impartial procedure exists by which to pronounce the existence of an exceptional situation; subjective determinations are both unavoidable and hazardous.
Finding an adequate criterion for severity is essential if the promotion of human rights is to be put on a principled (non-opportunistic) basis. The most prominent recent proposal is to rely on the distinction between totalitarian and authoritarian regimes, regarding the violations of the former as alone sufficiently severe to justify a foreign-policy emphasis. Such a distinction seems attractive in several respects. It is reasonably clear in application and rests upon certain regime characteristics that appear detached from the vagaries of geopolitical alignment. Besides, totalitarian regimes give no internal space to their opponents, spare no one in the society, and seem virtually immune to internal reformist pressures. On the other hand, the distinction is not finally satisfactory. As used by political practitioners, especially on the Right, the distinction seems little more than a code devised to say that human-rights abuses by Communist adversaries are fit concerns for U.S. foreign policy whereas abuses by anti-Communist allies and friends are not. Endowing human rights with such a heavy ideological component seems unduly provocative in a world where the search for peace has to be given precedence over efforts, however laudable in intent, to coerce domestic reforms in adversary societies.
At the same time, a non-selective emphasis on human rights of the sort associated with the initial two years of the Carter Presidency seems ill-advised and unsustainable. In fact, it was quietly abandoned as impractical. As directed against allies and friends, it seemed simultaneously ineffective, interventionary, and self-defeating, while as against adversaries it complicated negotiations and relations without enhancing human rights. The Carter approach to human rights could not finally be reconciled with a world of states or with the practical pursuit of U.S. interests in the world.
Some criterion of severity seems necessary to ground a foreign-policy emphasis on human rights. I find the distinction between totalitarian and authoritarian regimes too ideological in character and not sufficiently correlated with regime type. This abstract concern has been reinforced by the self-righteously one-eyed human-rights stance of the Reagan administration to date.
I would favor for this reason a more substantive conception of severity as the basis for foreign-policy promotion of human-rights goals. In this regard, genocide, and possibly apartheid, are the only circumstances of human-rights abuse that seem to me severe enough to warrant waiving inhibitions on intervention in internal affairs. Such a narrow image of what is to be done does not of course condone other kinds of human-rights abuse, but only rejects their foreign-policy relevance. Such a disclaimer does not imply that the United States government should not take account of human-rights achievements and failures in shaping discretionary policies on aid and trade. It means only that the interventionary option should be strictly and consistently reserved for a very limited range of instances. Even here, considerations of prudence need to be taken into account. Even if the Soviet Union engages in genocidal policies with respect to its peoples, the United States should still obviously be circumspect in response, although uninhibited to the extent it can bring effective pressure to bear by prudent means. That is, it may be prudent to endorse armed intervention to unseat Idi Amin but imprudent in the extreme vis-à-vis Joseph Stalin. In both instances, principled intervention could be amply justified on human-rights grounds.
In addition to promoting actual change in foreign societies, human-rights issues provide a government with an opportunity to express its moral outlook. Symbolic positions on moral questions have become important in international affairs since World War II. They can be understood, possibly too optimistically, as part of the unconscious preparation of the climate for more globalist kinds of political behavior that will eventually be needed to overcome basic survival threats. In these regards, Reagan’s early “stands” on human rights seem disastrous, as they repudiate even the most modest efforts to manifest moral solidarity with the rest of the world on a minimal human-rights agenda (e.g., apartheid, Nestlé).
True, the Reagan administration’s approach to human rights is less hypocritical than that of its predecessors. It expresses what it genuinely believes. At the same time, it practically reduces human rights to partisan ideology and seems to call for a completely laissez-faire moral order in world affairs at a time when cultural, economic, and security trends are making life on the planet more integrated. Even if there is no current capacity or disposition to implement human rights, their symbolic endorsement represents some sort of promise to the future we can ill-afford to repudiate.
In conclusion, neither the Carter nor the Reagan approach to human rights seems to provide adequately for their role in U.S. foreign policy. Carter’s efforts were ill-considered, naive, erratic, and proved unsustainable, while the early indications are that the hard-headed Reagan approach unduly isolates the United States in global arenas and underestimates the symbolic importance of human-rights activism.
1. Of course a concern for human rights should play a role in American foreign policy. It always has. Long before the administration of Jimmy Carter, the United States acted to demonstrate its concern for human rights, and was strongly urged to do so by its citizens, most prominently perhaps American Jews. The origins of the American Jewish Committee are to be found in the effort to protect Russian Jews at the turn of the century, and it was clear from the beginning that this effort could not have been effective if it were limited to the actions of Jews, as Jews, alone, but would require action by the American government, which American Jews pushed for, through all the means normal in American politics.
The greatest violation of human rights in our times was the systematic murder of the Jews of Europe by Nazi Germany. American Jews may be faulted for not trying hard enough to influence their government to do what it could to limit the inhuman crimes of Germany, but they certainly tried.
But the practices of Jews in the past and today (and those of other ethnic groups, too) do not really answer some troubling questions. Perhaps Jews were wrong to press their government to act to protect the rights of Jews, regardless of how that might affect some grand diplomatic design of American foreign policy? Perhaps the American government was wrong to respond to these pressures, as it sometimes did? Perhaps the national interest should be strictly separated from the interests or emotions or passions of any specific part of the American population? Perhaps. But it is unrealistic to urge such a course, on people or on governments. In any case, we now have a national consensus, written into law, that the American government must be concerned with human rights in every country in the world, and must regulate its relations with each country in part on the basis of that country’s human-rights record.
This, it seems to me, goes somewhat too far: it suggests a kind of self-righteous arrogance to sit in judgment on every country in the world, as Congress now requires us to do. The human-rights records of some countries, those with which we have limited relations, which are not represented in this country by any substantial group of residents or citizens coming from that country, and which are of no great significance for our foreign policy, should not be of such concern to us that we must publicly rate—and berate—them.
But this is a problem for Congress to deal with; even in the absence of legislative requirements, human rights will, and should, affect American foreign policy. Indeed, the whole basis of our foreign policy and our system of alliances is to protect some basic human rights, and the countries that operate on the basis of them, from an enemy which denies those rights. The nature of the alliance we lead is that the core countries within it—the United Kingdom, France, Germany, Italy, Canada, and the smaller countries of Northern and Western Europe—are defined by their commitment to popular rule, and to government under law protecting individual rights.
Unfortunately there is no absolute correlation between our allies, the countries of strategic importance whom we want to prevent from falling under Russian Communist dominance, and the practice of human rights. That defines our problem.
2. I have made the distinction between authoritarianism and totalitarianism in the past (see my article, “American Values and American Foreign Policy,” COMMENTARY, July 1976), and I thought then, and think now, that it is important. Classically, the difference is that in a totalitarian regime there are no separate sources of power—neither church, nor independent ethnic community, nor independent business enterprise, nor a free labor movement, nor independent voluntary organizations and institutions. Everything is organized under the government, and no independent source of authority, knowledge, or power remains. In an authoritarian regime, all or some of these may, and do, persist.
A second difference follows: because of these independent sources of power, an authoritarian regime could evolve or be transformed into a democratic one, with or without violent revolution. One could not imagine this happening in a totalitarian regime. This was the perspective that seemed reasonable until a few years ago. At that time, we had the examples of Europe’s southern tier to support this view.
But just as our conviction of the inevitably tight connection of Communist parties under central Russian leadership broke up in the post-World War II years, as first Yugoslavia, then China, took an independent course, and as various Communist parties followed one or another of these examples, so, too, our conviction that Communism can never turn into democracy is now somewhat shaken. An initial shock was that Yugoslavia allowed free emigration. All other Communist countries still routinely prevent their citizens from emigrating. Then there was the case of the Prague Spring in Czechoslovakia in 1968, which suggested that a Communist regime could liberalize itself, though we never had a chance to see if this as yet unheard of development was really possible—Russian troops intervened. And now there is Poland, with the amazing creation of a free trade union that is rather more than a trade union, and a considerable liberalization of press and publication by a frightened regime. Communism, it seems, can evolve away from rule by a tiny, self-perpetuating oligarchy—as has always been true of authoritarianism.
Authoritarianism has gone through its own evolution. We now see (or have become aware of) a degree of cruelty in the treatment of political opponents or suspected political opponents in Latin America that seems unmatched in Communist countries. In the Communist bloc, landlords may be exposed to public denunciation, so-called traitors may be framed in political trials (preceded by torture or drug treatment), and both may be shot thereafter—but the tortures reported from Chile and Argentina do not seem to me, admittedly no expert on the enormous mass of materials describing man’s inhuman treatment of man in both totalitarian and authoritarian countries, to be matched in Communist countries. The issue may not be that authoritarian torturers are more sadistic than Communist ones, but that there is something in the history of some Latin American countries which has encouraged a degree of cruelty that is exceptional. Perhaps the same was true of the Shah’s regime—was it run-of-the-mill authoritarianism, or was it something indigenous to Iranian culture? The great authority on India, A. L. Basham, writes that there is no record of torture in Hindu India; I don’t know if he is right, but it suggests that there may be a cultural factor in torture. Torture was a part of European jurisprudence, and was applied under law, into the 18th century. Torture was part of Chinese jurisprudence. Is there a cultural issue here that we do not fully recognize? I am not suggesting that torture, once established in a society or culture, marks it forever—the English, despite the horrible tortures of the Elizabethan age, evolved in time a most decent and humane treatment of the accused and the condemned. And I am not suggesting that Latin American practice is a direct descendant of the Inquisition. But there is a problem here that transcends, to my mind, the distinction between totalitarianism and authoritarianism.
What consequences do we draw from this? We must express our horror and outrage at such inhuman mistreatment. If the major aim of our foreign policy were to do this, we would have time, I am afraid, for very little else. Perhaps more important, some of these countries which engage in torture, or practices close to torture, are important allies: it is not easy to threaten to withdraw support from South Korea if it does not mend its ways while North Korea is strong, menacing, and totalitarian.
On the other hand, there are countries which have engaged in or tolerated vicious practices which are not important to our national defense, not important for the defense of the core countries that sustain the values of popular government and government under law. What need is there in the case of these countries (Argentina? Chile?) to deal at more than arm’s length with those who sanction torture?
3. The Carter administration embraced the cause of human rights as a key guide to our foreign policy: it cannot be that. We cannot and should not go around the world pinning medals on some countries, recording debits for others. It is truly not our business. If Congress requires the administration to do so, as it does, we should not embrace the requirement enthusiastically. Inevitably, our attitude to any country must depend on the degree to which human rights are respected. But if we make this a determining consideration in our foreign relations, we only encourage the denunciation of those regimes with which we are allied (and of ourselves), however advanced the degree of respect for human rights in such countries may be, simply because they are more open to press and other investigations than are totalitarian states. Faults and blots will be sought out which enable our enemies to cast us in the posture of an abuser of human rights. Thus we endanger important elements in our foreign policy by opening up an angle of attack to which we ourselves have in advance given precedence.
For these reasons, a public emphasis on human rights as a controlling factor in foreign policy was ready for some downgrading when Ronald Reagan took office. On the other hand, just as the Carter regime seemed to suffer from an exaggerated sense of the moral burden placed on it—encouraged by Congress—in grading the entire world on human rights, so the Reagan regime seems to suffer from a similar exaggeration of the requirements of national defense, and rushes about offering arms to any country that can present itself as an enemy of Soviet Russia. If we sent arms to fewer countries, there would be fewer whose human-rights record we would have to apologize for. Under the circumstances, Chile, Argentina, and Pakistan, among many others, seem ideal candidates for American indifference. The first two are hardly threatened by external Communist aggression, and if they were threatened by internal Communist subversion, sending arms to their military rulers would be of little help. Pakistan may present a lesser problem in terms of human-rights justification (though the enemies of its present dictator do not think so), and perhaps a more serious question if one is engaged in the planning of a grand design against potential Soviet aggression. But that question is serious only if our leaders take the position that the entire world is the proper object of American concern, that everything is truly related to everything else, and that everyone who asks for arms and is not strictly a part of the Soviet bloc must be supplied. We must get out of the habit—as a government—of giving marks to every country in the world, whether for human-rights violations or for suitability to receive American arms on concessional terms. Insofar as the Reagan administration seems to have followed in the same business of giving out grades, though for different purposes, I fault it.
It is not necessary for us to insure that every country loves us more (or dislikes us less) than the Soviet Union, is “good” on human rights, or is properly supplied with American arms. The objects of our concern should be narrower. Those countries with which we are closely linked and which abuse human rights should know of our displeasure; just how we show this is a matter to be decided for each country and on each occasion individually. For the rest, we should be neither the world’s moral policeman nor the world’s arms supplier. But that is to define a role for the American government, not the American people. Just as every famine, every earthquake, has occasioned some outpouring of American concern, so, too, every case of abuse of human rights urges some Americans to take action—and quite properly. But it should not be the task of the American government to replace the American people by responding to every act of inhumanity that arouses the concern of Americans. Those aroused, of course, will lobby their government to back them in their concern, just as ethnically identified Americans do. Among all these pressures, it is unfortunately necessary for our government to exercise care in determining where it puts its influence, and exerts its pressure, in improving the human condition. The world, we have discovered, is bigger than even the power or knowledge of our great country can encompass, its dilemmas too various, its problems too resistant to solution. Even in the exercise of virtue, as in the exercise of power, one must learn moderation.
The editors’ questions do not admit of yes-or-no answers. Throughout the American experience, remote and recent, the answers have always been—of course, but. An explanation of why this was so will respond to the first question and simplify the issues of the second and third.
From time to time, ideologues in the United States went all out, on one side or another. Occasionally, professional diplomats made raison d’état absolute, brushing all else aside as sentimental moralism. But urgings that the United States acquire colonies or play the balance-of-power game carried weight only in brief interludes of the past century. The same was true of the contrary idea, that this country bore the sole responsibility for safeguarding human rights. From 1790 onward, some men and women, from time to time, plunged into causes they hoped would hasten the imminent spread around the world of the human rights guaranteed by their own Constitution. They joined Jacobin societies or went off to fight for Greek or Polish independence. But their countrymen were always more cautious. They knew that their own Bill of Rights was subject to frequent changes in emphasis and they neither expected a prompt transformation of ancient institutions elsewhere, nor surrendered hope of regeneration in good time. They made no lunges after absolutes but rested their judgments on the practical choice of alternatives.
The line actually followed was affirmative but with qualifications. Their history as a people, originating in great migrations and revolution, established a commitment to human rights. The cause of America was the cause of all mankind—this was the animating theme of the Revolution. It followed, as a matter of course, that republican diplomacy ought to have a moral quality. But the shapers of American policy knew that actuality was not what they wished it to be. An Old World encompassed a large part of the globe, much of it subject to Oriental despotisms, the balance ruled by kingdoms bound together in networks of dynastic alliances. The new, more fortunate, world of that Pilgrim people, the Americans, for the moment occupied but a corner of the earth and numbered but a tiny fraction of its population. In the long run, some day, they would lead others toward life, liberty, and the pursuit of happiness. But for the moment, it would be ludicrous to pit their own strength against the tyrannies of Asia and Europe or to fancy that the peoples of those unhappy continents would at once throw off the shackles of ignorance and superstition which generations of oppression had fastened on them.
The United States would not, therefore, involve itself in dangerous entangling alliances which it was in any case too weak to influence. But neither would it hold itself aloof. Rather, by peaceful trade and efforts at amelioration, it would help others progress to the point at which they could free themselves. That understanding shaped the way in which citizens of the early Republic applied moral judgments to relations with the outer world. They are still useful.
Within that context the first set of questions posed by COMMENTARY could not arise. A concern for human rights necessarily played a role in foreign policy insofar as American statesmen voiced the ideals expressed in their own Declaration of Independence and Bill of Rights. But there could be no conflict with the national interest as long as consciousness of the limits of power tempered that concern.
After 1914 and even more so after 1941, the context changed. The foreshortened future no longer afforded Americans the luxury of distinguishing between the long-term rhetorical concern for human rights and prudence in immediate action. Now the military and economic power of the United States could swing the balance between freedom and slavery, everywhere. For two decades after the end of World War II the assumption of concordance between national interest and human rights held. At San Francisco, Joseph Proskauer led the American delegation in insinuating into the Charter of the United Nations an expression of concern for those rights that ultimately led to the Universal Declaration of Human Rights.
Confusion followed the erosion of belief in the accord between national interest and human rights. In the past fifteen years Americans have lost the sense of history that once informed them whence they came and whither they were going, and thereby located them in relation to the rest of the world. As a result their judgments are discontinuous, guided by no consistent standards of either national interest or human rights. At the same time they have lost the ability to discriminate between the ideal and the practical, between the ultimately good and the immediately preferable. Facing a resolute antagonist ready to accept any generality in principle, but not to be bound by it in practice, Americans too often fell into the trap of tying their own hands by rules of the game that did not restrain the Soviet Union.
The choice of alternatives. There were human-rights violations under Fulgencio Batista, the Shah of Iran, and Somoza. It was in the national interest of the United States to do what it could to correct abuses, but with an awareness of alternatives. The violations of the successor regimes are far worse, and leverage for correction nonexistent. The unwillingness to weigh genuine alternatives, comprehensible in the instance of Batista, became ludicrous by the instance of Somoza.
Awareness of consequences. The faults close at hand are easily visible, less so those obscured by a curtain of censorship. In 1970 it was easier to report on constraints on freedom of speech or inhumane prisons in South than in North Vietnam, in South than in North Korea. It called for an all-too-rare display of evenhandedness to be fair under those circumstances. Furthermore, Kim Il Sung and Ho Chi Minh could laugh American criticism away. Rhee and Thieu could not. The flat assertion of general principles under those circumstances was disastrous.
National self-determination. Every people deserves a national home. Fine. But what constitutes a people? No states satisfy the national aspirations of Macedonians, Basques, Welsh, Kurds, and Armenians, among others. Why are the Palestinians more worthy? When indeed did the Palestinians define themselves as a people? The answers to such questions are not abstract but practical.
The settler regimes. No principle of equity justifies the universal abstract condemnation of settler regimes which have entered a territory in historic times and developed it. The American Indians who themselves migrated to North America, the African blacks who came south, the Arabs who came west to Algeria and Morocco are all presumed to have a greater claim to the land than the Europeans whose efforts discovered its resources and made it fruitful. Why? Again the ideal fails to supply an answer.
The foregoing considerations expose the futility of devising a typology of totalitarian and authoritarian regimes. To throw together in the latter category one-party states like Mexico and Egypt, tradition-based governments like those of Portugal’s Salazar and Spain’s Franco, military regimes as in Chile and Pakistan is to confuse both the abstract and the practical issue. No country has a perfect record when it comes to human rights—not even the United States or Switzerland or Sweden. To base judgments on a scale of virtue is hazardous: how does one compare South Africa, which degrades blacks by apartheid but maintains a rule of law for whites, with Chad, Ethiopia, and Libya, which degrade everyone by the denial of all rights?
There is only one reliable criterion on which to base action. Romania is totalitarian, Chile authoritarian. Democratic freedom-loving people will not admire either government. But neither is a threat to its neighbors; neither is aggressive in its behavior. We deplore the lack of liberty in the Soviet Union or Nazi Germany. But those regimes affected the American national interest by their aggressions which, ultimately, threatened the world’s peace. The difference between the China of 1961 and 1981 is not so much the alteration of its government as the change in attitudes toward its neighbors.
About Reagan, it is too early to judge. Carter’s record was disastrous not only for the confusion of ideals and reality, but also for sheer ineptitude. To scold the Argentinians while snuggling up to Castro, to frown on the dictator of Nicaragua and smile on the dictator of Panama, to value Saudi oil but not Namibian uranium displayed a level of confusion the new administration will not readily match. Reagan’s team sounds more prudent; time will tell whether its actions are.
To the extent that the defense of democratic institutions and an open society is integral to the preservation of human rights, American foreign policy since World War II, and for many years before that, has always attempted to further human rights. The insistence upon free and unfettered elections in the agreements at Teheran, Yalta, and Potsdam is one piece of supporting evidence. To the extent that free elections rest upon freely given consent, they presuppose freedom of speech, press, and assembly which are necessary conditions of all other assured freedoms. When on the basis of the New York Times’s Cuban correspondent’s misleading account of Castro’s actual strength and ideological commitment, the U.S. State Department withdrew its support of Batista, it expected Castro to live up to his promises and conduct free elections. Even after twenty years of despotic rule, Castro still fears a free and honest election, despite the fact that since his advent to power almost a half-million Cuban citizens who would have voted against him have fled the country.
A profound injustice has been done to Jeane Kirkpatrick and to those who hold views similar to hers on human rights by the likes of Anthony Lewis, Patricia Derian, and other apologists for the Carter administration. Her concern for human rights is every bit as principled and firm as theirs but more intelligent because it is exercised with some consideration for the consequences of the mode and timing of protest against human-rights violations for the whole complex of freedoms that as democrats we seek to further. The easy absolutism that demands official U.S. protest against all violations of human rights anywhere and anytime without reckoning with context, degree, and the relative effectiveness of types and kinds of protest may sometimes worsen the prospects of human freedom. It is a stance whose inconsistency betrays the political animus behind it. After all, those who are decrying Ambassador Kirkpatrick’s position as a reversal of the human-rights position of the Carter administration remained silent when Carter gave full recognition to the Chinese Communist regime, whose own Gulag Archipelago dwarfs in depth, extent, and cruelty the outrages and repressions of the Argentine military regime. The justification for playing the China card currently in the interest of defending the cause of freedom is obvious. But whatever the State Department says or does not say, as citizens of a free society we do not have to forgo for a moment our public criticisms of Chinese practices. And the same goes for Taiwan and South Korea.
The distinction between authoritarian regimes and totalitarian regimes should be clear to even a minimally literate political person. When Jacobo Timerman says in the Jerusalem Post (August 23, 1981) that after reading and talking to Jeane Kirkpatrick and William F. Buckley, Jr., “and still now, and after all that, I don’t know what they mean,” he is being disingenuous. When he adds that Ambassador Kirkpatrick believes that “only authoritarian governments can develop into democracies and totalitarian governments cannot do so—this is a totalitarian way of thinking,” he is being intellectually dishonest. For what Ambassador Kirkpatrick believes and has said is that historically some authoritarian regimes have developed into democracies while until now no totalitarian regime has done so. And more important, that because there are islands of cultural and economic freedom in authoritarian regimes which can become points of infection, it is easier to transform authoritarian regimes into democracies than it is to transform totalitarian ones.
The Wilhelmine empire of Germany, which perpetrated many injustices against socialists and liberals and imprisoned its subjects for lèse-majesté, was authoritarian; Hitler’s regime was totalitarian. I once attended a luncheon of exiled Social Democrats in New York in the late 30’s at which they ironically drank a toast to the golden days of the Wilhelmine empire. Under Kaiser Wilhelm, rebels and dissenters used their heads to write pamphlets when they were in prison. Under Hitler, their heads fell under the ax.
The Czarist regime, which was anti-Semitic and politically repressive, was authoritarian; the Stalinist regime, totalitarian. In the oppressive twenty-year rule of the last Czar, a small number of political prisoners were hanged for acts of violence. Under Stalin’s twenty-year rule, an average of 20,000 persons a month were either shot or perished in concentration camps.
One can and should oppose both authoritarian and totalitarian regimes, but this does not wipe out the distinction between them. To be sure, from the point of view of the innocent victims, the distinction is irrelevant. But from the point of view of policies designed to prevent additional victims, the differences may require different modes and methods of protest. And in those unhappy historical situations where one must choose between supporting an authoritarian or a totalitarian regime, because there are simply no alternatives and because a policy of no-support insures a totalitarian victory, common sense dictates the choice of the lesser evil. My guess is that there were alternatives to tolerating Batista and supporting Castro. But if our choice had been restricted to Batista or Castro, who could deny that democratic forces in Cuba would have found it far easier to defeat Batista?
Only those blinded by partisan passions can fail to see that crucial political choices, like moral choices, are rarely between the simply good and simply bad, or the simply right and the simply wrong. It is or should be a commonplace that our moral choices are made when good conflicts with good, or right with right, or the good with the right. In an imperfect world, existing realities sometimes limit the range of desirable activities for a democratic regime to hard choices among evils. The American colonies in their struggle for independence made common cause with still feudal France—the land of the Bastille—against England whose citizens enjoyed the freedoms which the American colonists were struggling to attain. But that alliance was the price of independence. In its absence, there would have been disastrous defeat.
When Hitler double-crossed Stalin and invaded the Soviet Union in 1941, the United States government, with bipartisan support, sent military and economic aid to the Soviet Union, even though Stalin’s innocent victims at the time were probably more numerous than those of Hitler. Yet there was no outcry from contemporary Anthony Lewises and other absolutists on the occasion! The victory of Hitler was perceived as a grave threat to the complex of human freedoms we were pledged to sustain, and anything that helped us achieve victory over him was sanctioned. Had there been a law on our books then that forbade us from sending military and economic aid to any country that violated basic human rights, we could not have aided the Soviet Union to resist the Nazi invasion. Even earlier, world liberal opinion protested the invasion of Ethiopia by Mussolini’s troops despite the existence of slavery in that country.
If and when the United States government finds it necessary in the national interest or in defense of our strategy of freedom temporarily to aid by economic and military supplies an authoritarian or totalitarian regime, there is no reason for our citizens to cease from public criticism. On the contrary, Americans should seize the occasion to redouble their efforts to expose the violations of freedom in that regime and urge greater respect for human rights. It may not be politic for the American ambassador to do so in a loud voice. There may be other means of suasion. But we as citizens are not bound by diplomatic convention. We can speak our minds about the barbarities of the Argentinian junta even if we know that there are even worse regimes. What is morally impermissible is to deny the truth about the authoritarian or totalitarian practices of those who for the moment may be arrayed on our side against a common enemy.
When Stalinist Russia was our co-belligerent in the war against Hitler, some innocent and not-so-innocent publicists and academic figures pretended that Stalin’s Russia was our democratic ally and that Russia enjoyed a new form of democracy. John Dewey warned in the New York Times that precisely when the Soviet Union needed our help most was the time to lay down the conditions for a democratic peace. He also protested the efforts made by some Hollywood moguls and their fellow-traveling aides as well as our Ambassador to Moscow, Joseph E. Davies, to portray the Moscow “frame-up” trials as legitimate, and a welcome action against Hitler’s fifth columnists. For that, Dewey was defamed by Arthur Upham Pope, Corliss Lamont, and other staunch defenders of Soviet democracy. It is this kind of semantic corruption and degradation of the spirit against which we must be on guard whenever political necessity compels us to have commerce with authoritarian or totalitarian enemies of human freedom.
There is another factor highly relevant to the policy the U.S. government should pursue toward authoritarian and totalitarian regimes. This is whether or not they are expansionist, using force in various dimensions to overrun or destroy the legitimate regimes of other countries, and thereby threaten our vital national interests or weaken the overall position of the free nations in their defensive alliance. From this point of view, Cuba is a greater danger to the free world than is Paraguay, reprehensible as the latter’s regime may be.
Even if it turns out that because of the demographic factor there are more violations of human rights in China than in the Soviet Union, currently the latter is the greater threat to the free world. The situation, of course, may change: Stalin is still more venerated in Red China than in the Soviet Union.
As a free nation we should never fear the challenge of Communism as an ideology. Indeed, we should welcome it, since with respect to human rights and the enhancement of material welfare for the masses Communism is demonstrably inferior. That is why it will never win the uncoerced allegiance of free men and women. The immediate danger, however, comes not from Communist ideology but from the direct and indirect global expansionism of the Soviet Union as evidenced by the incursion of its own forces into Afghanistan and the use of proxy satellite forces in Africa and Asia. I agree with Elias M. Schwarzbart [COMMENTARY, Letters from Readers, August 1981] that if we can contain Soviet expansion, in time Soviet totalitarianism will probably wither on the vine.
The human-rights debate of recent years to which this symposium is directed is not really about which should play the most important role in U.S. foreign policy: human rights or the national interest. It is rather a debate about which policies promote human rights, which regimes threaten them most gravely, which policies actually serve the national interest and how the U.S. national interest should be conceived anyway. Some of us believe that because they seek by violent, repressive means total control over the societies they govern, establish great armies, and pursue aggressive and expansionist foreign policies, Marxist-Leninist states constitute the gravest threat to human rights in the contemporary world. Others, including the human-rights establishment of the Carter period, believe that because authoritarian regimes such as those found in Chile, Argentina, and Uruguay tolerate social injustice and sometimes use violence arbitrarily, they perpetrate the gravest offenses against human rights. Involved here are different assessments of practices (which type of regime in fact imprisons, enslaves, tortures, kills most people?); different assessments of the future (which type of regime is most susceptible of liberalization and democratization?); different views of the U.S. national interest (is the establishment of new Marxist-Leninist regimes compatible with our national interest?); different views about the relation of U.S. strength to human rights (is freedom safer if we are strong, or does that matter?); and perhaps most basic of all, different views about the relations between state and society.
All of these questions must be considered if we are to confront seriously COMMENTARY’s questions. The first of these questions is the easiest: not only should human rights play a central role in U.S. foreign policy, no U.S. foreign policy can possibly succeed that does not accord them a central role. The nature of politics and the character of the United States alike guarantee that this should be the case.
Politics is a purposive human activity which involves the use of power in the name of some collectivity, some “we,” and some vision of the collective good. The collective may be a nation, class, tribe, family, or church. The vision of the public good may be modest or grand, monstrous or divine, elaborate or simple, explicitly articulated or simply “understood.” It may call for the restoration of the glory of France; the establishment of a Jewish homeland; the construction of a racially pure one-thousand-year Reich; the achievement of a classless society from which power has been eliminated. The point is that governments act with reference to a vision of the public good characteristic of a people. If they are to command popular assent, important public policies must be congruent with the core identity of a people. In democracies the need for moral justification of political action is especially compelling—nowhere more so than in the United States. The fact that Americans do not share a common history, race, language, or religion gives added centrality to American values, beliefs, and goals, making them the key element of our national identity. The American people are defined by the American creed. The vision of the public good which defines us is and always has been a commitment to individual freedom and a conviction that government exists, above all, for the purpose of protecting individual rights. (“To protect these rights,” says the Declaration of Independence, “governments are instituted among men.”) Government, in the American view, has no purpose greater than that of protecting and extending the rights of its citizens. For this reason, the definitive justification of government policy in the U.S. is to protect the rights—liberty, property, personal security—of citizens. Defending these rights or extending them to other peoples is the only legitimate purpose of American foreign policy.
From the War of Independence through the final withdrawal from Vietnam, American Presidents have justified our policies, especially in time of danger and sacrifice (when greatest justification is required), by reference to our national commitment to the preservation and/or extension of freedom—and the democratic institutions through which that freedom is guaranteed. Obviously, then, there is no conflict between a concern for human rights and the American national interest as traditionally conceived. Our national interest flows from our identity, and our identity features a commitment to the rights of persons. (Conventional debate about whether foreign policy should be based on “power” or morality is in fact a disagreement about moral ends and political means.)
It is true that the explicit moral emphasis in presidential pronouncements on U.S. foreign policy had declined in the decade preceding Jimmy Carter’s candidacy, partly because of the diminishing national consensus about whether protecting human rights required (or even permitted) containing Communism even through war, and partly because of concern that moral appeals would excite popular passions and complicate the task of limiting the war in Vietnam. It is also true that Jimmy Carter shared this reticence and only reluctantly—and in response to pressure from Senator Henry Jackson—incorporated the human-rights theme into his presidential campaign.
Almost immediately, however, it became clear that the human-rights policies expounded and implemented by Jimmy Carter were different in their conception and their consequences from those of his predecessors. The cultural revolution that had swept through American cities, campuses, and news rooms, challenging basic beliefs and transforming institutional practices, had as its principal target the morality of the American experience and the legitimacy of American national interests. It was, after all, a period when the leading columnist of a distinguished newspaper wrote: “The United States is the most dangerous and destructive power in the world.” It was a time when the president of a leading university asserted: “In twenty-six years since waging a world war against the forces of tyranny, fascism, and genocide in Europe we have become a nation more tyrannical, more fascistic, and more capable of genocide than was ever conceived or thought possible two decades ago. We conquered Hitler but we have come to embrace Hitlerism.” It was the period when a nationally known cleric said: “The reason for the paroxysm in the nation’s conscience is simply that Calley is all of us. He is every single citizen in our graceless land.”
If the United States is “the most destructive power in the world,” if we are “capable of genocide,” if we are a “graceless land,” then the defense of our national interest could not be integrally linked to the defense of human rights or any other morally worthy cause.
The cultural revolution set the scene for two redefinitions: first, a redefinition of human rights, which now became something very different from the freedoms and protections embodied in U.S. constitutional practices; and second, a redefinition of the national interest which dissociated morality and U.S. power.
As long as the United States was perceived as a virtuous society, policies which enhanced its power were also seen as virtuous. Morality and American power were indissolubly linked in the traditional conception. But with the U.S. defined as an essentially immoral society, pursuit of U.S. power was perceived as immoral and pursuit of morality as indifferent to U.S. power. Morality now required transforming our deeply flawed society, not enhancing its power.
In the human-rights policies of the Carter administration, the effects of the cultural revolution were reinforced, first, by a secular translation of the Christian imperative to cast first the beam from one’s own eye, and, second, by a determinist, quasi-Marxist theory of historical development. The result was a conception of human rights so broad, ambiguous, and utopian that it could serve as the grounds for condemning almost any society; a conception of national interest to which U.S. power was, at best, irrelevant; and a tendency to suppose history was on the side of our opponents. (Of course, the Carter administration did not invent these orientations; it simply reflected the views of the new liberalism that was both the carrier and the consequence of the cultural revolution.)
Human rights in the Carter version had no specific content, except a general demand that societies provide all the freedoms associated with constitutional democracy, all the economic security promised by socialism, and all the self-fulfillment featured in Abraham Maslow’s psychology. And it assumed that governments were responsible for providing these. Any society which did not feature democracy, “social justice,” and self-fulfillment—that is, any society at all—could be measured against these standards and found wanting. And where all are “guilty,” no one is especially so.
The judicial protections associated with the rule of law and the political freedoms associated with democracy had no special priority in the Carter doctrine of human rights. To the contrary, the powerful inarticulate predisposition of the new liberalism favored equality over liberty, and economic over political rights; socialism over capitalism, and Communist dictatorship over traditional military regimes. These preferences, foreshadowed in Carter’s Notre Dame speech, found forthright expression in the administration’s human-rights policy. UN Ambassador Andrew Young asserted, for example: “For most of the world, civil and political rights . . . come as luxuries that are far away in the future,” and he called on the U.S. to recognize that there are various equally valid concepts of human rights in the world. The Soviets, he added, “have developed a completely different concept of human rights. For them, human rights are essentially not civil and political but economic. . . .” President Carter, for his part, tried hard to erase the impression that his advocacy of human rights implied an anti-Soviet bias. “I have never had an inclination to single out the Soviet Union as the only place where human rights are being abridged,” he told a press conference on February 23, 1977. “I’ve tried to make sure that the world knows that we’re not singling out the Soviet Union for criticism.” In Carter’s conception of the political universe, strong opposition to Marxist-Leninist totalitarianism would have been inappropriate because of our shared “goals.” On April 12, 1978, he informed President Ceaușescu of Romania that “our goals are also the same, to have a just system of economics and politics, to let the people of the world share in growth, in peace, in personal freedom.”
It should not be supposed that under Carter no distinction was made between totalitarian and authoritarian regimes—for while the Carter administration was reluctant to criticize Communist states for their human-rights violations (incredibly, not until April 21, 1978 did Carter officials denounce Cambodia for its massive human-rights violations), no similar reticence was displayed in criticizing authoritarian recipients of U.S. aid. On the basis of annual reports required by a 1976 law, the Carter administration moved quickly to withhold economic credits and military assistance from Chile, Argentina, Paraguay, Brazil, Nicaragua, and El Salvador, and accompanied these decisions with a policy of deliberate slights and insults that helped delegitimize these governments at the same time it rendered them less open to U.S. influence.
President Carter’s 1977 decision to support the mandatory UN arms embargo against South Africa; Secretary Vance’s call, before a meeting of the Organization of American States in June 1979, for the departure of Nicaragua’s President Somoza; the decision in 1979 to withhold U.S. support from the Shah of Iran; and President Carter’s decision, in June 1979, not to lift economic sanctions against the Muzorewa government in Zimbabwe Rhodesia expressed the same predilection for the selective application of an “absolute” commitment to human rights.
Why were South American military regimes judged so much more harshly than African ones? Why were friendly autocrats treated less indulgently than hostile ones? Why were authoritarian regimes treated more harshly than totalitarian ones? Part of the reason was the curious focus on those countries that received some form of U.S. assistance, as though our interest in human rights were limited to the requirements of the 1976 Foreign Assistance Act; and part of the reason was the exclusive concern with violations of human rights by governments. By definition, guerrilla murders did not qualify as violations of human rights, while a government’s efforts to eliminate terrorism qualified as repression. This curious focus not only permitted Carter policy-makers to condemn government “repression” while ignoring guerrilla violence, it encouraged consideration of human-rights violations independently of their context.
Universal in its rhetoric, unflagging in its pursuit of perceived violations—“I’ve worked day and night to make sure that a concern for human rights is woven through everything our government does, both at home and abroad” (Jimmy Carter, December 15, 1977)—the Carter human-rights policy alienated non-democratic but friendly nations, enabled anti-Western opposition groups to come to power in Iran, and totalitarians in Nicaragua, and reduced American influence throughout the world.
The Carter administration made an operational (if inarticulate) distinction between authoritarianism and totalitarianism and preferred the latter. The reason for its preference lay, I believe, not only in the affinity of contemporary liberalism for other secular egalitarian development-oriented ideologies (such as Communism) but also in the progressive disappearance from modern liberalism of the distinction between state and society. The assumption that governments can create good societies, affluent economies, just distributions of wealth, abundant opportunity, and all the other prerequisites of the good life creates the demand that they should do so, and provokes harsh criticism of governments which fail to provide these goods. The fact that primitive technology, widespread poverty, gross discrepancies of wealth, rigid class and caste structures, and low social and economic mobility are characteristic of most societies which also feature authoritarian governments is ground enough for the modern liberal to hold the existing governments morally responsible for having caused these hardships.
The same indifference to the distinction between state and society also renders the new liberals insensitive to the pitfalls and consequences of extending the jurisdiction and the coercive power of government over all institutions and aspects of life in society. It is, of course, precisely this extension of government’s plans and power over society, culture, and personality that makes life in totalitarian societies unbearable to so many. Authoritarian governments are frequently corrupt, inefficient, arbitrary, and brutal, but they make limited claims on the lives, property, and loyalties of their citizens. Families, churches, businesses, independent schools and labor unions, fraternal lodges, and other institutions compete with government for loyalties and resources, and so limit its power.
Authoritarian governments—traditional and modern—have many faults and one significant virtue: their power is limited and where the power of government is limited, the damage it can do is limited also. So is its duration in office. Authoritarian systems do not destroy all alternative power bases in a society. The persistence of dispersed economic and social power renders those regimes less repressive than a totalitarian system and provides the bases for their eventual transformation. Totalitarian regimes, to the contrary, in claiming a monopoly of power over all institutions, eliminate competitive, alternative elites. This is the reason history provides not one but numerous examples of the evolution of authoritarian regimes into democracies (not only Spain and Portugal, but Venezuela, Peru, Ecuador, Bangladesh, among others) and no example of the democratic transformation of totalitarian regimes.
Authoritarian governments have significant moral and political faults, all the worst of which spring from the possession of arbitrary power. But compared to totalitarian governments, their arbitrary power is limited. Only democracies do a reliable job of protecting the rights of all their citizens. That is why their survival must be the first priority of those committed to the protection of human rights.
The restoration of the subjective conviction that American power is a necessary precondition for the survival of liberal democracy in the modern world is the most important development in U.S. foreign policy in the past decade. During the Vietnam epoch that subjective link between American power and the survival of liberal democratic societies was lost. Its restoration marks the beginning of a new era.
The first implication of that fact is that human-rights policies should be and, one trusts, will be, scrutinized not only for their effect on other societies but also for their effect on the total strategic position of the United States and its democratic allies—not because power is taking precedence over morality, but because the power of the U.S. and its allies is a necessary condition for the national independence, self-determination, self-government, and freedom of other nations. The human-rights policy of the Reagan administration has not been fully articulated, but the myriad concrete decisions made so far suggest that it will manifest the following characteristics:
First, clarity about our own commitment to due process, rule of law, democratic government and all its associated freedoms.
Second, aggressive statements in information programs and official pronouncements of the case for constitutional democracy. As the party of freedom we should make the case for freedom by precept as well as by example.
Third, careful assessment of all relevant aspects of any situation in another country in which we may be tempted to intervene, symbolically, economically, or otherwise. In Poland as in El Salvador we should be careful neither to overestimate our power to shape events according to our own preference, nor to underestimate the potential negative consequences of our acts.
Finally, a steady preference for the lesser over the greater evil.
Such policies will not make a perfect world, but at least they will not make the lives of actual people more difficult or perilous, less free than they already are. Conceivably, they might leave some people in some places more secure and less oppressed than they are today.
While Jeane Kirkpatrick has brilliantly broken the mold that had settled around liberal thinking on dictatorships and human rights, my own approach is from a somewhat different angle of vision. I stress the sources from which attitudes and doctrines derive, including the concepts of authority, the social organism, and the social contract. I also stress the climates, mystiques, and myths that move the intellectual elites both in the West and the Third World, and how policy might take account of them.
It will be a long struggle to influence the intellectual climate, and just possibly it can be done. But it can’t be done without digging to the intellectual roots of the prevailing doctrines. Herewith some too summary reflections on them.
On elites. To rely on the military and governing elites of authoritarian regimes, in Latin America and elsewhere—as the Reagan administration is now doing—is probably a necessary makeshift tactic but inadequate for the long run. If Bonapartism is the model, we must recall the obvious—that Napoleon was a genius, and that he had a new business elite and a nationalist intellectual elite supporting him. If Cromwell is the model, we must also recall that he had a new anti-establishment confessional elite supporting him, along with its followers among the people.
This isn’t true of the military elites in Latin America, or of the civilian governing elites which rely on them. In both cases the strength of the regimes lies in police and arms, their weakness lies in their isolation from other elites and often from the people. True, in much of Latin America the military career is likely to be the one most open to talent, and draws upon some of the ablest energies among the people. But more generally the isolation of the military, and the pressures upon it when it tries to govern, are brutalizing in their effect.
This can be best relieved if the technical and confessional elites can be reached by an appeal to the common national interest. Both of them, along with the intellectual elite, are part of a New Class in developing economies. The power they wield, unlike that of the military, is over the mind and conscience of the people and over the way they make their living. Without reaching out to these elites, the military governs in a psychological and moral vacuum.
In the U.S. the intellectual elite has grown sterile and precious by an increasing isolation from the people. In Latin America it has stayed closer to the people, in part to propagandize them, in part because of a mystique of the “People” which is linked with the revolutionary mystique. Something of the same is true of the younger and more radical Catholic priests.
It is not true of the growing technical elite. Where once the army and clergy represented the careers open to talent, today the technical career has joined and may outrun them. The technical elite is most closely linked with the business elite and the middle classes. It is inherently fact-minded and Western-oriented. No governing elite—military or civilian—can succeed without using the technicians as a bridge between itself and the intellectual and confessional elites. To the extent that it does succeed, it may loosen the hold of the left-wing mystique upon the university and the church.

On authority and power. It follows that any authoritarian government which is isolated from church, university, businessmen, technicians, and the middle class will lack true authority. I use the term to distinguish it from the power concept. Power goes with the office—with those who control the means of codes and coercion. Authority goes with the credibility of the holders of power, which depends in part on their linkage with institutions, groups, and symbols that evoke belief.
Jeane Kirkpatrick’s “Dictatorships & Double Standards” [COMMENTARY, November 1979] is bound to become a classic analysis of the dynamics of political change in traditional and semi-traditional societies. It is worth noting that, in terms of theory, she has the hardest time with the authority concept. She doesn’t introduce it until toward the end of her article, when she first speaks of traditional authoritarian government as “less repressive” than “revolutionary autocracies . . . more susceptible of liberalization . . . more compatible with U.S. interests.” Until that point she has mostly termed such regimes “personal . . . autocracies.”
Perhaps the real contrast is between the “totalitarian” and the “personal” dictatorships. In both cases the problem is to develop and maintain genuine authority. When it is undercut, in the case of personal dictatorships, everything crumbles, including the fealty of the army and police. In the totalitarian case the impersonal power of the party, police, army, and propaganda agencies is bolstered by the power of other Communist regimes, as with the Soviet intervention in Afghanistan (and earlier in Hungary and Czechoslovakia) and the Cuban intervention in Angola.
On the organismic. Jeane Kirkpatrick is right about the slow growth of the “political culture” which a democracy must develop, and which a guerrilla revolution cuts off prematurely. This means seeing societies as organisms, in the sense in which Burke and de Tocqueville saw them—fragile, vulnerable, needing continuity and growth, all too capable of being snuffed out by destructive elements from within or without. The lives of ordinary people—in El Salvador, in Guatemala—are today caught in the conflict between government and guerrillas, which they are helpless to resolve but which their communities as organisms cannot endure. It should be the aim of American policy to help restore conditions for organismic growth and for a continuing social contract.
On the social contract. Every human organism—social as well as individual—exists by virtue of a continuing social contract, an equilibrium between individual aggressiveness and shared purposes and meanings. Hobbes and Rousseau, so different otherwise in their thinking, both understood this, and if we abandon their insight it is at our peril.
Hobbes had a nightmare vision of what follows a savaging of the public order, when the pre-contract wolfishness of man’s nature (homo homini lupus) is recreated. If liberals today recoil from Hobbes as a conservative, they can turn to Rousseau, a radical who believed in man as perfectible under a civil order.
In personal dictatorships there is a rudimentary contract, with most of the advantages on the side of the governing elite, a kind of centralized feudalism. Yet it is capable of evolving into the beginnings of a democracy, with authority transferred from the person of the dictator to impersonal institutions. In totalitarian dictatorships such evolution is ruled out. The idea of stasis and permanence is built into the party, the politburo, the ideology and its guardians in the police and army.
In time we shall have to evolve a more extended social contract which will save mankind from its global civil war and a pre-contract nuclear savaging. It will take a long time to form a larger moral community, going beyond nations, if it can ever be done. But a first step must be the assurance of civil order within nations, against the destabilizing impact of created insurgencies.
On the revolutionary mystique. Those who help create the insurgencies have on their side the revolutionary mystique, a strong elemental force whose history James Billington has recounted in his Fire in the Minds of Men. Jeane Kirkpatrick recognizes the outer strength of this mystique, but I wish she were, realistically, readier to recognize its inner strength.
Its roots lie deep in the history of Western ideas and go back even beyond the 18th-century Enlightenment and its utopias and its American and French Revolutions, back to the Christian allegory of the martyrdom of God and the redemption of man, back to Plato’s dialectic which Marx cleverly stood on its head, back even farther (in Freud’s speculations on pre-history) to the alienation of the sons and the revolt against the fathers, back to human discontents and suffering and to the rankling sense of human injustice.
I say this in the interest of some depth of understanding in our effort to limit Soviet expansionism and keep the societies of America’s most vulnerable allies from unraveling. To confront Soviet power is one thing, to confront a worldwide revolutionary mystique quite another. If the fire is in men’s minds, then the counter-fire must also be in men’s minds. How to light that counter-fire and keep it burning will be the problem of the rest of the century.
On human rights. In the Jacobo Timerman case I felt that the arrest, imprisonment, and torture were an outrage and that there could be no silence and no “quiet diplomacy” about it. I still feel that way. Timerman was no insurgent, and the charge that he was aiding the guerrillas was shadowy at best and should have been handled by legal process. When torture is used, no matter by whom, we cannot be quiet.
In one sense the insurgents have the advantage on human rights. Both they and the governments they seek to overthrow mount a campaign of terror and counter-terror against each other. But the kidnappings and killings by insurgents are seen as part of the “revolution,” while those by the government are seen as violations of human rights. It is a built-in double standard and is likely to remain so. The contradiction is that both human rights and the right of revolution have become absolutes and universals, while the right of imperfect governments to survive is seen as a matter of opportunism.
In policy terms we all operate in an imperfect world, not one of absolutes. There is no moral imperative for America to undercut its allies, even when they are dictatorships. There are no “inevitable” revolutions. The Iranian revolution, whose consequences have proved so devastating, didn’t have to take place. The fact that it did was largely due to American blunders and the American policy climate.
Climates and symbols. My overall thesis is that we are in fact in the midst of a war, but that while military power is indispensable it is far from being the heart of the matter. The decisive power is in ideas, using the term in its broadest sense to include how we view our world, our nation and culture, ourselves, and how our adversaries and allies in turn view us.
This puts a premium on the climate of ideas in America and the West—and that we can do something about. Jeane Kirkpatrick has already had an impact on the climate by demystifying some aspects of the revolutionary mystique, and by hacking away some of the jungle growth surrounding the hard ground of policy decisions. Yet it remains true that we have shown few skills and have had scanty leadership in the undeclared war of ideas and symbols. Many of us—including much of the intellectual elite—have not even caught on to our being in the midst of it, and how much of a life-and-death struggle it is.
Seymour Martin Lipset:
American foreign policy has been influenced through much of our history by an emphasis on moralism, by the insistence on the part of many Americans that we should do what is right, even when to do so conflicts with national self-interest. The American focus on moralism derives from the fact that the United States is the only country in the world where the majority has adhered to Protestant sects which have remained independent of the state, economically and theologically. Whereas Catholic, Anglican, Lutheran, and Orthodox state churches in other societies once taught that throne and altar were mutually supportive, that the interests of the rulers were also those of the church, the doctrines of the Methodists, Baptists, and myriad other sects in America have always implied that their parishioners should follow their own conscience rather than the policies of the state.
The emphasis on doing what is moral has fostered the doctrine of conscientious objection, non-recognition of evil states, and unconditional surrender of adversaries. Protestant sectarians who regard a war as wrong are morally obligated to oppose it, even after war has been declared. There has been sizable opposition to every war this nation has been in, with the exception of World War II, the only one initiated by an attack on American soil. Some of the New England states threatened to secede in an effort to stop the War of 1812. During the Mexican War, thousands of American soldiers actually deserted and joined the Mexican army because they thought the Mexicans were right and we were wrong. Both North and South faced large-scale internal opposition during the Civil War. World War I witnessed hundreds of thousands of conscientious objectors, and the Socialist party, opposed to the war, secured its largest vote ever (around 20 percent) in the election of 1917.
The concept of unconditional surrender is the other side of the coin. When we go to war, we do so for moral reasons; we fight for right against evil, for God against Satan. In a conflict between right and wrong, compromise is not possible. The only acceptable outcome is the total defeat of the Satanic enemy, i.e., unconditional surrender. Similarly, the peculiarly American refusal to recognize immoral states is a moralistic response, reflecting the values of evangelical sectarian Protestantism. Nations dominated religiously by erstwhile state churches, like the Catholic or Anglican, have been more tolerant of the failings of inherently imperfect people and societies and more open to recognizing and working with “immoral” states; e.g., conservative governments in Britain, France, and Spain recognized Communist regimes long before the United States did.
The real world, of course, cannot be packaged neatly into a conflict between the good guys and the bad guys. When we go to war or engage in other forms of international conflict, we need allies, we must work with the enemies of our enemies. And these often are morally imperfect. The American solution has been to perceive these allies as virtuous states. Thus, before and during World War II, many anti-fascists chose to see Stalin and the Soviet dictatorship as benign and progressive, as moving toward democracy. During the war, the Reader’s Digest published an article by Eddie Rickenbacker praising “Uncle Joe” (Stalin) and noting signs that capitalism and a free economy were emerging in the Soviet Union. Chiang Kai-shek and Marshal Tito were both described as democratic nationalists, as their nations’ equivalents of George Washington.
Repression, persecution, authoritarianism characterized the Axis powers, not America’s allies. Hence the shock and puzzlement after 1945 when portrayals of our past allies changed in tandem with new lines of international cleavage.
A high-ranking administration official recently told me that Ronald Reagan has been able to reconcile his belief that Communist rule is evil with our de facto alliance with the People’s Republic of China by holding that China is no longer dedicated to Communism. The source of this judgment is a private statement by a major Chinese leader that Communism does not work, that it is a failure both in the Soviet Union and China, and that the latter will gradually move toward a market economy.
The current controversy about authoritarian and totalitarian regimes is part of an effort to accommodate moralistic orientations toward foreign policy. Critics of an active interventionist policy that seeks to halt or reverse Soviet expansionism argue that we are not being moral, that we have become selective in our opposition to inhumane or dictatorial systems. If our concern is to extend or protect human rights, then we should be as opposed to authoritarianism in Argentina as in East Germany, in Chile as in Cuba, in South Africa as in Angola, in Saudi Arabia as in Libya. The supporters of the policy reply by pointing to differences between totalitarianism, a term that describes fascist and Communist systems, and authoritarianism, a concept applied to more loosely integrated undemocratic societies, like many of our non-Communist Third World allies. The latter are not as oppressive and are more likely than not to be on the road to democracy.
The distinction between totalitarianism and authoritarianism is clearly a useful one. Perhaps the best indicator of the difference was noted by a Communist ruler, Janos Kadar of Hungary, who stated, “He who is not against us is with us,” presumably in contradistinction to pre-1956 rule in Hungary, which still operates in other Communist countries where “He who is not with us is against us.” Totalitarian systems like Nazi Germany, Stalinist Russia, and Maoist China devoted considerable energies to educating or forcing their population to support the party and regime publicly through attendance at meetings, public statements, votes in elections, etc. In a totalitarian system, no one is allowed to abstain.
Authoritarian systems, however, as Kadar has implied, do not demand public affirmation; what they require is the absence of anti-regime opposition. Private dissent is tolerated. People are not required to report criticism of the regime, and freedom of speech may even exist within the framework of independent organizations. Franco Spain, Argentina and Chile today, Communist Hungary, Poland, and Yugoslavia all have permitted a certain amount of freedom of dissent as long as it takes place outside the framework of organized political opposition.
The distinction between totalitarianism and authoritarianism does not, therefore, correspond to that between Communist and non-Communist autocratic systems. Some Communist nations are clearly authoritarian. Poland has permitted personal liberty and even, at various times, considerable freedom of public expression, ever since the Poznan strikes of 1956. Solidarity emerged out of a situation in which samizdat publications, satirical political cabarets, criticism of the regime in the streets and in private discussion groups were common.
The Soviet Union itself is less repressive than it was in Stalin’s day, a fact reflected not only in greater freedom in private discussions, but in the arts, and in intercourse with foreigners. E. P. Thompson, the British Marxist historian and leader of the Western anti-nuclear-weapons protest, complained after a recent visit to the Soviet Union that the intellectuals he talked to there sounded like Ronald Reagan. If this description of their views, as expressed to Thompson, is even partially accurate, it certainly rules out the continued categorization of the Soviet Union today as a totalitarian state, though it clearly remains much more repressive than Hungary, Poland, and Yugoslavia.
What those who perceive Communist states as morally worse than non-Communist authoritarian nations really mean is that the Soviet Union and its allies are expansionist and constitute a military threat to others, while contemporary non-Communist dictatorships, Libya apart, are not concerned with expanding their control. They further note that non-Communist autocracies are politically unstable, frequently experience a change of regimes, and on occasion become democratic, as in the recent examples of Greece, Portugal, and Spain, while no country, once Communist, has been allowed to shed the system. These points are valid, but they have less to do with totalitarianism than is sometimes supposed.
Communist countries are as likely as other dictatorial systems to experience tensions that produce challenges to the political control of the governing elite. Within the scope of continued allegiance to Communism, and the maintenance of a one-party system, much variety is possible. What is different about Communism is the Brezhnev doctrine (in place long before he enunciated it) which does not allow a Communist state to become democratic, while sustaining the efforts of Communist movements to overthrow non-Communist regimes. The Soviet Union openly proclaims that it will intervene to prevent any country, once Communist, from changing to a non-Communist system. It did so in East Germany in 1953, in Hungary in 1956, in Czechoslovakia in 1968, and in Afghanistan in 1980-81. (Many forget that the Soviet action in Afghanistan is not an effort to impose a Communist regime on that country, but rather an effort to preserve Communist rule there.)
Reactionary, repressive authoritarian regimes have the capacity to change their basic political character, which can include becoming anti-American. But though Communist systems may modify their institutions, they may not give up Communism. Hence, as long as the Brezhnev doctrine is applied, we have no option but to resist Communist expansionism, to give aid and counsel to every regime threatened by Communist takeover.
The insistence that Communism is totalitarian and American-linked dictatorships are authoritarian may help preserve a moral distinction between our friends and enemies but it is inaccurate. Some non-Communist regimes in Africa, Asia, and Latin America are more repressive than some Communist ones in Eastern Europe. We gain nothing by denying this.
Should the United States support human rights, and oppose oppression in non-Communist societies? The answer, of course, must be yes. But to do so does not mean that we should ostracize such countries or that we should refuse to help them against Communist attack or subversion. We were allied to Great Britain and France while giving help and comfort to anti-imperialist independence movements in their colonies. Many dissidents in Chile, Argentina, and various African countries have attested to the help in gaining their release from prison or in securing better treatment that resulted from American intervention or public statements.
The trouble with this policy is that it goes against the dictates of American moralism which requires that if a country does evil we should reject it. But Americans are learning that there are limits to their country’s power and that they cannot refuse to recognize or deal with immoral states. What is needed is a practical moralism, a commitment to democracy and human rights which is tied to national interest. The two elements, of course, are reinforcing, for the more widespread democracy, the stronger America, but equally, the stronger the United States, the greater the chances to expand democracy and human rights elsewhere.
Charles William Maynes:
Much of the angry debate over U.S. human-rights policy overlooks one obdurate fact: America is a liberal country. It is not liberal in the sense that conservatives always lose elections. Numerous elections, including those in 1980, have shown that to be false. America is liberal in the sense that even conservative administrations are under pressure to pursue liberal political values.
America’s behavior throughout the 20th century demonstrates just how strong the American liberal tradition is. Repeatedly, the country has been willing to sacrifice quite concrete commercial or security interests in order to respond to its liberal tradition. In 1911, when big business dominated American political life in a way it has seldom done since, the United States nonetheless abrogated its commercial treaty with Czarist Russia because of American outrage over that regime’s treatment of its Jewish population. In the early 1920’s, the vehemently anti-Communist Harding administration undertook a massive food program to feed the starving Russian people even though that move helped to save the new and hated Bolshevik regime. Under President Carter, although the U.S. relationship with Vietnam was one of intense hostility, the United States provided food to millions of starving Cambodians, a step that meant propping up the Vietnamese-supported puppet regime in Phnom Penh.
The existence of the liberal tradition does not mean that the U.S. always has liberal policies. It does mean that a foreign policy that is in flagrant conflict with that tradition is in trouble. The Reagan administration has recognized this point by shifting its stance on human rights. Although it earlier attempted to draw a distinction between human-rights abuses committed by authoritarian regimes and those committed by totalitarian regimes, it now contends it will have a single standard for all countries.
In short, the American liberal tradition of interest in the human rights of others is deeply rooted in the American body politic. It has manifested itself repeatedly throughout our history in both Republican and Democratic administrations. It is in this regard that Americans—whether conservative or radical—are in the end liberal.
Even the heated debates over the U.S. human-rights policy that have taken place in COMMENTARY are a tribute to the strength of the U.S. liberal tradition. Many of COMMENTARY’s authors want policy results different from those suggested by that tradition. But they are reluctant to call openly for a departure from that tradition. To defend unpopular recommendations, they are forced to argue counter-intuitively that in the Third World the best way to pursue democratic liberties is not to strike out for them directly but to support authoritarian regimes that allegedly will evolve in a democratic direction. Even if the immediate policy recommended violates the American liberal tradition, in other words, the underlying message is that the final result will conform to that liberal tradition.
The traditional American attitude toward human rights has acquired a new contemporary potency, however. The reason is modern-day ethnic politics. Today there is scarcely a nation on earth without some of its citizens or their descendants living in the United States. And in our system of government, with its checks and balances and with the unique power our Congress enjoys in the field of foreign policy, the more significant groups have had and will continue to have a major voice in the development of American foreign policy. In particular, they will be very concerned about the degree of political and economic welfare of their former countrymen or co-religionists. Inevitably, they will seize on the emotive power of the American liberal tradition and its support for democracy and the human rights that flow from that system of government to buttress their concern. Convincing other Americans that the issue is not simply a form of tribal loyalty to Israel or Cyprus or black South Africans but a form of liberal concern for democracy, self-determination, or common decency can only broaden the base of national support.
Can this approach lead to a conflict with U.S. national interest? The answer depends on the time-frame through which one is viewing the national interest. Certainly in the short run the conflict can be severe. The U.S. concern with human rights in the Soviet Union has troubled sensitive negotiations with that country in recent years. When non-Jewish Americans have based their support for Israel on the issue of self-determination and democracy, U.S. relations with oil-producing Arab states have been affected. Relations with South Africa have become increasingly strained because of U.S. attitudes toward the inhumane treatment of blacks in that country. Our bases in Turkey were closed down temporarily because we opposed Turkish suppression of self-determination in democratic Cyprus. Our influence with Argentina has fallen because of opposition to government-sanctioned slaughter of dissidents, real and imagined, in that country.
But those who shake their heads at this price in the American approach to foreign policy should ask themselves: what kind of foreign policy would we end up with over the long run if we were to follow the approach of clear-headed Realpolitik they advocate? Isn’t our aim a policy that serves our interests and that commands popular support? And in that regard can one imagine the American people over the long run ever supporting a policy toward the Soviet Union that overlooks completely the fate of communities inside the Soviet Union that have so many ties to communities inside the United States? As long as we have a free press, could a policy of Realpolitik toward South Africa or Guatemala long survive the continued shocks of the exposé of one human-rights outrage after another? Could any relationship with the Arab world be healthy that did not reflect the strong American support for a Jewish people expressing its democratic right of self-determination?
The reality for American foreign-policy “realists” is that their fellow citizens will not support a foreign policy over the long run that offends too frontally the American liberal tradition. Indeed, this is why the Begin government’s attitude toward the Palestinians is so critical. For it is not clear that the traditionally warm relationship between the United States and Israel can survive the incorporation into Israel proper of the West Bank, with permanent political repression or expulsion of the Arab majority living there.
Given the American attitude, how should the U.S. handle the hard realities of international politics? In the short run the U.S. should deal on a pragmatic basis with both totalitarian and authoritarian regimes to protect U.S. security and welfare. It should buy key minerals from authoritarian South Africa. It should assist totalitarian China, at least with economic aid, to stand up to the Soviet Union. But over the longer run it must be opposed to the political system of both authoritarian and totalitarian regimes, and it should not hesitate to say so. Our people will reject any short-run policy that ignores this long-run American preference. Foreign policy is basically the effort to manage the resulting tension between short-run policy needs and long-run policy preferences.
This observation about tension in any foreign policy is relevant to the contention that somehow authoritarian governments are better than totalitarian governments. Viewed closely, some of the distinctions drawn between the two seem weak at best. For example, it is not at all clear that one is more likely than the other to evolve in directions that we would like to see. There have been repeated efforts to gain political freedom in totalitarian Eastern Europe. Is it not likely that one day they will succeed? Would they not have succeeded already except for the intervention of the Soviet army, which may not be able to move so easily into non-contiguous areas?
Nor are all totalitarian states always more bloody than all authoritarian states. Few places have been more bloody than Guatemala in recent years.
Another major problem with the asserted distinction between authoritarianism and totalitarianism is that both labels cover too vast a spectrum of countries to be meaningful. Is Mexico, authoritarian but relatively benign, to be placed in the same category as authoritarian El Salvador, in which political opponents are hunted down like some tagged member of the animal kingdom? If we accept, as many who draw this distinction do but I would not, that Communist states cannot change and remain forever totalitarian, then are we comfortable with the fact that we must place Yugoslavia and North Korea in the same pigeonhole? If we are forced to group such wildly different countries under the two labels, is the distinction not useless for policy purposes?
Nor is it always true—certainly in the longer run—that right-wing dictatorships serve U.S. interests better than left-wing dictatorships. Did Somoza of Nicaragua serve U.S. interests? As a right-wing foreign minister from a major Latin American country once explained to the Carter administration, Somoza’s main achievement was to develop a plantation and to lose a country.
There is, however, one condition under which the distinction between authoritarianism and totalitarianism might acquire new significance, or at least be viewed in a new light. Suppose that the United States were now effectively at war with the Soviet Union. In wartime a country cannot always be overly selective in its choice of allies. Survival becomes the key issue and at virtually any price. Finland, after the Soviet attack in 1939, later accepted the support of Nazi Germany in an attempt to regain its territory. The Western allies did not hesitate to join hands with Stalin, a dictator of comparable moral degradation, in their effort to crush Hitler.
Are we at war? Some, including the editor of COMMENTARY, Norman Podhoretz, in effect argue that we are. The Soviet Union is seen as “exactly” like Nazi Germany. It is seen as posing precisely the same kind of threat to American security and welfare. Whether intended or not, the equation of the Soviet Union with Nazi Germany is incendiary in its policy implications. Given our collective memory of World War II and the lessons we all believe we learned from the history of the 1930’s, the evocation of Nazi Germany can only suggest inevitable and fairly immediate conflict. Negotiations begin to seem foolish. Even preemptive war might be in order. We would not want to make the mistake we made in the 1930’s of letting the aggressor power choose the time and place of the inevitable attack. In any event, we should join with any allies we can find in combating this new menace, whose appetite, like that of Nazi Germany, cannot be sated. Against such a threat, some would also take action at home. The new chairman of the Senate Judiciary Committee has stated that Senator Joseph McCarthy was doing the right thing, only in the wrong way.
Few would deny that the Soviet Union poses a severe challenge to American interests worldwide. Indeed, were the Soviet Union to invade Poland, the international situation would begin to resemble the summer of 1914 in its tensions and dangers. Vigorous military and diplomatic measures would become even more pressing than they are now. But even viewing the Soviet Union today in the way the rest of Europe viewed Imperial Germany in the summer of 1914 is very different from viewing the Soviet Union as the modern-day equivalent of Nazi Germany. In the former case, there could still remain some hope that through logic, diplomacy, and appeals to common interest catastrophe could be avoided. The margin of maneuver would be small but it would still allow some room for attention to be given to longer-run considerations. In the latter case, the margin for maneuver disappears altogether. The only value is survival, and the sole test of a foreign-policy relationship is whether it contributes to the pressing goal of survival.
Among the prominent supporters of the Reagan administration there are some who do see American options in the single blinding light of “the present danger” that now transfixes many of the contributors to COMMENTARY in its high beam. These supporters would drive the administration to court South Africa, to embrace reaction in Central America, and to condone human-rights abuses so long as they are committed by our friends. They might even nod their heads approvingly when New York Times columnist William Safire writes: “What is ‘winning’ [in El Salvador]? Is it a military junta that kills the opposition but by its repressive nature produces more opposition that it becomes necessary to kill? If need be, yes—considering the aggressive totalitarian alternative. . . .”
The problem for those who espouse such a policy is that their fellow citizens will not accept it. The American people remain adherents to the liberal tradition. They fear the Soviet Union but they are not so terrified that they are willing to abandon long-standing American values. For that reason they have already rejected decisively the administration’s initial hard-line and callous policy toward El Salvador. They will reject similar policies elsewhere. The Reagan administration would save itself much political pain if it acknowledged that there are some things that it cannot change and one of them is the basic liberal character of the country it governs.
Eugene J. McCarthy:
1. Concern for human rights as defined in the Constitution of the United States and in our commitment to the Charter of the United Nations must be a continuing and pervasive influence on American foreign policy. The United States has both legal and formal commitments to human rights as well as a deep philosophical, if not theological, commitment.
G.K. Chesterton in his book, What I Saw in America, written in 1922, declared: “America is the only nation in the world that is founded on a creed. That creed is set forth with dogmatic theological lucidity in the Declaration of Independence, perhaps the only piece of practical politics which is also theoretical politics and also great literature.”
The creed to which he referred is contained in these words in the Declaration of Independence: “[A]ll Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness. . . .” These words and these ideas were taken seriously by the men who drafted the Declaration and the Constitution. They were spoken and written by men in danger of being shot or hanged if the Revolution they led turned out to be a failure. The words and the ideas they expressed were meant to be taken literally. They were not stated merely as a justification for the Revolution but were intended to establish a foundation in principle and theory upon which democratic institutions and traditions could be established in the United States and which could be used as valid principles for men in political societies in all places and times.
The society which existed following the adoption of the Constitution was not one in which universal human rights were honored. Established religions were protected and sustained in some states. Slavery was to continue for nearly a hundred years after the adoption of the Constitution, and gross discrimination against members of the black race for another one hundred years after the Civil War. Women did not receive protection under the law equal to that given to men, nor were they allowed full participation in the political life of the new nation. Property qualifications were a condition of political participation, and means tests marked both politics and welfare programs.
Obviously, as there has been failure and compromise within the bounds of our own laws and practices, so room must be allowed for compromise in dealing with other nations. The compromise should, however, be with reality, not with principle, and under the aegis of policy positions, such as that stated in Roosevelt’s Four Freedoms speech and restated and amplified by Paul-Henri Spaak, the Secretary General of NATO, in September 1957, when he said: “There can be no respect of persons if there is no political democracy and there can be no respect of persons without social justice, and only when we have carried this to a maximum, not only in our own countries but also in all places where we have undertaken political responsibilities, shall we find ourselves in a state of quiet conscience and moral peace which will allow us genuinely to take up the challenge of Communism. But this challenge the whole Western world must take up together. This is the prerequisite making possible our success.”
The principle having been laid out, compromise begins. John Bennett of Union Theological Seminary, in attempting to work out the reconciliation between principle and practical political judgments, in an article published in Christianity and Crisis many years ago, concluded that in the showdown national interest must be the ultimate determinant of action, unless what was involved was wholly abhorrent. A similar point was made by Jacques Maritain in his book, Man and the State, when he wrote: “When the moralists insist on the immutability of moral principles they are reproached for imposing unlivable requirements on us. When they explain the way in which these immutable principles are to be put into force, taking into account the diversity of concrete situations, they are reproached for making morality relative. In both cases, however, they are only upholding the claims of reason to direct life.”
The same can be said of political compromise, in making practical compromises and thereby upholding the claims of reason to direct national life. Obviously in a regressive or barbarous society (that of Nazism) the freedom of choice may be reduced to the point where a choice which is morally questionable, rather than one which is simply and wholly bad, may have to be made. The choice may not be one of the lesser of two evils, but that of a course which has some good in it, or a potential for good, no matter how limited.
Thomas More, in his Utopia, took this same position, in writing that “If evil opinion and naughty persuasion cannot be utterly and altogether plucked out of hearts; if you cannot, even as you would, remedy vices which habit and custom have confirmed, yet this is no cause for leaving and forsaking the commonwealth.”
Mistakes, if they are to be made, should be made on the side of support for freedom and democracy, and the national interest not invoked carelessly or lightly in support of policies which are contrary to our expressed beliefs in liberty and self-determination.
2. The distinction between authoritarianism and totalitarianism is a worthwhile distinction if made carefully in historical context and not on an arbitrary ideological basis. Nazism began as a democratic political movement. It very quickly became authoritarian, and eventually totalitarian, a development observed and analyzed by Hannah Arendt in The Origins of Totalitarianism.
The roots of totalitarianism, she noted, lay in European, especially German, government: in bureaucracy, in government by decree issued by anonymous bureaucrats not identified with legislative bodies or legislators, and in an acceptance that government is above politics and must be isolated from it—a view not very different from that implicit in the Common Cause approach to government in the United States. Hitler became an intruder, eventually, into the self-contained, effectively amoral, non-political bureaucracy history has labeled Nazism. He was authoritarian, and continued to attempt to exercise authority after he had lost power to the bureaucracy. Thus Nazism, conceived as authoritarian, once bureaucratized, became totalitarian, with “nothing above the state, nothing against the state, nothing outside the state.”
Communism, on the other hand, is conceived in principle to be totalitarian. In practice it may become authoritarian, when those in power lose faith in their principles, or see the principles, if carried into action, as a threat to their own power. Russian Communism today, even though run by the Central Committee rather than by a dictator, is more authoritarian than it is totalitarian. The same authoritarianism is demonstrated in the Russian response to the attempt on the part of Solidarity, a labor commune, to take power in Poland. Theoretically, at some stage of Communist development, the state, i.e., the central authority, should wither away and the commune prevail. Evidently the Russian leaders do not think that Poland has reached this stage of perfection.
A comparison of the FBI under J. Edgar Hoover and under, or within, its present form of operation sustains the points of distinction made by Hannah Arendt and in a limited way helps to clarify the difference between authoritarianism and totalitarianism.
While Hoover headed the FBI, that agency was a projection of its head, its director. His personality and judgment affected the methods of operation of the Bureau as well as the character of its personnel. Hoover gave the Bureau an institutional independence of such strength that a series of Presidents of the United States were either unable or unwilling to challenge the autonomy of its operation. He was in charge. He was responsible. The lines were drawn.
Without Hoover, the potential of the FBI to develop into a purer and more impersonal bureaucracy is beginning to show. In an inquiry directed to the present head of the FBI, William H. Webster, concerning the Abscam venture, as a result of which a number of persons, including several members of Congress, have been convicted of crimes, Webster was asked whether he had ordered the project. He replied that he had not. He was asked whether he knew about it. He replied that he did. When asked where the plan had come from, he responded that it had come out of the Bureau.
I do not think that much good can come from any arbitrary foundation or think-tank distinction between totalitarianism and authoritarianism. Judgment must be based on what happens under an authoritarian regime and what happens within one that is totalitarian and on projections of what may happen in either or both.
I see most South American non-democratic regimes today as authoritarian. The use of authority in some is restrained; in others, vicious and barbarous. Cuba, which is in theory and in structure Communistic, is far from being totalitarian; it is, I believe, closer to the authoritarian model. China, a country with which we have reopened relations, is, in practice, a most totalitarian country (although not cruel and oppressive in the Nazi mode).
3. The approach of the Reagan administration to the application of the human-rights standard to foreign policy is as yet undefined in principle and unproved in practice, though out of that practice some formulation of overall policy might yet be developed.
In the Carter administration, the formulation became clearer as the administration continued, but was less clear in its application. “Human rights” became the watchwords, not for policy coupled with program, but for policy without program. The result was indecision and procrastination, manifest in the failure to support the government of the Shah in Iran, or to have an alternative ready, if any such possibility existed, and in an appearance of readiness to accept the revolutionary government. Such an acceptance could not have been based on anything more than hope that the new government would respect “human rights”—a hope which was dashed by the hostage-taking.
If human rights was the standard by which policy was guided relative to Iran, it proved inadequate. A harsher pragmatism, both before the fall of the Shah’s government and after the hostage-taking, might have been better.
Quite properly, the Carter administration joined governments of other nations to protest the Russian invasion of Afghanistan (or if it was not an invasion, “intervention”) in the name of international order and law. Carter then added the broader charge of violation of human rights, moving him to cut off grain shipments, limit other items of trade, and keep United States athletes out of the Olympics. At the same time, he insisted that he was seeking better relations with the Russians and hoping for a nuclear-arms agreement with them. There seemed to be little proportion between the latter objectives, in their importance, and the actions taken relative to Afghanistan, which might have precluded the accomplishment of those objectives.
Reagan as a candidate said that he was opposed to our participating in the Olympics, but that as President he would not have used the pressures the Carter administration did, such as the threat of passport denial, the refusal of permission to take money out of the country (under the provisions of “trading-with-the-enemy” laws), and possible challenges to the tax exemption of the Olympics Committee, and to the deductibility of contributions to it, if that Committee persisted in trying to take athletes to the Olympics.
Reagan also said in the campaign that he opposed the embargo on grain shipments. Whether he did this out of principle or under pressure to win farm votes remains unclear. In office Reagan lifted the embargo on wheat and corn, although under the influence of Secretary Haig, who according to press reports wanted “to send the Russians a message,” he did forbid the sale of surplus butter to the Russians. Evidently, he accepted the principle that wheat from America could be used to make bread for the Russians and corn to manufacture oleomargarine, but that there should be no butter for Russian bread from either contented or discontented U.S. cows. As yet there are no marked successes or failures attributable to the Reagan approach.
1. I think a proper concern for human rights in American foreign policy is inescapable. From the Revolutionary War on, the United States has been linked with the idea of rights in both domestic and foreign policy. It would be a serious mistake to flout or jettison official interest in human rights at this time, even though the concept of human rights as we find it in scores of UN and other documents represents a debauching of the idea of rights as we have known it for more than two centuries in the United States. More than any other country, the Soviet Union is responsible for this debauching.
The essence of the American doctrine of rights comes directly from natural-law philosophy. Certain rights were presumed to inhere in human beings simply by virtue of their humanity and thus to be anterior to political society. In the thinking of the Founding Fathers, such rights as freedom of speech, freedom of assembly, equality before the law, and due process were not only precedent to the political state but were envisaged as rights against the state. If the Founding Fathers did not in the first instance choose to list individual rights, it was from no lack of respect but rather, as Hamilton put it, that no listing was necessary inasmuch as nothing in the Constitution challenged or infringed upon these rights. And it is significant that when the first ten Amendments were added to the Constitution as a Bill of Rights, the Ninth Amendment read in whole: “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people” (emphasis added).
Very different is the Soviet conception of rights. In the first place, there is no pretense that rights emanate from anywhere but the Soviet state. They are thus conferred upon the Russian people by the same sovereign power that presides over the Gulag Archipelago. Characteristically, rights in the Soviet Constitution are guarantees by the government of such things as jobs, housing, medical care, and pensions. How honestly and efficiently these guarantees are fulfilled is not the point here, which is simply that such guarantees, however good they may be as social objectives, are far from being equivalent to American constitutional rights.
The debauching effect of the Soviet philosophy upon the idea of rights that took root in the United States in the 18th century is apparent at any international convocation on human rights or in any UN document on the subject. We see so-called rights to housing and jobs, rights conferred by the sovereign power, dealt with as equal to, if not actually more important than, the rights which in American doctrine inhere in individuals and are rights against the sovereign power. My experience thus far with conferences on human rights is not encouraging. Any effort to argue the thesis that the right to free speech or to a fair and public trial in law is more important to the human estate than the right to a job or to housing is promptly met with the question, invariably accompanied by a smile of irrefutable revelation, “But how can a jobless or hungry person be concerned with the luxury of free speech?”
Sadly, that kind of question—which would have been simply incomprehensible to Jefferson or Tom Paine—comes as readily to the minds of some Americans as it does to representatives of the Communist and the Third Worlds. At a recent, largely Anglo-American conference on human rights that I attended, there were not more than a half-dozen out of better than two dozen attending who thought that individual rights against the government such as those in the American Bill of Rights were more important than “social rights,” that is, the right to employment, housing, medical care, and the like.
To be sure, American liberals, like Communist and Third World socialist propagandists, show a sprightly interest in individual rights to freedom and due process when an Argentina, South Korea, or Taiwan is involved. Skies darken under the imprecations hurled by liberals at these countries. Well and good. But why not when the countries are the Soviet Union, China, Vietnam, and Cuba? It is this flabby hypocrisy that made American intellectuals soft touches in the beginning for Jacobo Timerman’s efforts, after his release from imprisonment, to manipulate American foreign policy.
But with all the flaws recognized in what currently passes for human-rights thought and action, it is still an obligation of the United States to hold firmly and steadily to the doctrine of human rights, as we have traditionally understood it, in our foreign policy. We should stand ready to put the lash of contempt on the so-called human-rights policies of the Communist bloc in the world and to expose relentlessly the mockery of such policies in the face of iron exit barriers in these countries. Argentinians and South Africans are free to leave their countries at any time. Russians, Vietnamese, Chinese, and Cubans (save when Castro chooses to empty his jails at American expense) are not. This one fundamental fact needs to be broadcast incessantly by the United States. We also owe it to the world and to ourselves to keep the pressure on Argentinians, Chileans, South Africans, and the other countries with which we find ourselves bound strategically to clean up their houses. Not the pressure of Wilsonian righteousness but of pragmatic interest. Not loudly for media thrill but quietly for actual effect.
2. The distinction between authoritarian and totalitarian societies is not only important, it is unavoidable and vital. Despite the agitated bleats from the Left over Jeane Kirkpatrick’s recent, bold, and telling use of the distinction, it is really an old one in the literature of political power, and applies to history as well as to the contemporary anatomy of states. We in the West are familiar with our own authoritarian governments three centuries back in time. Modern European democracies have emerged from authoritarian societies—the England of Elizabeth I, the France of Louis XIV, the Prussia of Frederick II, and so on. Sovereigns could be despotic, even savage at times, especially when the smell of subversion was in the air. But such authoritarianism did not, as we know, prevent the age of Shakespeare and the age of Molière from flourishing, or the Enlightenment from reaching Prussia and other German states. A king who ruled by self-proclaimed divine right, making occasional use of secret trials (or no trials at all), of torture, executions, and solitary imprisonment in the dispatching of enemies, still had the church, the aristocracy, the towns and their charters, the guilds, and other largely autonomous institutions to cope with. And, a matter of simple record, it was possible for individual freedoms to take root and grow slowly, to become fixed eventually, in all of these authoritarian countries.
This historical fact should be kept in mind when we look at Chile, South Korea, and South Africa in our day. Military despotism in the first two and an odious racism in the third do not prevent the existence of flourishing private sectors—intellectual and cultural as well as economic. Impotent though it is at the moment, there is, for all the world to see, an anti-apartheid movement in South Africa. The generals in Chile, while imprisoning and executing without due process, nevertheless permit and even encourage a free, competitive economy—always the best possible base for any political democracy that may arise, as it almost certainly will in Chile. But the crowning attribute of the authoritarian, in contrast to the totalitarian, society is its institutional pluralism and its general linkage with the past through tradition. No age of Shakespeare looms in any authoritarian country at present (nor in the United States and England, heaven knows), but an independent culture exists in each such country, even if it has to avail itself of what the Spanish under Franco called possibilismo (something Shakespeare was quite familiar with).
What identifies the totalitarian country and stamps it with far greater menace to human freedom than any other form of state known to us in history is its systematic and relentless effort to destroy every possible form of the past and present that in any way might militate against the formation of “the New Soviet Man,” or whatever his counterpart may be in other totalitarianisms. Herman Rauschning, one of the earliest to identify the true nature of totalitarianism and to see it as a distinctively 20th-century phenomenon, one made possible indeed by modernity in all its aspects, accurately labeled Stalinist Russia and Nazi Germany revolutions of nihilism. This they are; but they are also revolutions of affirmation, of deadly dogma. All the while linkages with the past are either destroyed or adapted to new ends, the dogma of the absolute and total state advances. Nothing outside the state, everything in the state: this is the dogma that the Soviet Union, in the name of the people, has broadcast to the world for sixty years, with the number of national converts or puppets increasing steadily. Wherever found—China, North Korea, Cuba, and so on—the dogma is set in a monolithic political collectivism, a social and cultural desolation, and a compulsive zeal for spreading the dogma farther.
It is naive to think of totalitarian states—the major ones at least—as in any way related to authoritarian ones. The power over individuals in the former is very far from being a mere intensification of the power in the latter. It is not the fact of military centralization, of dictatorship, that is central in the Soviet Union or Cuba; it is, rather, the absence of institutional pluralism in any degree, the absorption by the Communist leviathan of every economic and social function, and the unceasing effort to capture every part of the individual mind. If the Soviet Union were governed in fact by its purported representative assemblies instead of by the Politburo, the Soviet Union would be no less totalitarian, assuming nothing else were changed. Everything would still be confined absolutely in the iron frame of the military state.
Is there some canker in the liberal mind in the West that somehow makes the authoritarian state more repugnant than the totalitarian state? There must be; what else explains the indulgence with which liberals consider Cuba, Vietnam, and China despite their mind-stunning histories of terror, torture, and mass slaughter, their absolute repression of citizens who wish to depart their confines? Perhaps it is the disease of secular millennialism born largely in 1789; perhaps the canker I speak of goes back to the French Enlightenment with its generally greater revulsion for the authorities of church, class, guild, patriarchal family, and other traditional institutions than for those of the political state as such. The state indeed could be seen by a good many philosophes as an admirable instrument for expunging from the social order these detested authorities—even the total state.
But whatever the root or cause, the larger hatred for an Argentina or Chile than for the Soviet Union, China, and Cuba is a luminous aspect of the liberal mind. When Huber Matos was released from prison in Cuba and came here, after twenty years that included occasional torture, almost constant physical abuse, and frequent solitary confinement, liberal journals and papers in this country were almost silent. This in sordid contrast to the nearly ecstatic reception accorded Jacobo Timerman somewhat later. When the Soviet Union or China sends one of its athletic teams to the U.S., huzzahs fill the air, and receptions, dinners, and parties abound in liberal society. Nothing is said of the 30 (or is it 40 or 50?) million killed in the Soviet Union alone during its Great Terror or of the continuing imprisonments and torments inflicted upon those who seek to leave the USSR. It is possible that Mao’s record is more appalling even than Stalin’s, but that does not interfere with the liberal’s expansive joy as he gazes upon the charming Chinese who come as official emissaries from their prisonlike society. The fawning upon Castro during his visit to New York a year or two ago must go down as one of the more embarrassing curtsies of the liberal, radical-chic mind in this century. As I write, demonstrations are being mounted against the South African rugby team, now in this country. Splendid. But why not similar demonstrations against the Soviet hockey team or Chinese ping-pong players? But why even bother to ask the question?
From the point of view of American foreign policy, though, the overriding distinction to be made between authoritarian and totalitarian states is simply that none of the first has been aggressive to the American interest whereas all of the second have either made war on the United States or else dedicated themselves to permanent conflict. The first requisite of a foreign policy is a nation’s capacity for distinguishing between potential allies and potential aggressors. If this requisite is not fulfilled, no other considerations can compensate. I find the idea of strategic relations with Chile and South Africa not nearly as offensive to my moral values as I did relations with the Soviet Union in World War II, which were, however, necessary, indeed unavoidable.
When I say that one of the grounds for differentiation of authoritarian and totalitarian states is the greater likelihood of an occasional seed of freedom sprouting in the former, I have no doubt that someone will bring up Poland in riposte. But despite the government and party in Poland and despite the existence of the basic structure and some of the trappings of totalitarianism since the Soviet domination after World War II, it stretches meaning and reference to think of Poland—and two or three other Eastern European states—as totalitarian in the sense that China, the Soviet Union, and Cuba are. If the threat of the Soviet military were removed entirely from this area, I think we might be surprised how quickly other Solidarity movements, other resurgent churches, would come into existence. And for a variety of reasons Poland has had, has been permitted to have, quite untypical relationships with Western countries during the last two or three decades. Authoritarian and totalitarian are, at bottom, ideal-types in the logical sense; so are democracy and capitalism. In the world of actuality, there are more and less faithful representations of ideal-types. The Soviet Union, China, and Cuba, among others, conform very closely to the ideal-type of totalitarianism; Poland and, on the evidence of Soviet invasions in 1956 and 1968, Hungary and Czechoslovakia are rather less close.
3. The approach of the Reagan administration to matters of human rights and their relationship to foreign policy is immensely more promising than the Carter administration’s approach was at any point in its duration. The early appointment of Jeane Kirkpatrick alone provided a favorable augury. As all the world knew, she had full understanding of the crucial difference between authoritarian and totalitarian societies, and also of the strategic requirements of American foreign policy. Her career since the appointment brilliantly exemplifies such understanding. She is as conscientious about the human rights of freedom, equality before the law, and due process as any of her predecessors in the ambassadorship has been. But far more than her recent predecessors, she knows where the grimmest, most repellent, and dangerous violations of these rights are to be found—that is, in the Soviet sphere. In the eight months the Reagan administration has been in office, nothing I have read suggests that her views and policies are in the slightest conflict with those of the President. That is a good omen for us all.
1. Most of the 150-odd nations with which the United States has relations, as one may see from the Freedom House annual reports, have deplorable records on human rights. We have to deal with them anyway. Ideally, we would be safest in a world composed of nations with values and institutions like ours. There have been times, as recently as World War II, when many hoped that such a world was coming to pass. One of the bitter fruits of the cold war is that it is not. Nonetheless, every new nation which begins to show respect for the inalienable rights of individuals, for free associations among its citizens, and for those institutional arrangements (due process, a free press, vital mediating structures, etc.) which make human rights real, makes our own institutions more secure.
Human rights do not exist in words in air or on parchment. They become real only in institutions vivified by free associations. Thus, in Poland, rights which before had only a paper existence in the Polish Constitution and the Helsinki Accords have at last become rather more real through the lively efforts of Solidarity. Every such advance moves the world closer to our historical ideal. With this ideal in view, ten years ago, the Coalition for a Democratic Majority was established upon the plank that human rights ought to be a major priority in American foreign policy. In September 1976, Senator Henry Jackson succeeded in getting a pledge in writing from candidate Jimmy Carter that he would adopt this emphasis during his administration. Considerations of human rights have now become inescapable in foreign policy.
Meanwhile, the Soviet Union has also been active in making human rights—in non-socialist nations—part of its foreign policy. It has been particularly unrelenting in its attacks upon Israel through UN resolutions on Zionism as racism (more recently, absurd as it seems, Zionism as anti-Semitism) and in other ways. Indeed, since 1969, the chief human-rights targets of the USSR have been Israel and South Africa and, since 1974, Chile. No nations in the United Nations bear such heavy criticism on human rights as these three. Moreover, for the USSR, Marxist nations are by definition exemplars of human rights, whereas non-Marxist nations are by definition abusers of human rights. Regimes of the former type must be defended at all costs, even including armed invasion; regimes of the latter type must be delegitimated, destabilized, subverted, and replaced by Marxist regimes.
In an odd way, therefore, human-rights issues have become the very center of the cold war, particularly in the war of ideas and in public opinion. In this ideological assault, the USSR has been particularly astute. By an aggressive assault on other nations, it has been able to divert international attention from the abuses of human rights within its own empire. Solzhenitsyn asserts that the Soviet Union has put to death at least 60 million of its own citizens since the 1920’s in its vast Gulag Archipelago. The USSR thus has reason to divert attention elsewhere.
In addition, the world as a whole is rife with human-rights violations: most nations are still at a rather low stage of development in their human-rights institutions, civil and political liberties, literacy and personal freedoms. In each region, the USSR protects its own allies and clients from international scrutiny—Ethiopia and Libya in Africa, Cuba and Nicaragua in Latin America—and attempts to delegitimate the other regimes in the area.
From the point of view of the West, this Soviet activity is tragically effective. By our own standards, we cannot play quite the same game. According to our own values, the human-rights institutions of many of the nations with which we are in alliance are scarcely admirable. Their sins against human rights are plainly visible. Indeed, the more open such societies are, the more visible their real abuses.
Thus there arises a profound dilemma. A sense of fairness and intellectual honesty obliges us to call abuses by their proper name. On the other hand, the uses to which such public naming is put do not always work toward real institutional improvements. Lane Kirkland of the AFL-CIO put the dilemma clearly early in 1981: “A human-rights policy that . . . in effect transfers political territory from the authoritarians to the totalitarians is an anti-human rights policy.”
2. In this context, the distinction between authoritarian and totalitarian regimes is not only well-founded in political science but inescapable in practical politics. Since improvements in human-rights practices depend upon the slow but sound development of institutions of human rights, it is of no use to human rights merely to destroy governments in the name of improving them. Not every “change” or “revolution” produces improvement. Those who, facing real abuses, argue for “change” and “revolution” are simplistic and naive, unless they can demonstrate a real likelihood of improvement. The delegitimation of the Shah of Iran led to the chaos, executions, and tyranny of the mullahs; the delegitimation of Somoza has led to the gradual hardening of Marxist-Leninist doctrine among the Sandinistas. Neither South Vietnam nor Cambodia was, a dozen years ago, an exemplar of human rights; but the conditions under which their people now live can hardly be called improvements. The unspeakable suffering of the Ethiopians under their Marxist “revolution” has not advanced the cause of human rights in Africa.
The totalitarian claim is fundamentally ideological. All morality and all rights are intellectually reduced to the claims of state rulers. No other rights remain, even in theory. In the light of this justification, total rule is ruthlessly enforced through military and police power and through neighborhood, even familial, surveillance. Yet ideology is for long periods easy to mask. Thus in practice, during the period of revolution and, if successful, the consolidation of power, totalitarian cells invariably take cover in “popular fronts,” which also include many genuine democrats, social reformers, religious leaders, idealistic youths, and others. They speak not for totalitarianism, but for “the people.” This pattern has now been followed in thirty or forty nations. It should be recognizable to all. Yet it seems to have an ever-fresh and ever-deceptive appeal. It is, moreover, difficult to counter, precisely because the “soft utopians” (in Reinhold Niebuhr’s phrase) always believe they can control the “hard utopians,” the Marxist-Leninist faction. The “soft utopians” eagerly provide the cover the Marxist-Leninists need until the latter’s grip upon the levers of power (police, foreign policy, and media) is firm.
In practice, therefore, revolutions in the name of human rights, whether in Africa, Latin America, or Asia, commonly contain genuine democrats and reformers, as well as Marxist cells. Abuses of human rights in such locations are real. The problem is how to bring about real improvements and to achieve results that make the human-rights situation better than it is. Weak and unstable regimes, even those which hold on for years, are typical in most of the non-Marxist world. How can one simultaneously help them to improve their institutions of human rights and also not delegitimate them in international and internal public opinion?
Some thinkers argue that we have little leverage over Marxist regimes and that, in any case, public denunciation is likely to antagonize and embitter them further. Thus, they say, the U.S. should use its public condemnations and leverage chiefly on its own allies. Yet the same logic would seem to apply: such allies, too, are likely to be antagonized and embittered. Indeed, recognizing their instability and weakness, some are likely to become yet more panicked and repressive. Nonetheless, the fact that totalitarian regimes are worse does not make authoritarian regimes, even those of our allies, admirable. Their own best interests, and ours, depend on the growing legitimacy they can win through building human-rights institutions. Yet these very institutions require the utmost in political wisdom.
3. As a critic of the human-rights policies of the Nixon, Ford, and Carter administrations, I am painfully aware of the difficulties faced in these matters by the Reagan administration. The Reagan policy may be succinctly stated: (1) Condone no abuse of human rights, by friend or foe, by totalitarian regime or authoritarian regime; call abuses by their proper name. (2) No more double standards, no more unequal applications of attention and condemnation to various regions or nations; insist upon the equal application of justice to all regions and all nations. (3) Extend help to all nations which struggle to improve the institutions of human rights, and which enlarge the liberties of associations and individuals to be active in all dimensions of society. (4) Think critically about the probable results of statements and actions with respect to human rights, so as to insure insofar as possible that such statements and actions do not destroy weak human-rights institutions in the name of improving them.
The critics of the Reagan policies often seem to me less than objective. Human-rights issues have become politicized domestically as well as internationally. This is one consequence of making human rights a political issue. On the other hand, even the passion of the critics shows how important an emphasis on human rights is to all Americans. Americans care. The very meaning of our nation is human rights. Those who carry out American policy in this area may properly expect to be criticized by their fellow citizens from many points of view. They should judge themselves on whether they have brought about results—some disasters averted, some real improvements noted, some persons relieved from their sufferings, a word of hope and honesty flashed around the world.
Nations built on respect for human rights are humankind’s noblest experiment. Spreading that experiment is not as easy as it once looked, but it must be done.
A last note. A war of ideas built around the presuppositions and institutional values of the inalienable rights of human beings has enough potency to undermine the Marxist empire and its dead dogma from within. Nations based on human rights are now feeding the nations not so based. Human rights produce not only liberty but bread. In Poland today, and perhaps in the Ukraine and Byelorussia and Bulgaria and other nations tomorrow, the power of the individual conscience gathers strength against decades of Marxist abuse. For years, we have refrained from ideological combat. This seems to have been a great error. The human race should not have to live with the Marxist empire for a thousand years. The best weapon against this empire is not military hardware but ideas. The ideas of human rights, rooted in individual conscience and in religion, are already undermining the foes of human rights from within. We should articulate and announce those ideas—and especially support the institution-building which makes them real—with all possible vigor.
“Which government,” I asked an old friend not too long after the signing of the Camp David agreements, “does the current issue of the Nation attack as a tyranny?” With his sure sense of why and under what circumstances a particular regime does or does not give offense to the Left—and without knowing or blinking—he answered, “Egypt.” He was correct. Do not think Egypt had done anything especially noteworthy to warrant this attention at that time or, for that matter, that it has really done so since; for by “noteworthy,” I mean only something more oppressive to its subjects than what its other Arab neighbors routinely do to theirs, though most are still going about in society as “progressives” and without the insincere flattery of outraged attention by large elements of the opinion elite.
The fact is that, if you’re careful in watching the fashions among those seen as the world’s top human-rights violators, you can almost intuit which one will be the target of the next half-page advertisement placed in the Sunday Times by that standing ad-hoc committee composed of Bella Abzug, Daniel Berrigan, Ramsey Clark, Ossie Davis . . . all the way down the alphabet of clerics and professors, free-lance writers and full-time activists through to George Wald and Howard Zinn.
One other fact is that the actual human-rights record is not much known to the signatories, to whom not much complexity is, in any case, congenial. To most of them, I’d wager, the record hardly even matters. What counts is not what the regime does to freedom; there are, after all, lots of regimes these folks love which have kept freedom in prison for years—and for some of these regimes, that is the least they’ve done to freedom. While even Lillian Hellman and I. F. Stone have by now felt obliged to speak up for a Soviet dissident or two, the victims of leftist tyrannies still do not arouse much sympathy from the protesting classes—and they arouse even less indignation—because they have not had the good fortune to be persecuted by a regime identified with the U.S. So what counts is which side of the world divide the regime adheres to. There is, then, a certain nasty symmetry in the attitudes of both the critics and defenders of, let us say, Pinochet’s Chile. You don’t need much precise information about how few or many political dissidents he jails or exiles or even has killed to figure out what Jessica Mitford feels about his junta and his human-rights record; all you need to know about Miss Mitford’s feelings is how his delegation votes at the UN. The same may be said for William F. Buckley’s feelings. And it is also true for both their feelings about Assad’s Syria and Husayn’s Iraq, where people with dangerous thoughts are ordinarily rounded up and shot (Syria) or hanged (Iraq). (Oh, how simple it would be if only these two murderous pro-Soviet regimes didn’t also want to murder each other.) Their dead won’t rouse the members of that permanent ad-hoc committee; infinitely greater enormities have left them resting complacently till some petty dictatorship favored by the Pentagon suddenly affronts their sensibilities.
I’d always wondered how the Greek colonels had so enraged the Left and even many usually sober relativistic liberals. It was a dictatorship, all right, and Greek democracy went back to everybody’s freshman course in the Humanities. Also Byron died for Greece . . . or was it in Greece? . . . or maybe it was Keats, anyway. But the Greek dictatorship, at least the way we measure dictatorships these days, was not an especially cruel one. The Greek resistance even won an Academy Award with Z. When the dictatorship withered away—which is pretty much what happened to it—the monarchy was finally abolished and democracy restored so securely that, as I write, a left-wing victory is predicted (or at least foreseeable) in the Greek parliamentary elections. Yet I had colleagues who refused to set foot in the Parthenon, lest they seem indifferent to tyranny in democracy’s birthplace, but would wend their way through Mitteleuropa to see the charms of old Prague where twice in our lifetime the boots of the Red Army had snuffed out, as the Nazis had done before, the most hopeful political alternatives of mankind.
This is the famous and vile double standard. I do not prefer, even in retrospect, the former Greek dictatorship to the still-not-yet-altogether awful one emerging in liberated Nicaragua. I am not indifferent to the house arrest of the composer Mikis Theodorakis in the lonely wastes of Zatounia. But even that desolate island was not, not by anyone’s reckoning, even remotely like Kolyma, the lamentations of whose many millions of inmates never managed to penetrate the thick ideological ramparts which Theodorakis and his comrades built and festooned in Greece’s fashionable districts to celebrate the great Soviet experiment.
Here, at least, is one difference between defenders of right-wing dictatorships and defenders of left-wing ones. Defenders of the former do not much pretend that they are better or really other than what they are; at most, they are said to be what is possible. Defenders of the latter, however, lie first to others and then to themselves; they lie not only about the virtues of the regimes they serve but also about the alleged crimes of the victims of those regimes. No one of us knows anyone who pretends that Argentina or El Salvador is the good society. But all of us knew many who claimed, and some who still do, that Cuba is, and that its outcasts are gusanos or worms, not even people. (The Left has to its dubious credit this continuous improvisation of animal-equivalents for people out of favor: running dogs, hyenas, parasites, pigs, etc.)
This matter of pretensions, then, is a difference, but it is not enough. It is certainly not different enough for Jews. Sometimes it is the old order which torments the Jews, sometimes the new. This shifting process has not been neatly cyclical; in fact, often the forces of both compete to seize the legitimacy of popular anti-Semitism or compete in provoking it. The civil rights of Jews suffer under both. In such times, Jews are caught between the red and the black, these self-evident adversaries across the ultimate barricades of ideology and power, but barricades behind which Jews do not really belong and are never really secure.
Yet Jews, and democrats too, chose to fight with Stalin’s legions against Hitler’s. Who would say today that they were wrong? Many, of course, thought the alliance deeper than one of desperate convenience, and these people were wrong, tragically wrong, some of them also deceitfully so. But given the time and its exigencies, there was no alternative to the alliance with the pock-marked father of peoples and his party. If that were so in the choice between two ruthless totalitarianisms, surely it cannot be entirely out of the question today that an alliance with an authoritarian regime against a totalitarian revolutionary system might be proper. This is not, then, a matter of a priori principle but of situational prudence. It would be nice if the facts allowed always and everywhere for the application of a clear principle: no alliances with other than pluralistic democracies.
Unfortunately, such innocence in one’s associations is not a realistic strategy for any nation in a harsh world. There just aren’t enough pluralistic democracies propitiously placed around the globe for such a strategy to work. And in the Third World, there aren’t really many (are there any?) movements genuinely and also plausibly committed to those values we want to protect. Inevitably, we will deal with and aid any number of imperfect governments and movements, some engaged in armed struggles with other—also imperfect, mostly more imperfect—governments and movements. Let us do so, putting hygienic distance between us and some of the recipients of our largesse, prodding them, even bullying them, toward more humane canons, but aware that our successes will be far less notable and far less numerous than our failures.
So there are many necessary liaisons which the U.S. makes that do not meet the scrupulous standard of innocence, the standard which many critics of U.S. foreign policy sometimes seem to apply. The truth is, though, that it is a standard they want to apply only to U.S. foreign policy but not to their own preferences, identifications, and loyalties abroad. One clue as to who these disingenuous people are is that they are likely to be among those who ridicule even the analytic distinction between totalitarian and authoritarian governments. The reason, in my view, is simple: today they prefer the totalitarian governments, believing them somehow progressive rather than reactionary, as North Korea is progressive and South Korea reactionary—the facts of the case be damned.
Authoritarian regimes are not, however, intrinsically less cruel or evil. Witness South Africa, where brutality has been routinized. Similarly, there is no regime effectively less totalitarian—and none more closely resembling the prototypical authoritarian regime under siege—than that of the mad mullahs now putting dozens of people a day to the firing squad in Iran. But that it is not totalitarian does not commend it to us at all, either for friendship or empathy. There is also little reason for friendship or empathy toward those the mullahs are now shooting. Our friends have long since been murdered. Many of those now being shot murdered them. If today’s victims could, they would make a pro-Soviet totalitarianism of Iran, and they might yet succeed in doing so.
Which brings me back to our friends with the insincere standard. It was not so long ago that there were demonstrations mounted and advertisements placed in the U.S. press against the Shah’s rule. The views of these protests were the orthodoxy of the day, in the circles in which I move, certainly. Reza Pahlevi was our top ally in West Asia and, well, his society was not a pluralistic democracy. You know the rest of the awful litany—not all false, though much of it exaggerated. What did the protesters want in Iran? What did they want that they weren’t saying even when they said they wanted only the departure of the Shah and the return of Khomeini from Paris? Did any but the stupidest really believe that the freedoms traduced by the Pahlevis would emerge in Iran once the Shah was out? The one possibility that this might have developed over the long haul would have been a U.S.-backed military action in support of Bakhtiar’s nascent bourgeois republic. But this is what the protesters, here and there, wanted least of all. That is what their agitation was aimed at preventing. They grasped the propaganda initiative, held it tenaciously in the Western media and the universities, drowning out the skepticism of others, and succeeded in having the U.S. withdraw its influence and power from the Iranian stage. Jimmy Carter and Cyrus Vance sat in the seats of the mighty then, and, whatever sparse virtues they possessed, these demonstrably did not include an appreciation of the stakes their timidity left for others, far more ruthless, to seize. By now, doubtless, most of the protest community in the U.S. would wish the rule of the imam to end. Isn’t it just a little bit embarrassing to Professor Richard Falk? Iran is on the verge of civil war. After the Islamic fundamentalists come the Marxist fundamentalists, under whom neither doctrinal heresy nor human rights nor life itself will be more secure.
Ignazio Silone wrote in 1955 of how treacherous an enterprise choosing one’s comrades can be. You are often betrayed. It is by now altogether too clear that many of the self-described human-rights activists in our country, so agitated about the abuse of freedoms in South Korea and Chile, in Zaire and the Philippines, and lately also in Egypt, are not really concerned for freedom at all. It is even clear that most of their campaigns for dissidents or for the oppressed are useful tactics (and only rarely inadvertent ones) of organized international movements ever so much more cruel than the regimes they oppose. But what right do these folk have—the people who trumpeted the victories of North Vietnam and of the Khmer Rouge as victories for humanity, who welcomed Khomeini’s revolution, and who think of themselves as comrades of Fidel Castro—to hector anyone about human rights?
The instance of Egypt as a prime target of human-rights activists is a case in point. Why the attention of recent years? Was it a surprise that Sadat did not head a pluralistic democracy? Is that, perchance, what he overthrew when he replaced Nasser or what he expelled when he threw out the Russians? To all these questions the answer is negative. Sadat’s crime was twofold, the second perhaps even more serious than the first. He put Egypt on the road to peace with Israel, and he linked the destiny of his people with the sway of the U.S. in the Middle East. For these reasons it became open season on Sadat and his country.
Sadat understood the cynical alliance between Muslim and Marxist fundamentalisms. He was unwilling to allow them to reenact the experience of Iran in Egypt, even if he offended Carl Rowan’s and the Nation’s devotion to habeas corpus (a principle never recognized in the Middle East except in Israel). So he moved to prevent the chaos which militant Islam can organize in the streets and from which Left totalitarians profit—but, alas, not decisively enough or fast enough.
Many of our allies’ societies are vulnerable to ruthlessly undemocratic forces. But freedom still has a better chance with these allies and us than with our foes, as freedom would have had a better chance with Bakhtiar backed by Carter than it does with Khomeini or whoever comes after him backed by Brezhnev. Americans should have an aggressive human-rights policy, but it should be one different and differently motivated from the policy of those now amply discredited givers of evil counsel who have won the recent arguments over Indochina and Iran.
1. The editors of COMMENTARY ask what role, if any, concern for human rights should play in American policy. In my view this is not a question which can be open to discussion. A concern for the advancement of human rights must be integral to the overall course of American foreign policy.
Indeed, it is precisely because the United States provides a humane and democratic alternative to the totalitarians and authoritarians of the world that many of us who are concerned with the advancement of human rights have been at the forefront of efforts aimed at stemming what we perceived to be America’s retreat from its international responsibilities. Without the moral prerogatives that are provided by an advocacy of human rights, the American role in the world would be limited to national interest in its most narrow sense.
America has been and remains a country which has a deep respect for fundamental human rights. This respect is an essential component of our system of governance and is the product of our values and traditions. These traditions can and must inform our conduct of foreign policy. Nonetheless, it is clear that the vast majority of mankind lives under systems which violate the principles upon which our nation was founded.
Realistically, we cannot hope to transform the world overnight. However, we must recognize that if we fail to play an assertive role within the world we will be abandoning it to the influences of totalitarians and authoritarians and thus condemning the world to a future that is bleak and cruel. Therefore if we believe in the universality of human rights we must also believe that the future of the world cannot be decided by those powers which most clearly deny human rights to their citizens.
It is clear that in furthering our national-security objectives and our national interest we frequently find ourselves allied to governments which share little or nothing of our conception of human rights. In this sense, conflicts between our democratic traditions and aspirations and the anti-democratic policies of our allies can and do arise.
Such conflicts are unavoidable and cannot be swept under the rug. Our President and our nation’s leaders must be free to speak the truth about the state of human rights throughout the world. However, such pronouncements must be informed by an awareness that expectations cannot be raised beyond the limits of the possible.
We must freely and openly defend democratic traditions and advocate universal adherence to the principles of human rights. Our policy in this regard must be most forceful in those situations in which a democratic alternative has a reasonable chance of attaining power. Furthermore, we must strive, whenever possible, to win the release of political prisoners.
In the case of strategic allies we must be careful that we are not aiding the coming to power of forces which not only are anti-democratic but anti-American as well. Thus, we must be aware that our right to speak out on human-rights matters carries with it the obligation of conducting an activist foreign policy which is committed to encouraging and aiding democratic forces and blocking the coming to power of those groupings which seek to impose systems of government which are even more inhuman than the systems they seek to replace.
In my view, such an advocacy of human rights, if it is tempered by a healthy sense of reality, is, ultimately, in the geopolitical interests of the U.S. The fact remains that the Soviet Union’s expansionism is integrally related to its internal suppression of human rights. Soviet intervention in Afghanistan, Soviet meddling in Africa, and the Soviet threat to Poland would not be possible if opponents of such policies within the Soviet Union were free to express their criticisms of their government’s actions. Moreover, it is equally true that the United States is allied with virtually all the democracies of the world and that the USSR enjoys the support of none. Further, while some undemocratic allies of the U.S. have succeeded in democratizing, no similar examples exist in the Soviet bloc. And while Poland may well be a watershed, the gains made by Solidarity are anything but secure.
Ultimately a concern for human rights, and a policy which has democratization as its aim, is a policy which is aimed at peace and stability. For any country which systematically denies human rights to its citizens cannot be regarded as stable and reliable. Thus, in helping to democratize the undemocratic states with which we are allied, we help in solidifying and strengthening our alliances and thus serve our own national-security interests.
2. A human-rights policy must reflect an understanding of the degree of freedom enjoyed in a given society. Thus, the distinction between authoritarianism and totalitarianism remains a real and important one. It is, pure and simple, a fact that societies such as the Soviet Union, Kampuchea, and the People’s Republic of China exert a significantly greater degree of social control over their citizens than do authoritarian societies. Yet for those who are political prisoners or prisoners of conscience, the distinction between authoritarianism and totalitarianism is academic.
Regrettably, most prisons are totalitarian institutions whether they are located on Dawson Island in Chile or in the Mordovian Autonomous Soviet Socialist Republic. And political torture is equally painful whether it is administered by a right or left hand. Some human-rights organizations in the West appear to have lost sight of this plain fact. Assessing the degree of freedom in a society by engaging in the cold calculus of instances of torture and the numbers of political prisoners in that society obscures the truth about how a given society functions outside the prison. Authoritarian societies which tolerate some press criticism, and do not seek to take over trade unions, religious associations, or the universities, undoubtedly provide a greater latitude to their citizens than do totalitarian societies. This is not the result of the benevolence of their leaders. Rather, it is because such authoritarian leaders are incapable of imposing total control.
To date, no totalitarian society has been transformed into a democracy. However, this is not to suggest that such transformations are impossible. Indeed, the events which we are observing in Poland suggest that we must not be blind to the possibilities of the democratization of totalitarian societies.
Our leaders must have an awareness of the distinctions between societies which exert pervasive or total control over their citizens and those societies which repress the manifestations of dissatisfaction and discontent. These distinctions should be used in the development of an overall strategy which has as its goal the extension of human rights and the advancement of democracy. The close relationship which appears to be emerging between the U.S. and the People’s Republic of China suggests that while the difference between totalitarians and authoritarians may be an important philosophical distinction to certain members of the new administration, it does not enter at all into the conduct of foreign affairs. It is both unrigorous and unfair to suggest, for example, that our policy in Latin America is based on a distinction between totalitarianism and authoritarianism, and then proceed to ignore this distinction in Asia.
3. In my view it is not useful to attempt to compare the Reagan administration’s approach to human rights with that of the Carter administration. Both approaches are somewhat flawed.
In his first months in office, President Reagan has been profoundly disappointing in his failure to highlight the issue of human rights. He has not used his office to express his concern for the treatment of political prisoners, East or West. And although he has articulated forcefully the nature of the Soviet threat, he has not succeeded in clearly indicating the moral dimensions of the threat to freedom which is posed by totalitarianism.
United States-Soviet relations are not merely a matter of a struggle between two superpowers. The conflict between the USSR and the United States is as much a conflict between fundamentally antithetical systems of beliefs and values. And just as President Carter proved that a human-rights policy cannot be effective in the absence of a coherent foreign policy, so President Reagan may well prove that a foreign policy without a human-rights dimension will be reduced to a matter of military strength.
The Reagan administration’s unfortunate handling of the aftermath of the Lefever nomination, its delay in naming a second candidate for the post of Assistant Secretary of State for Human Rights and Humanitarian Affairs, and, indeed, its frequent suggestions that the State Department Office for Human Rights should be abolished or renamed and weakened, are harmful to the image of the United States as an advocate and defender of democratic values and human rights. In this sense they are also harmful to the national interest. Human rights have not emerged as an integral component in the foreign policy of the new administration. Regrettably, human rights also were not an integral part of the Carter administration’s conduct of foreign affairs. In fact, the fundamental mistake of the Carter administration in this area was that the President behaved as if he believed that human rights were somehow above national interest and national security. Under Carter, human rights became something that was artificially grafted onto the conduct of foreign affairs.
In advocating human rights in a manner that was profoundly and indeed exclusively moral, Jimmy Carter detached them from the praxis of diplomacy. Carter’s pronouncements on human-rights matters had an almost homiletic character to them. They were, above all, ethical considerations, moral preachments, and not precise formulations of policy. They created a climate of raised expectations which could not be fulfilled and in point of fact were not fulfilled. In this way, by elevating the concept of human rights from the level of what was possible to the level of what was desirable, Carter succeeded in diminishing the concept of human rights. Yet despite the failures of the Carter foreign policy and the inconsistency with which human-rights criteria were applied, it is to the lasting credit of the former President that he succeeded in conveying to the world a sense that the United States is a staunch defender of personal liberties and human rights.
While the Carter human-rights policy was overly zealous and flawed, it nonetheless contrasted favorably with the curious amorality of the Nixon administration’s approach to human-rights matters. Even former Secretary of State Kissinger has admitted the shortcomings of the Nixon approach. He has asserted: “When I was in office, I believed—and perhaps excessively—that the best method in dealing with totalitarian systems was not to utilize a direct, visible means of confrontation but to use quiet diplomacy. . . . I would say in retrospect that more public statements might well have been a useful adjunct to that policy.”
Ford’s signing of the Helsinki Accords, and their clear mention of human rights, inspired a wave of activism by advocates of freedom in the Soviet bloc. In part, these advocates were further encouraged by Carter’s articulation of the universality of human rights. Moreover, the President’s statements succeeded in focusing increased attention in the Western media on the question of the suppression of human rights and in this way provided human-rights advocates in Eastern Europe with a broader audience. The Helsinki Monitoring Groups in Moscow, Lithuania, the Ukraine, Armenia, and Georgia, Charter 77 in Czechoslovakia, and the Polish Workers’ Defense Committee (KOR) were clearly the products of internal forces and of the problems plaguing each of these Soviet-bloc societies. Yet there can be no question that the emergence and continued activity of these groups was in part aided by the moral support that was provided by Carter’s strong and forceful articulation of human-rights concerns.
The Carter administration, however, profoundly disappointed many of these same human-rights advocates by abandoning its forceful criticism of the totalitarian denial of human rights at a time when it sought a SALT II agreement.
Carter’s term in office demonstrated that a human-rights policy cannot be substituted effectively for a strategic response to the Soviet threat. Nonetheless, we should not infer from that experience that to develop a strategic response is to disregard matters of individual and collective freedoms.
In the final analysis, concern for human rights can and should be a component of American foreign policy. Yet in order to be effective, it must be free of lurches from one extreme to another. America cannot afford the zigzags in policy which have accompanied the shift from the quiet diplomacy of Nixon to the moral diplomacy of Carter to the strategic diplomacy of Reagan. If we are to advance the cause of human rights, a balance must be struck. While we must continue to speak the truth about every country which violates human rights, we must recognize that our influence is limited.
The ultimate guarantors of human rights are the people of a given nation. Our role must be to make them understand that the United States is on their side.
This symposium is sponsored by the Harry Elson Commentary Fund and is being published in observance of the 75th Anniversary Year of the American Jewish Committee, the pioneer human-rights organization in the United States.
Human Rights and American Foreign Policy: A Symposium
It can be said that the Book of Samuel launched the American Revolution. Though antagonistic to traditional faith, Thomas Paine understood that it was not Montesquieu or Locke who was inscribed on the hearts of his fellow Americans. Paine’s pamphlet Common Sense is a biblical argument against British monarchy, drawing largely on the text of Samuel.
Today, of course, universal biblical literacy no longer exists in America, and sophisticated arguments from Scripture are all too rare. It is therefore all the more distressing when public intellectuals, academics, or religious leaders engage in clumsy acts of exegesis and political argumentation by comparing characters in the Book of Samuel to modern political leaders. The most common victim of this tendency has been the central character in the Book of Samuel: King David.
Most recently, this tendency was made manifest in the writings of Dennis Prager. In a recent defense of his own praise of President Trump, Prager wrote that “as a religious Jew, I learned from the Bible that God himself chose morally compromised individuals to accomplish some greater good. Think of King David, who had a man killed in order to cover up the adultery he committed with the man’s wife.” Prager similarly argued that those who refuse to vote for a politician whose positions are correct but whose personal life is immoral “must think God was pretty flawed in voting for King David.”
Prager’s invocation of King David was presaged on the left two decades ago. The records of the Clinton Presidential Library reveal that at the height of the Lewinsky scandal, an email from Dartmouth professor Susannah Heschel made its way into the inbox of an administration policy adviser with a similar comparison: “From the perspective of Jewish history, we have to ask how Jews can condemn President Clinton’s behavior as immoral, when we exalt King David? King David had Batsheva’s husband, Uriah, murdered. While David was condemned and punished, he was never thrown off the throne of Israel. On the contrary, he is exalted in our Jewish memory as the unifier of Israel.”
One can make the case for supporting politicians who have significant moral flaws. Indeed, America’s political system is founded on an awareness of the profound tendency to sinfulness not only of its citizens but also of its statesmen. “If men were angels, no government would be necessary,” James Madison informs us in the Federalist. At the same time, anyone who compares King David to the flawed leaders of our own age reveals a profound misunderstanding of the essential nature of David’s greatness. David was not chosen by God despite his moral failings; rather, David’s failings are the lens that reveal his true greatness. It is in the wake of his sins that David emerges as the paradigmatic penitent, whose quest for atonement is utterly unlike that of any other character in the Bible, and perhaps in the history of the world.
While the precise nature of David’s sins is debated in the Talmud, there is no question that they are profound. Yet it is in comparing David to other faltering figures—in the Bible or today—that the comparison falls flat. This point is stressed by the very Jewish tradition in whose name Prager claimed to speak.
It is the rabbis who note that David’s predecessor, Saul, lost the kingship when he failed to fulfill God’s command to destroy the egregiously evil nation of Amalek, whereas David commits more severe sins and yet remains king. The answer, the rabbis suggest, lies not in the sin itself but in the response. Saul, when confronted by the prophet Samuel, offers obfuscations and defensiveness. David, meanwhile, is similarly confronted by the prophet Nathan: “Thou hast killed Uriah the Hittite with the sword, and hast taken his wife to be thy wife, and hast slain him with the sword of the children of Ammon.” David’s immediate response is clear and complete contrition: “I have sinned against the Lord.” David’s penitence, Jewish tradition suggests, sets him apart from Saul. Soon after, David gave voice to what was in his heart at the moment, and gave the world one of the most stirring of the Psalms:
Have mercy upon me, O God, according to thy lovingkindness: according unto the multitude of thy tender mercies blot out my transgressions.
Wash me thoroughly from mine iniquity, and cleanse me from my sin. For I acknowledge my transgressions: and my sin is ever before me.
. . . Deliver me from bloodguiltiness, O God, thou God of my salvation: and my tongue shall sing aloud of thy righteousness.
O Lord, open thou my lips; and my mouth shall shew forth thy praise.
For thou desirest not sacrifice; else would I give it: thou delightest not in burnt offering.
The sacrifices of God are a broken spirit: a broken and a contrite heart, O God, thou wilt not despise.
The tendency to link David to our current age lies in the fact that we know more about David than any other biblical figure. The author Thomas Cahill has noted that in a certain literary sense, David is the only biblical figure that is like us at all. Prior to the humanist autobiographies of the Renaissance, he notes, “we can count only a few isolated instances of this use of ‘I’ to mean the interior self. But David’s psalms are full of I’s.” In David’s Psalms, Cahill writes, we “find a unique early roadmap to the inner spirit—previously mute—of ancient humanity.”
At the same time, a study of the Book of Samuel and of the Psalms reveals how utterly incomparable David is to anyone alive today. Haym Soloveitchik has noted that even the most observant of Jews today fail to feel a constant intimacy with God that the simplest Jew of the premodern age might have felt, that “while there are always those whose spirituality is one apart from that of their time, nevertheless I think it safe to say that the perception of God as a daily, natural force is no longer present to a significant degree in any sector of modern Jewry, even the most religious.” Yet for David, such intimacy with the divine was central to his existence, and the Book of Samuel and the Psalms are an eternal testament to this fact. This is why simple comparisons between David and ourselves, as tempting as they are, must be resisted. David Wolpe, in his book about David, attempts to make the case as to why King David’s life speaks to us today: “So versatile and enduring is David in our culture that rare is the week that passes without some public allusion to his life…We need to understand David better because we use his life to comprehend our own.”
The truth may be the opposite. We need to understand David better because we can use his life to comprehend what we are missing, and how utterly unlike our lives are to his own. For even the most religious among us have lost the profound faith and intimacy with God that David had. It is therefore incorrect to assume that because of David’s flaws it would have been, as Amos Oz has written, “fitting for him to reign in Tel Aviv.” The modern State of Israel was blessed with brilliant leaders, but to which of its modern warriors or statesmen should David be compared? To Ben Gurion, who stripped any explicit invocation of the Divine from Israel’s Declaration of Independence? To Moshe Dayan, who oversaw the reconquest of Jerusalem, and then immediately handed back the Temple Mount, the locus of King David’s dreams and desires, to the administration of the enemies of Israel? David’s complex humanity inspires comparison to modern figures, but his faith, contrition, and repentance—which lie at the heart of his story and success—defy any such engagement.
And so, to those who seek comparisons to modern leaders from the Bible, the best rule may be: Leave King David out of it.
Three attacks in Britain highlight the West’s inability to see the threat clearly
This lack of seriousness manifests itself in several ways. It’s perhaps most obvious in the failure to reform Britain’s chaotic immigration and dysfunctional asylum systems. But it’s also abundantly clear from the grotesque underfunding and under-resourcing of domestic intelligence. In MI5, Britain has an internal security service that is simply too small to do its job effectively, even if it were not handicapped by an institutional culture that can seem willfully blind to the ideological roots of the current terrorism problem.
In 2009, Jonathan Evans, then head of MI5, confessed at a parliamentary hearing about the London bus and subway attacks of 2005 that his organization only had sufficient resources to “hit the crocodiles close to the boat.” It was an extraordinary metaphor to use, not least because of the impression of relative impotence that it conveys. MI5 had by then doubled in size since 2001, but it still boasted a staff of only 3,500. Today it’s said to employ between 4,000 and 5,000, an astonishingly, even laughably, small number given a UK population of 65 million and the scale of the security challenges Britain now faces. (To be fair, the major British police forces all have intelligence units devoted to terrorism, and the UK government’s overall counterterrorism strategy involves a great many people, including social workers and schoolteachers.)
You can also see that unseriousness at work in the abject failure to coerce Britain’s often remarkably sedentary police officers out of their cars and stations and back onto the streets. Most of Britain’s big-city police forces have adopted a reactive model of policing (consciously rejecting both the New York Compstat model and British “bobby on the beat” traditions) that cripples intelligence-gathering and frustrates good community relations.
If that weren’t bad enough, Britain’s judiciary is led by jurists who came of age in the 1960s, and who have been inclined since 2001 to treat terrorism as an ordinary criminal problem being exploited by malign officials and politicians to make assaults on individual rights and to take part in “illegal” foreign wars. It has long been almost impossible to extradite ISIS or al-Qaeda–linked Islamists from the UK. This is partly because today’s English judges believe that few if any foreign countries—apart from perhaps Sweden and Norway—are likely to give terrorist suspects a fair trial, or able to guarantee that such suspects will be spared torture and abuse.
We have a progressive metropolitan media elite whose primary, reflexive response to every terrorist attack, even before the blood on the pavement is dry, is to express worry about an imminent violent anti-Muslim “backlash” on the part of a presumptively bigoted and ignorant indigenous working class. Never mind that no such “backlash” has yet occurred, not even when the young off-duty soldier Lee Rigby was hacked to death in broad daylight on a South London street in 2013.
Another sign of this lack of seriousness is the choice by successive British governments to deal with the problem of internal terrorism with marketing and “branding.” You can see this in the catchy consultant-created acronyms and pseudo-strategies that are deployed in place of considered thought and action. After every atrocity, the prime minister calls a meeting of the COBRA unit—an acronym that merely stands for Cabinet Office Briefing Room A but sounds like a secret organization of government superheroes. The government’s counterterrorism strategy is called CONTEST, which has four “work streams”: “Prevent,” “Pursue,” “Protect,” and “Prepare.”
Perhaps the ultimate sign of unseriousness is the fact that police, politicians, and government officials have all displayed more fear of being seen as “Islamophobic” than of any carnage that actual terror attacks might cause. Few are aware that this short-term, cowardly, and trivial tendency may ultimately foment genuine, dangerous popular Islamophobia, especially if attacks continue.
Recently, three murderous Islamist terror attacks in the UK took place in less than a month. The first and third were relatively primitive improvised attacks using vehicles and/or knives. The second was a suicide bombing that probably required relatively sophisticated planning, technological know-how, and the assistance of a terrorist infrastructure. As they were the first such attacks in the UK, the vehicle and knife killings came as a particular shock to the British press, public, and political class, despite the fact that non-explosive and non-firearm terror attacks have become common in Europe and are almost routine in Israel.
The success of all three plots indicates troubling problems in British law-enforcement practice and culture, quite apart from any other failings on the parts of the state in charge of intelligence, border control, and the prevention of radicalization. At the time of writing, the British media have been full of encomia to police courage and skill, not least because it took “only” eight minutes for an armed Metropolitan Police team to respond to and confront the bloody mayhem being wrought by the three Islamist terrorists (who had ploughed their rented van into people on London Bridge before jumping out to attack passersby with knives). But the difficult truth is that all three attacks would be much harder to pull off in Manhattan, not just because all NYPD cops are armed, but also because there are always police officers visibly on patrol at the New York equivalents of London’s Borough Market on a Saturday night. By contrast, London’s Metropolitan police is a largely vehicle-borne, reactive force; rather than use a physical presence to deter crime and terrorism, it chooses to monitor closed-circuit street cameras and social-media postings.
Since the attacks in London and Manchester, we have learned that several of the perpetrators were “known” to the police and security agencies that are tasked with monitoring potential terror threats. That these individuals were nevertheless able to carry out their atrocities is evidence that the monitoring regime is insufficient.
It also seems clear that there were failures on the part of those institutions that come under the leadership of the Home Office and are supposed to be in charge of the UK’s border, migration, and asylum systems. Journalists and think tanks like Policy Exchange and Migration Watch have for years pointed out that these systems are “unfit for purpose,” but successive governments have done little to take responsible control of Britain’s borders. When she was home secretary, Prime Minister Theresa May did little more than jazz up the name, logo, and uniforms of what is now called the “Border Force,” and she notably failed to put in place long-promised passport checks for people flying out of the country. This dereliction means that it is impossible for the British authorities to know who has overstayed a visa or whether individuals who have been denied asylum have actually left the country.
It seems astonishing that Youssef Zaghba, one of the three London Bridge attackers, was allowed back into the country. The Moroccan-born Italian citizen (his mother is Italian) had been arrested by Italian police in Bologna, apparently on his way to Syria via Istanbul to join ISIS. When questioned by the Italians about the ISIS decapitation videos on his mobile phone, he declared that he was “going to be a terrorist.” The Italians lacked sufficient evidence to charge him with a crime but put him under 24-hour surveillance, and when he traveled to London, they passed on information about him to MI5. Nevertheless, he was not stopped or questioned on arrival and had not become one of the 3,000 official terrorism “subjects of interest” for MI5 or the police when he carried out his attack. One reason Zaghba was not questioned on arrival may have been that he used one of the new self-service passport machines installed in UK airports in place of human staff after May’s cuts to the border force. Apparently, the machines are not yet linked to any government watch lists, thanks to the general chaos and ineptitude of the Home Office’s efforts to use information technology.
The presence in the country of Zaghba’s accomplice Rachid Redouane is also an indictment of the incompetence and disorganization of the UK’s border and migration authorities. He had been refused asylum in 2009, but as is so often the case, Britain’s Home Office never got around to removing him. Three years later, he married a British woman and was therefore able to stay in the UK.
But it is the failure of the authorities to monitor ringleader Khuram Butt that is the most baffling. He was a known and open associate of Anjem Choudary, Britain’s most notorious terrorist supporter, ideologue, and recruiter (he was finally imprisoned in 2016 after 15 years of campaigning on behalf of al-Qaeda and ISIS). Butt even appeared in a 2016 TV documentary about ISIS supporters called The Jihadist Next Door. In the same year, he assaulted a moderate imam at a public festival, after calling him a “murtad” or apostate. The imam reported the incident to the police—who took six months to track Butt down and then let him off with a caution. It is not clear if Butt was one of the 3,000 “subjects of interest” or the additional 20,000 former subjects of interest who continue to be the subject of limited monitoring. If he was not, it raises the question of what a person has to do to get British security services to take him seriously as a terrorist threat; if he was in fact on the list of “subjects of interest,” one has to wonder if being so designated is any barrier at all to carrying out terrorist atrocities. It’s worth remembering, as few do here in the UK, that terrorists who carried out previous attacks were also known to the police and security services and nevertheless enjoyed sufficient liberty to go at it again.
But the most important reason for the British state’s ineffectiveness in monitoring terror threats, which May addressed immediately after the London Bridge attack, is a deeply rooted institutional refusal to deal with or accept the key role played by Islamist ideology. For more than 15 years, the security services and police have chosen to take note only of people and bodies that explicitly espouse terrorist violence or have contacts with known terrorist groups. The fact that a person, school, imam, or mosque endorses the establishment of a caliphate, the stoning of adulterers, or the murder of apostates has not been considered a reason to monitor them.
This seems to be why Salman Abedi, the Manchester Arena suicide bomber, was not being watched by the authorities as a terror risk, even though he had punched a girl in the face for wearing a short skirt while at university, had attended the Muslim Brotherhood-controlled Didsbury Mosque, was the son of a Libyan man whose militia is banned in the UK, had himself fought against the Qaddafi regime in Libya, had adopted the Islamist clothing style (trousers worn above the ankle, beard but no moustache), was part of a druggy gang subculture that often feeds individuals into Islamist terrorism, and had been banned from a mosque after confronting an imam who had criticized ISIS.
It was telling that the day after the Manchester Arena suicide-bomb attack, you could hear a security official informing the audience of the BBC’s flagship morning-radio news show that it’s almost impossible to predict and stop such attacks because the perpetrators “don’t care who they kill.” They just want to kill as many people as possible, he said.
Surely, anyone with even a basic familiarity with Islamist terror attacks over the last 15 or so years and a nodding acquaintance with Islamist ideology could see that the terrorist hadn’t just chosen the Ariana Grande concert in Manchester Arena because a lot of random people would be crowded into a conveniently small area. Since the Bali bombings of 2002, nightclubs, discotheques, and pop concerts attended by shameless unveiled women and girls have been routinely targeted by fundamentalist terrorists, including in Britain. Among the worrying things about the opinion offered on the radio show was that it suggests that even in the wake of the horrific Bataclan attack in Paris during a November 2015 concert, British authorities may not have been keeping an appropriately protective eye on music venues and other places where our young people hang out in their decadent Western way. Such dereliction would make perfect sense given the resistance on the part of the British security establishment to examining, confronting, or extrapolating from Islamist ideology.
The same phenomenon may explain why authorities did not follow up on community complaints about Abedi. All too often when people living in Britain’s many and diverse Muslim communities want to report suspicious behavior, they have to do so through offices and organizations set up and paid for by the authorities as part of the overall “Prevent” strategy. Although criticized by the left as “Islamophobic” and inherently stigmatizing, Prevent has often brought the government into cooperative relationships with organizations even further to the Islamic right than the Muslim Brotherhood. This means that if you are a relatively secular Libyan émigré who wants to report an Abedi and you go to your local police station, you are likely to find yourself speaking to a bearded Islamist.
From its outset in 2003, the Prevent strategy was flawed. Its practitioners, in their zeal to find and fund key allies in “the Muslim community” (as if there were just one), routinely made alliances with self-appointed community leaders who represented the most extreme and intolerant tendencies in British Islam. Both the Home Office and MI5 seemed to believe that only radical Muslims were “authentic” and would therefore be able to influence young potential terrorists. Moderate, modern, liberal Muslims who are arguably more representative of British Islam as a whole (not to mention sundry Shiites, Sufis, Ahmadis, and Ismailis) have too often found it hard to get a hearing.
Sunni organizations that openly supported suicide-bomb attacks in Israel and India and that justified attacks on British troops in Iraq and Afghanistan nevertheless received government subsidies as part of Prevent. The hope was that in return, they would alert the authorities if they knew of individuals planning attacks in the UK itself.
It was a gamble reminiscent of British colonial practice in India’s northwest frontier and elsewhere. Not only were there financial inducements in return for grudging cooperation; the British state offered other, symbolically powerful concessions. These included turning a blind eye to certain crimes and antisocial practices such as female genital mutilation (there have been no successful prosecutions relating to the practice, though thousands of cases are reported every year), forced marriage, child marriage, polygamy, the mass removal of girls from school soon after they reach puberty, and the epidemic of racially and religiously motivated “grooming” rapes in cities like Rotherham. (At the same time, foreign jihadists—including men wanted for crimes in Algeria and France—were allowed to remain in the UK as long as their plots did not include British targets.)
This approach, simultaneously cynical and naive, was never as successful as its proponents hoped. Again and again, Muslim chaplains who were approved to work in prisons and other institutions have turned out to be Islamist extremists whose words have inspired inmates to join terrorist organizations.
Much to his credit, former Prime Minister David Cameron fought hard to change this approach, even though it meant difficult confrontations with his home secretary (Theresa May), as well as police and the intelligence agencies. However, Cameron’s efforts had little effect on the permanent personnel carrying out the Prevent strategy, and cooperation with Islamist but currently nonviolent organizations remains the default setting within the institutions on which the United Kingdom depends for security.
The failure to understand the role of ideology is one of imagination as well as education. Very few of those who make government policy or write about home-grown terrorism seem able to escape the limitations of what used to be called “bourgeois” experience. They assume that anyone willing to become an Islamist terrorist must perforce be materially deprived, or traumatized by the experience of prejudice, or provoked to murderous fury by oppression abroad. They have no sense of the emotional and psychic benefits of joining a secret terror outfit: the excitement and glamor of becoming a kind of Islamic James Bond, bravely defying the forces of an entire modern state. They don’t get how satisfying or empowering the vengeful misogyny of ISIS-style fundamentalism might seem for geeky, frustrated young men. Nor can they appreciate the appeal to the adolescent mind of apocalyptic fantasies of power and sacrifice (mainstream British society does not have much room for warrior dreams, given that its tone is set by liberal pacifists). Finally, they have no sense of why the discipline and self-discipline of fundamentalist Islam might appeal so strongly to incarcerated lumpen youth who have never experienced boundaries or real belonging. Their understanding is an understanding only of themselves, not of the people who want to kill them.
Review of 'White Working Class' by Joan C. Williams
Williams is a prominent feminist legal scholar with degrees from Yale, MIT, and Harvard. Unbending Gender, her best-known book, is the sort of tract you’d expect to find at an intersectionality conference or a Portlandia bookstore. This is why her insightful, empathic book comes as such a surprise.
Books and essays on the topic have accumulated into a highly visible genre since Donald Trump came on the American political scene; J.D. Vance’s Hillbilly Elegy planted itself at the top of bestseller lists almost a year ago and still isn’t budging. As with Vance, Williams’s interest in the topic is personal. She fell “madly in love with” and eventually married a Harvard Law School graduate who had grown up in an Italian neighborhood in pre-gentrification Brooklyn. Williams, on the other hand, is a “silver-spoon girl.” Her father’s family was moneyed, and her maternal grandfather was a prominent Reform rabbi.
The author’s affection for her “class-migrant” spouse and respect for his family’s hardships—“My father-in-law grew up on blood soup,” she announces in her opening sentence—add considerable warmth to what is at bottom a political pamphlet. Williams believes that elite condescension and “cluelessness” played a big role in Trump’s unexpected and dreaded victory. Enlightening her fellow elites is essential to the task of returning Trump voters to the progressive fold where, she is sure, they rightfully belong.
Liberals were not always so dense about the working class, Williams observes. WPA murals and movies like On the Waterfront showed genuine fellow feeling for the proletariat. In the 1970s, however, the liberal mood changed. Educated boomers shifted their attention to “issues of peace, equal rights, and environmentalism.” Instead of feeling the pain of Arthur Miller and John Steinbeck characters, they began sneering at the less enlightened. These days, she notes, elite sympathies are limited to the poor, people of color (POC), and the LGBTQ population. Despite clear evidence of suffering—stagnant wages, disappearing manufacturing jobs, declining health and well-being—the working class gets only fly-over snobbery at best and, more often, outright loathing.
Williams divides her chapters into a series of explainers addressing questions she has heard from her clueless friends and colleagues: “Why Does the Working Class Resent the Poor?” “Why Does the Working Class Resent Professionals but Admire the Rich?” “Why Doesn’t the Working Class Just Move to Where the Jobs Are?” “Is the Working Class Just Racist?” She weaves her answers into a compelling picture of a way of life and worldview foreign to her targeted readers. Working-class Americans have had to struggle for whatever stability and comfort they have, she explains. Clocking in for midnight shifts year after year, enduring capricious bosses, plant closures, and layoffs, they’re reliant on tag-team parenting and stressed-out relatives for child care. The campus go-to word “privileged” seems exactly wrong.
Proud of their own self-sufficiency and success, however modest, they don’t begrudge the self-made rich. It’s snooty professionals and the dysfunctional poor who get their goat. From their vantage point, subsidizing the day care for a welfare mother when they themselves struggle to manage care on their own dime mocks both their hard work and their beliefs. And since, unlike most professors, they shop in the same stores as the dependent poor, they’ve seen that some of them game the system. Of course that stings.
White Working Class is especially good at evoking the alternate economic and mental universe experienced by Professional and Managerial Elites, or “PMEs.” PMEs see their non-judgment of the poor, especially those who are “POC,” as a mark of their mature understanding that we live in an unjust, racist system whose victims require compassion regardless of whether they have committed any crime. At any rate, their passions lie elsewhere. They define themselves through their jobs and professional achievements, hence their obsession with glass ceilings.
Williams tells the story of her husband’s faux pas at a high-school reunion. Forgetting his roots for a moment, the Ivy League–educated lawyer asked one of his Brooklyn classmates a question that is the go-to opener in elite social settings: “What do you do?” Angered by what must have seemed like deliberate humiliation by this prodigal son, the man hissed: “I sell toilets.”
Instead of stability and backyard barbecues with family and long-time neighbors and maybe the occasional Olive Garden celebration, PMEs are enamored of novelty: new foods, new restaurants, new friends, new experiences. The working class chooses to spend its leisure in comfortable familiarity; for the elite, social life is a lot like networking. Members of the professional class may view themselves as sophisticated or cosmopolitan, but, Williams shows, to the blue-collar worker their glad-handing is closer to phony social climbing and their abstract, knowledge-economy jobs more like self-important pencil-pushing.
White Working Class has a number of proposals for creating the progressive future Williams would like to see. She wants to get rid of college-for-all dogma and improve training for middle-skill jobs. She envisions a working-class coalition of all races and ethnicities bolstered by civics education with a “distinctly celebratory view of American institutions.” In a saner political environment, some of this would make sense; indeed, she echoes some of Marco Rubio’s 2016 campaign themes. It’s little wonder White Working Class has already gotten the stink eye from liberal reviewers for its purported sympathies for racists.
Alas, impressive as Williams’s insights are, they do not always allow her to transcend her own class loyalties. Unsurprisingly, her own PME biases mostly come to light in her chapters on race and gender. She reduces immigration concerns to “fear of brown people,” even as she notes elsewhere that a quarter of Latinos also favor a wall at the southern border. This contrasts startlingly with her succinct observation that “if you don’t want to drive working-class whites to be attracted to the likes of Limbaugh, stop insulting them.” In one particularly obtuse moment, she asserts: “Because I study social inequality, I know that even Malia and Sasha Obama will be disadvantaged by race, advantaged as they are by class.” She relies on dubious gender theories to explain why the majority of white women voted for Trump rather than for his unfairly maligned opponent. That Hillary Clinton epitomized every elite quality Williams has just spent more than a hundred pages explicating escapes her notice. Williams’s own reflexive retreat into identity politics is itself emblematic of our toxic divisions, but it does not invalidate the power of this astute book.
When music could not transcend evil
The story of European classical music under the Third Reich is one of the most squalid chapters in the annals of Western culture, a chronicle of collective complaisance that all but beggars belief. Without exception, all of the well-known musicians who left Germany and Austria in protest when Hitler came to power in 1933 were either Jewish or, like the violinist Adolf Busch, Rudolf Serkin’s father-in-law, had close family ties to Jews. Moreover, most of the small number of non-Jewish musicians who emigrated later on, such as Paul Hindemith and Lotte Lehmann, are now known to have done so not out of principle but because they were unable to make satisfactory accommodations with the Nazis. Everyone else—including Karl Böhm, Wilhelm Furtwängler, Walter Gieseking, Herbert von Karajan, and Richard Strauss—stayed behind and served the Reich.
The Berlin and Vienna Philharmonics, then as now Europe’s two greatest orchestras, were just as willing to do business with Hitler and his henchmen, firing their Jewish members and ceasing to perform the music of Jewish composers. Even after the war, the Vienna Philharmonic was notorious for being the most anti-Semitic orchestra in Europe, and it was well known in the music business (though never publicly discussed) that Helmut Wobisch, the orchestra’s principal trumpeter and its executive director from 1953 to 1968, had been both a member of the SS and a Gestapo spy.
The management of the Berlin Philharmonic made no attempt to cover up the orchestra’s close relationship with the Third Reich, no doubt because the Nazi ties of Karajan, who was its music director from 1956 until shortly before his death in 1989, were a matter of public record. Yet it was not until 2007 that a full-length study of its wartime activities, Misha Aster’s The Reich’s Orchestra: The Berlin Philharmonic 1933–1945, was finally published. As for the Vienna Philharmonic, its managers long sought to quash all discussion of the orchestra’s Nazi past, steadfastly refusing to open its institutional archives to scholars until 2008, when Fritz Trümpi, an Austrian scholar, was given access to its records. Five years later, the Viennese, belatedly following the precedent of the Berlin Philharmonic, added a lengthy section to their website called “The Vienna Philharmonic Under National Socialism (1938–1945),” in which the damning findings of Trümpi and two other independent scholars were made available to the public.
Now Trümpi has published The Political Orchestra: The Vienna and Berlin Philharmonics During the Third Reich, in which he tells how they came to terms with Nazism, supplying pre- and postwar historical context for their transgressions.1 Written in a stiff mixture of academic jargon and translatorese, The Political Orchestra is ungratifying to read. Even so, the tale that it tells is both compelling and disturbing, especially to anyone who clings to the belief that high art is ennobling to the spirit.
Unlike the Vienna Philharmonic, which has always doubled as the pit orchestra for the Vienna State Opera, the Berlin Philharmonic started life in 1882 as a fully independent, self-governing entity. Initially unsubsidized by the state, it kept itself afloat by playing a grueling schedule of performances, including “popular” non-subscription concerts for which modest ticket prices were levied. In addition, the orchestra made records and toured internationally at a time when neither was common.
These activities made it possible for the Berlin Philharmonic to develop into an internationally renowned ensemble whose fabled collective virtuosity was widely seen as a symbol of German musical distinction. Furtwängler, the orchestra’s principal conductor, declared in 1932 that the German music in which it specialized was “one of the very few things that actually contribute to elevating [German] prestige.” Hence, he explained, the need for state subsidy, which he saw as “a matter of [national] prestige, that is, to some extent a requirement of national prudence.” By then, though, the orchestra was already heavily subsidized by the city of Berlin, thus paving the way for its takeover by the Nazis.
The Vienna Philharmonic, by contrast, had always been subsidized. Founded in 1842 when the orchestra of what was then the Vienna Court Opera decided to give symphonic concerts on its own, it performed the Austro-German classics for an elite cadre of longtime subscribers. By restricting membership to local players and their pupils, the orchestra cultivated what Furtwängler, who spent as much time conducting in Vienna as in Berlin, described as a “homogeneous and distinct tone quality.” At once dark and sweet, it was as instantly identifiable—and as characteristically Viennese—as the strong, spicy bouquet of a Gewürztraminer wine.
Unlike the Berlin Philharmonic, which played for whoever would pay the tab and programmed new music as a matter of policy, the Vienna Philharmonic chose not to diversify either its haute-bourgeois audience or its conservative repertoire. Instead, it played Beethoven, Brahms, Haydn, Mozart, and Schubert (and, later, Bruckner and Richard Strauss) in Vienna for the Viennese. Starting in the ’20s, the orchestra’s recordings consolidated its reputation as one of the world’s foremost instrumental ensembles, but its internal culture remained proudly insular.
What the two orchestras had in common was a nationalistic ethos, a belief in the superiority of Austro-German musical culture that approached triumphalism. One of the darkest manifestations of this ethos was their shared reluctance to hire Jews. The Berlin Philharmonic employed only four Jewish players in 1933, while the Vienna Philharmonic contained only 11 Jews at the time of the Anschluss, none of whom was hired after 1920. To be sure, such popular Jewish conductors as Otto Klemperer and Bruno Walter continued to work in Vienna for as long as they could. Two months before the Anschluss, Walter led and recorded a performance of the Ninth Symphony of Gustav Mahler, his musical mentor and fellow Jew, who from 1897 to 1907 had been the director of the Vienna Court Opera and one of the Philharmonic’s most admired conductors. But many members of both orchestras were open supporters of fascism, and not a few were anti-Semites who ardently backed Hitler. By 1942, 62 of the 123 active members of the Vienna Philharmonic were Nazi party members.
The admiration that Austro-German classical musicians had for Hitler is not entirely surprising since he was a well-informed music lover who declared in 1938 that “Germany has become the guardian of European culture and civilization.” He made the support of German art, music very much included, a key part of his political program. Accordingly, the Berlin Philharmonic was placed under the direct supervision of Joseph Goebbels, who ensured the cooperation of its members by repeatedly raising their salaries, exempting them from military service, and guaranteeing their old-age pensions. But there had never been any serious question of protest, any more than there would be among the members of the Vienna Philharmonic when the Nazis gobbled up Austria. Save for the Jews and one or two non-Jewish players who were fired for reasons of internal politics, the musicians went along unhesitatingly with Hitler’s desires.
With what did they go along? Above all, they agreed to the scrubbing of Jewish music from their programs and the dismissal of their Jewish colleagues. Some Jewish players managed to escape with their lives, but seven of the Vienna Philharmonic’s 11 Jews were either murdered by the Nazis or died as a direct result of official persecution. In addition, both orchestras performed regularly at official government functions and made tours and other public appearances for propaganda purposes, and both were treated as gems in the diadem of Nazi culture.
As for Furtwängler, the most prominent of the Austro-German orchestral conductors who served the Reich, his relationship to Nazism continues to be debated to this day. He had initially resisted the firing of the Berlin Philharmonic’s Jewish members and protected them for as long as he could. But he was also a committed (if woolly-minded) nationalist who believed that German music had “a different meaning for us Germans than for other nations” and notoriously declared in an open letter to Goebbels that “we all welcome with great joy and gratitude . . . the restoration of our national honor.” Thereafter he cooperated with the Nazis, by all accounts uncomfortably but—it must be said—willingly. A monster of egotism, he saw himself as the greatest living exponent of German music and believed it to be his duty to stay behind and serve a cause higher than what he took to be mere party politics. “Human beings are free wherever Wagner and Beethoven are played, and if they are not free at first, they are freed while listening to these works,” he naively assured a horrified Arturo Toscanini in 1937. “Music transports them to regions where the Gestapo can do them no harm.”
Once the war was over, the U.S. occupation forces decided to enlist the Berlin Philharmonic in the service of a democratic, anti-Soviet Germany. Furtwängler and Herbert von Karajan, who succeeded him as principal conductor, were officially “de-Nazified” and their orchestra allowed to function largely undisturbed, though six Nazi Party members were fired. The Vienna Philharmonic received similarly privileged treatment.
Needless to say, there was more to this decision than Cold War politics. No one questioned the unique artistic stature of either orchestra. Moreover, the Vienna Philharmonic, precisely because of its insularity, was now seen as a living museum piece, a priceless repository of 19th-century musical tradition. Still, many musicians and listeners, Jews above all, looked askance at both orchestras for years to come, believing them to be tainted by Nazism.
Indeed they were, so much so that they treated many of their surviving Jewish ex-members in a way that can only be described as vicious. In the most blatant individual case, the violinist Szymon Goldberg, who had served as the Berlin Philharmonic’s concertmaster under Furtwängler, was not allowed to reassume his post in 1945 and was subsequently denied a pension. As for the Vienna Philharmonic, the fact that it made Helmut Wobisch its executive director says everything about its deep-seated unwillingness to face up to its collective sins.
Be that as it may, scarcely any prominent musicians chose to boycott either orchestra. Leonard Bernstein went so far as to affect a flippant attitude toward the morally equivocal conduct of the Austro-German artists whom he encountered in Europe after the war. Upon meeting Herbert von Karajan in 1954, he actually told his wife Felicia that he had become “real good friends with von Karajan, whom you would (and will) adore. My first Nazi.”
At the same time, though, Bernstein understood what he was choosing to overlook. When he conducted the Vienna Philharmonic for the first time in 1966, he wrote to his parents:
I am enjoying Vienna enormously—as much as a Jew can. There are so many sad memories here; one deals with so many ex-Nazis (and maybe still Nazis); and you never know if the public that is screaming bravo for you might contain someone who 25 years ago might have shot me dead. But it’s better to forgive, and if possible, forget. The city is so beautiful, and so full of tradition. Everyone here lives for music, especially opera, and I seem to be the new hero.
Did Bernstein sell his soul for the opportunity to work with so justly renowned an orchestra—and did he get his price by insisting that its members perform the symphonies of Mahler, with which he was by then closely identified? It is a fair question, one that does not lend itself to easy answers.
Even more revealing is the case of Bruno Walter, who never forgave Furtwängler for staying behind in Germany, informing him in an angry letter that “your art was used as a conspicuously effective means of propaganda for the regime of the Devil.” Yet Walter’s righteous anger did not stop him from conducting in Vienna after the war. Born in Berlin, he had come to identify with the Philharmonic so closely that it was impossible for him to seriously consider quitting its podium permanently. “Spiritually, I was a Viennese,” he wrote in Theme and Variations, his 1946 autobiography. In 1952, he made a second recording with the Vienna Philharmonic of Mahler’s Das Lied von der Erde, whose premiere he had conducted in 1911 and which he had recorded in Vienna 15 years earlier. One wonders what Walter, who had converted to Christianity but had been driven out of both his native lands for the crime of being Jewish, made of the text of the last movement: “My friend, / On this earth, fortune has not been kind to me! / Where do I go?”
As for the two great orchestras of the Third Reich, both have finally acknowledged their guilt and been forgiven, at least by those who know little of their past. It would occur to no one to decline on principle to perform with either group today. Such a gesture would surely be condemned as morally ostentatious, an exercise in what we now call virtue-signaling. Yet it is impossible to forget what Samuel Lipman wrote in 1993 in Commentary apropos the wartime conduct of Furtwängler: “The ultimate triumph of totalitarianism, I suppose it can be said, is that under its sway only a martyred death can be truly moral.” For the only martyrs of the Berlin and Vienna Philharmonics were their Jews. The orchestras themselves live on, tainted and beloved.
He knows what to reveal and what to conceal, understands the importance of keeping the semblance of distance between oneself and the story of the day, and comprehends the ins and outs of anonymous sourcing. Within days of his being fired by President Trump on May 9, for example, little green men and women, known only as his “associates,” began appearing in the pages of the New York Times and Washington Post to dispute key points of the president’s account of his dismissal and to promote Comey’s theory of the case.
“In a Private Dinner, Trump Demanded Loyalty,” the New York Times reported on May 11. “Comey Demurred.” The story was a straightforward narrative of events from Comey’s perspective, capped with an obligatory denial from the White House. The next day, the Washington Post reported, “Comey associates dispute Trump’s account of conversations.” The Post did not identify Comey’s associates, other than saying that they were “people who have worked with him.”
Maybe they were the same associates who had gabbed to the Times. Or maybe they were different ones. Who can tell? Regardless, the story these particular associates gave to the Post was readable and gripping. Comey, the Post reported, “was wary of private meetings and discussions with the president and did not offer the assurance, as Trump has claimed, that Trump was not under investigation as part of the probe into Russian interference in last year’s election.”
On May 16, Michael S. Schmidt of the Times published his scoop, “Comey Memo Says Trump Asked Him to End Flynn Investigation.” Schmidt didn’t see the memo for himself. Parts of it were read to him by—you guessed it—“one of Mr. Comey’s associates.” The following day, Robert Mueller was appointed special counsel to oversee the Russia investigation. On May 18, the Times, citing “two people briefed” on a call between Comey and the president, reported, “Comey, Unsettled by Trump, Is Said to Have Wanted Him Kept at a Distance.” And by the end of that week, Comey had agreed to testify before the Senate Intelligence Committee.
As his testimony approached, Comey’s people became more aggressive in their criticisms of the president. “Trump Should Be Scared, Comey Friend Says,” read the headline of a CNN interview with Brookings Institution fellow Benjamin Wittes. This “Comey friend” said he was “very shocked” when he learned that President Trump had asked Comey for loyalty. “I have no doubt that he regarded the group of people around the president as dishonorable,” Wittes said.
Comey, Wittes added, was so uncomfortable at the White House reception in January honoring law enforcement—the one where Comey lumbered across the room and Trump whispered something in his ear—that, as CNN paraphrased it, he “stood in a position so that his blue blazer would blend in with the room’s blue drapes in an effort for Trump to not notice him.” The integrity, the courage—can you feel it?
On June 6, the day before Comey’s prepared testimony was released, more “associates” told ABC that the director would “not corroborate Trump’s claim that on three separate occasions Comey told the president he was not under investigation.” And a “source with knowledge of Comey’s testimony” told CNN the same thing. In addition, ABC reported that, according to “a source familiar with Comey’s thinking,” the former director would say that Trump’s actions stopped short of obstruction of justice.
Maybe those sources weren’t as “familiar with Comey’s thinking” as they thought or hoped? To maximize the press coverage he already dominated, Comey had authorized the Senate Intelligence Committee to release his testimony ahead of his personal interview. That testimony told a different story than what had been reported by CNN and ABC (and by the Post on May 12). Comey had in fact told Trump the president was not under investigation—on January 6, January 27, and March 30. Moreover, the word “obstruction” did not appear at all in his written text. The senators asked Comey if he felt Trump obstructed justice. He declined to answer either way.
My guess is that Comey’s associates lacked Comey’s scalpel-like, almost Jesuitical ability to make distinctions, and therefore misunderstood what he was telling them to say to the press. Because it’s obvious Comey was the one behind the stories of Trump’s dishonesty and bad behavior. He admitted as much in front of the cameras in a remarkable exchange with Senator Susan Collins of Maine.
Comey said that, after Trump tweeted on May 12 that he’d better hope there aren’t “tapes” of their conversations, “I asked a friend of mine to share the content of the memo with a reporter. Didn’t do it myself, for a variety of reasons. But I asked him to, because I thought that might prompt the appointment of a special counsel. And so I asked a close friend of mine to do it.”
Collins asked whether that friend had been Wittes, known to cable news junkies as Comey’s bestie. Comey said no. The source for the New York Times article was “a good friend of mine who’s a professor at Columbia Law School,” Daniel Richman.
Every time I watch or read that exchange, I am amazed. Here is the former director of the FBI just flat-out admitting that, for months, he wrote down every interaction he had with the president of the United States because he wanted a written record in case the president ever fired or lied about him. And when the president did fire and lie about him, that director set in motion a series of public disclosures with the intent of not only embarrassing the president, but also forcing the appointment of a special counsel who might end up investigating the president for who knows what. And none of this would have happened if the president had not fired Comey or tweeted about him. He told the Senate that if Trump hadn’t dismissed him, he most likely would still be on the job.
Rarely, in my view, are high officials so transparent in describing how Washington works. Comey revealed to the world that he was keeping a file on his boss, that he used go-betweens to get his story into the press, that “investigative journalism” is often just powerful people handing documents to reporters to further their careers or agendas or even to get revenge. And as long as you maintain some distance from the fallout, and stick to the absolute letter of the law, you will come out on top, so long as you have a small army of nightingales singing to reporters on your behalf.
“It’s the end of the Comey era,” A.B. Stoddard said on Special Report with Bret Baier the other day. On the contrary: I have a feeling that, as the Russia investigation proceeds, we will be hearing much more from Comey. And from his “associates.” And his “friends.” And persons “familiar with his thinking.”
In April, COMMENTARY asked a wide variety of writers, thinkers, and broadcasters to respond to this question: Is free speech under threat in the United States? We received twenty-seven responses. We publish them here in alphabetical order.
Floyd Abrams
Free expression threatened? By Donald Trump? I guess you could say so.
When a president engages in daily denigration of the press, when he characterizes it as the enemy of the people, when he repeatedly says that the libel laws should be “loosened” so he can personally commence more litigation, when he says that journalists shouldn’t be allowed to use confidential sources, it is difficult even to suggest that he has not threatened free speech. And when he says to the head of the FBI (as former FBI director James Comey has said that he did) that Comey should consider “putting reporters in jail for publishing classified information,” it is difficult not to take those threats seriously.
The harder question, though, is this: How real are the threats? Or, as Michael Gerson put it in the Washington Post: Will Trump “go beyond mere Twitter abuse and move against institutions that limit his power?” Some of the president’s threats against the institution of the press, wittingly or not, have been simply preposterous. Surely someone has told him by now that neither he nor Congress can “loosen” libel laws; while each state has its own libel law, there is no federal libel law and thus nothing for him to loosen. What he obviously takes issue with is the impact that the Supreme Court’s 1964 First Amendment opinion in New York Times v. Sullivan has had on state libel laws. The case determined that public officials who sue for libel may not prevail unless they demonstrate that the statements made about them were false and were made with actual knowledge or suspicion of that falsity. So his objection to the rules governing libel law is to nothing less than the application of the First Amendment itself.
In other areas, however, the Trump administration has far more power to imperil free speech. We live under an Espionage Act, adopted a century ago, which is both broad in its language and uncommonly vague in its meaning. As such, it remains a half-open door through which an administration that is hostile to free speech might walk. Such an administration could initiate criminal proceedings against journalists who write about defense- or intelligence-related topics on the basis that classified information was leaked to them by present or former government employees. No such action has ever been commenced against a journalist. Press lawyers and civil-liberties advocates have strong arguments that the law may not be read so broadly and still be consistent with the First Amendment. But the scope of the Espionage Act and the impact of the First Amendment upon its interpretation remain unknown.
A related area in which an administration's attitude toward the press can affect the latter's ability to serve as a check on government is the capacity of journalists to protect the identity of their confidential sources. The Obama administration prosecuted more Espionage Act cases against sources of information to journalists than all prior administrations combined. After a good deal of deserved press criticism, it agreed to expand the internal guidelines of the Department of Justice designed to limit the circumstances under which such source revelation is demanded. But the guidelines are none too protective and are, after all, simply guidelines. A new administration is free to change or limit them or, in fact, abandon them altogether. In this area, as in so many others, it is too early to judge the ultimate treatment of free expression by the Trump administration. But the threats are real, and there is good reason to be wary.
Floyd Abrams is the author of The Soul of the First Amendment (Yale University Press, 2017).
Ayaan Hirsi Ali
Freedom of speech is being threatened in the United States by a nascent culture of hostility to different points of view. As political divisions in America have deepened, a conformist mentality of “right thinking” has spread across the country. Increasingly, American universities, where no intellectual doctrine ought to escape critical scrutiny, are some of the most restrictive domains when it comes to asking open-ended questions on subjects such as Islam.
Legally, speech in the United States is protected to a degree unmatched in almost any industrialized country. The U.S. has avoided unpredictable Canadian-style restrictions on speech, for example. I remain optimistic that as long as we have the First Amendment in the U.S., any attempt at formal legal censorship will be vigorously challenged.
Culturally, however, matters are very different in America. The regressive left is the forerunner threatening free speech on any issue that is important to progressives. The current pressure coming from those who call themselves “social-justice warriors” is unlikely to lead to successful legislation to curb the First Amendment. Instead, censorship is spreading in the cultural realm, particularly at institutions of higher learning.
The way activists of the regressive left achieve silence or censorship is by creating a taboo, and one of the most pernicious taboos in operation today is the word “Islamophobia.” Islamists are similarly motivated to rule any critical scrutiny of Islamic doctrine out of order. There is now a university center (funded by Saudi money) in the U.S. dedicated to monitoring and denouncing incidents of “Islamophobia.”
The term “Islamophobia” is used against critics of political Islam, but also against progressive reformers within Islam. The term implies an irrational fear that is tainted by hatred, and it has had a chilling effect on free speech. In fact, “Islamophobia” is a poorly defined term. Islam is not a race, and it is very often perfectly rational to fear some expressions of Islam. No set of ideas should be beyond critical scrutiny.
To push back in this cultural realm—in our universities, in public discourse—those favoring free speech should focus more on the message of dawa, the set of ideas that the Islamists want to promote. If the aims of dawa are sufficiently exposed, ordinary Americans and Muslim Americans will reject it. The Islamist message is a message of divisiveness, misogyny, and hatred. It’s anachronistic and wants people to live by tribal norms dating from the seventh century. The best antidote to Islamic extremism is the revelation of what its primary objective is: a society governed by Sharia. This is the opposite of censorship: It is documenting reality. What is life like in Saudi Arabia, Iran, the Northern Nigerian States? What is the true nature of Sharia law?
Islamists want to hide the true meaning of Sharia, Jihad, and the implications for women, gays, religious minorities, and infidels under the veil of “Islamophobia.” Islamists use “Islamophobia” to obfuscate their vision and imply that any scrutiny of political Islam is hatred and bigotry. The antidote to this is more exposure and more speech.
As pressure on freedom of speech increases from the regressive left, we must reject the notions that only Muslims can speak about Islam, and that any critical examination of Islamic doctrines is inherently “racist.”
Instead of contorting Western intellectual traditions so as not to offend our Muslim fellow citizens, we need to defend the Muslim dissidents who are risking their lives to promote the human rights we take for granted: equality for women, tolerance of all religions and orientations, our hard-won freedoms of speech and thought.
It is by nurturing and protecting such speech that progressive reforms can emerge within Islam. By accepting the increasingly narrow confines of acceptable discourse on issues such as Islam, we do dissidents and progressive reformers within Islam a grave disservice. For truly progressive reforms within Islam to be possible, full freedom of speech will be required.
Ayaan Hirsi Ali is a research fellow at the Hoover Institution, Stanford University, and the founder of the AHA Foundation.
Lee C. Bollinger
I know it is too much to expect that political discourse mimic the measured, self-questioning, rational, footnoting standards of the academy, but there is a difference between robust political debate and political debate infected with fear or panic. The latter introduces a state of mind that is visceral and irrational. In the realm of fear, we move beyond the reach of reason and a sense of proportionality. When we fear, we lose the capacity to listen and can become insensitive and mean.
Our Constitution is well aware of this fact about the human mind and of its negative political consequences. In the First Amendment jurisprudence established over the past century, we find many expressions of the problematic state of mind that is produced by fear. Among the most famous and potent is that of Justice Brandeis in Whitney v. California in 1927, one of the many cases involving aggravated fears of subversive threats from abroad. “It is the function of (free) speech,” he said, “to free men from the bondage of irrational fears.” “Men feared witches,” Brandeis continued, “and burned women.”
Today, our “witches” are terrorists, and Brandeis’s metaphorical “women” include the refugees (mostly children) and displaced persons, immigrants, and foreigners whose lives have been thrown into suspension and doubt by policies of exclusion.
The same fears of the foreign that take hold of a population inevitably infect our internal interactions and institutions, yielding suppression of unpopular and dissenting voices, victimization of vulnerable groups, attacks on the media, and the rise of demagoguery, with its disdain for facts, reason, expertise, and tolerance.
All of this places a very special obligation on those of us within universities. Not only must we make the case in every venue for the values that form the core of who we are and what we do, but we must also live up to our own principles of free inquiry and fearless engagement with all ideas. This is why recent incidents on a handful of college campuses disrupting and effectively censoring speakers are so alarming. Such acts not only betray a basic principle but also inflame a rising prejudice against the academic community, and they feed efforts to delegitimize our work, at the very moment when it’s most needed.
I do not for a second support the view that this generation has an unhealthy aversion to engaging differences of opinion. That is a modern trope of polarization, as is the portrayal of universities as hypocritical about academic freedom and political correctness. But now, in this environment especially, universities must be at the forefront of defending the rights of all students and faculty to listen to controversial voices, to engage disagreeable viewpoints, and to make every effort to demonstrate our commitment to the sort of fearless and spirited debate that we are simultaneously asking of the larger society. Anyone with a voice can shout over a speaker; but being able to listen to and then effectively rebut those with whom we disagree—particularly those who themselves peddle intolerance—is one of the greatest skills our education can bestow. And it is something our democracy desperately needs more of. That is why, I say to you now, if speakers who are being denied access to other campuses come here, I will personally volunteer to introduce them, and listen to them, however much I may disagree with them. But I will also never hesitate to make clear why I disagree with them.
Lee C. Bollinger is the 19th president of Columbia University and the author of Uninhibited, Robust, and Wide-Open: A Free Press for a New Century. This piece has been excerpted from President Bollinger’s May 17 commencement address.
Richard A. Epstein
Today, the greatest threat to the constitutional protection of freedom of speech comes from campus rabble-rousers who invoke this very protection. In their book, the speech of people like Charles Murray and Heather Mac Donald constitutes a form of violence, bordering on genocide, that receives no First Amendment protection. Enlightened protestors are both bound and entitled to shout them down, by force or other disruptive actions, if their universities are so foolish as to extend them an invitation to speak. Any indignant minority may take the law into its own hands to eradicate the intellectual cancer before it spreads on its own campus.
By such tortured logic, a new generation of vigilantes distorts the First Amendment doctrine: Speech becomes violence, and violence becomes heroic acts of self-defense. The standard First Amendment interpretation emphatically rejects that view. Of course, the First Amendment doesn’t let you say what you want when and wherever you want to. Your freedom of speech is subject to the same limitations as your freedom of action. So you have no constitutional license to assault other people, to lie to them, or to form cartels to bilk them in the marketplace. But folks such as Murray, Mac Donald, and even Yiannopoulos do not come close to crossing into that forbidden territory. They are not using, for example, “fighting words,” rightly limited to words or actions calculated to provoke immediate aggression against a known target. Fighting words are worlds apart from speech that provokes a negative reaction in those who find your speech offensive solely because of the content of its message.
This distinction is central to the First Amendment. Fighting words have to be blocked by well-tailored criminal and civil sanctions lest some people gain license to intimidate others from speaking or peaceably assembling. The remedy for mere offense is to speak one’s mind in response. But it never gives anyone the right to block the speech of others, lest everyone be able to unilaterally increase his sphere of action by getting really angry about the beliefs of others. No one has the right to silence others by working himself into a fit of rage.
Obviously, it is intolerable to let mutual animosity generate factional warfare, whereby everyone can use force to silence rivals. To avoid this war of all against all, each side claims that only its actions are privileged. These selective claims quickly degenerate into a form of viewpoint discrimination, which undermines one of the central protections that traditional First Amendment law erects: a wall against each and every group out to destroy the level playing field on which robust political debate rests. Every group should be at risk for having its message fall flat. The new campus radicals want to upend that understanding by shutting down their adversaries if their universities do not. Their aggression must be met, if necessary, by counterforce. Silence in the face of aggression is not an acceptable alternative.
Richard A. Epstein is the Laurence A. Tisch Professor of Law at the New York University School of Law.
David French
We’re living in the midst of a troubling paradox. At the exact same time that First Amendment jurisprudence has arguably never been stronger and more protective of free expression, millions of Americans feel they simply can’t speak freely. Indeed, talk to Americans living and working in the deep-blue confines of the academy, Hollywood, and the tech sector, and you’ll get a sense of palpable fear. They’ll explain that they can’t say what they think and keep their jobs, their friends, and sometimes even their families.
The government isn’t cracking down or censoring; instead, Americans are using free speech to destroy free speech. For example, a social-media shaming campaign is an act of free speech. So is an economic boycott. So is turning one’s back on a public speaker. So is a private corporation firing a dissenting employee for purely political reasons. Each of these actions is largely protected from government interference, and each one represents an expression of the speaker’s ideas and values.
The problem, however, is obvious. The goal of each of these kinds of actions isn’t to persuade; it’s to intimidate. The goal isn’t to foster dialogue but to coerce conformity. The result is a marketplace of ideas that has been emptied of all but the approved ideological vendors—at least in those communities that are dominated by online thugs and corporate bullies. Indeed, this mindset has become so prevalent that in places such as Portland, Berkeley, Middlebury, and elsewhere, the bullies and thugs have crossed the line from protected—albeit abusive—speech into outright shout-downs and mob violence.
But there’s something else going on, something that’s insidious in its own way. While politically correct shaming still has great power in deep-blue America, its effect in the rest of the country is to trigger a furious backlash, one characterized less by a desire for dialogue and discourse than by its own rage and scorn. So we’re moving toward two Americas—one that ruthlessly (and occasionally illegally) suppresses dissenting speech and the other that is dangerously close to believing that the opposite of political correctness isn’t a fearless expression of truth but rather the fearless expression of ideas best calculated to enrage your opponents.
The result is a partisan feedback loop where right-wing rage spurs left-wing censorship, which spurs even more right-wing rage. For one side, a true free-speech culture is a threat to feelings, sensitivities, and social justice. The other side waves high the banner of “free speech” to sometimes elevate the worst voices to the highest platforms—not so much to protect the First Amendment as to infuriate the hated “snowflakes” and trigger the most hysterical overreactions.
The culturally sustainable argument for free speech is something else entirely. It reminds the cultural left of its own debt to free speech while reminding the political right that a movement allegedly centered around constitutional values can’t abandon the concept of ordered liberty. The culture of free speech thrives when all sides remember their moral responsibilities—to both protect the right of dissent and to engage in ideological combat with a measure of grace and humility.
David French is a senior writer at National Review.
Pamela Geller
The real question isn’t whether free speech is under threat in the United States, but rather, whether it’s irretrievably lost. Can we get it back? Not without war, I suspect, as is evidenced by the violence at colleges whenever there’s the shamefully rare event of a conservative speaker on campus.
Free speech is the soul of our nation and the foundation of all our other freedoms. If we can’t speak out against injustice and evil, those forces will prevail. Freedom of speech is the foundation of a free society. Without it, a tyrant can wreak havoc unopposed, while his opponents are silenced.
With that principle in mind, I organized a free-speech event in Garland, Texas. The world had recently been rocked by the murder of the Charlie Hebdo cartoonists. My version of “Je Suis Charlie” was an event here in America to show that we can still speak freely and draw whatever we like in the Land of the Free. Yet even after jihadists attacked our event, I was blamed—by Donald Trump among others—for provoking Muslims. And if I tried to hold a similar event now, no arena in the country would allow me to do so—not just because of the security risk, but because of the moral cowardice of all intellectual appeasers.
Under what law is it wrong to depict Muhammad? Under Islamic law. But I am not a Muslim, I don’t live under Sharia. America isn’t under Islamic law, yet for standing for free speech, I’ve been:
- Prevented from running our advertisements in every major city in this country. We have won free-speech lawsuits all over the country, which officials circumvent by prohibiting all political ads (while making exceptions for ads from Muslim advocacy groups);
- Shunned by the right, shut out of the Conservative Political Action Conference;
- Shunned by Jewish groups at the behest of terror-linked groups such as the Council on American-Islamic Relations;
- Blacklisted from speaking at universities;
- Prevented from publishing books, for security reasons and because publishers fear shaming from the left;
- Banned from Britain.
A Seattle court accused me of trying to shut down free speech after we merely tried to run an FBI poster on global terrorism, because authorities had banned all political ads in other cities to avoid running ours. Seattle blamed us for that, which was like blaming a woman for being raped because she was wearing a short skirt.
This kind of vilification and shunning is key to the left’s plan to shut down all dissent from its agenda—they make legislation restricting speech unnecessary.
The same refusal to allow our point of view to be heard has manifested itself elsewhere. The foundation of my work is individual rights and equality for all before the law. These are the foundational principles of our constitutional republic. That is now considered controversial. Truth is the new hate speech. Truth is going to be criminalized.
The First Amendment doesn’t only protect ideas that are sanctioned by the cultural and political elites. If “hate speech” laws are enacted, who would decide what’s permissible and what’s forbidden? The government? The gunmen in Garland?
There has been an inversion of the founding premise of this nation. No longer is it the subordination of might to right, but right to might. History is repeatedly deformed with the bloody consequences of this transition.
Pamela Geller is the editor in chief of the Geller Report and president of the American Freedom Defense Initiative.
Jonah Goldberg
Of course free speech is under threat in America. Frankly, it’s always under threat in America because it’s always under threat everywhere. Ronald Reagan was right when he said in 1961, “Freedom is never more than one generation away from extinction. We didn’t pass it on to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same.”
This is more than political boilerplate. Reagan identified the source of the threat: human nature. God may have endowed us with a right to liberty, but he didn’t give us all a taste for it. As with most finer things, we must work to acquire a taste for it. That is what civilization—or at least our civilization—is supposed to do: cultivate attachments to certain ideals. “Cultivate” shares the same Latin root as “culture,” cultus, and properly understood they mean the same thing: to grow, nurture, and sustain through labor.
In the past, threats to free speech have taken many forms—nationalist passion, Comstockery (both good and bad), political suppression, etc.—but the threat to free speech today is different. It is less top-down and more bottom-up. We are cultivating a generation of young people to reject free speech as an important value.
One could mark the beginning of the self-esteem movement with Nathaniel Branden’s 1969 paper, “The Psychology of Self-Esteem,” which claimed that “feelings of self-esteem were the key to success in life.” This understandable idea ran amok in our schools and in our culture. When I was a kid, Saturday-morning cartoons were punctuated with public-service announcements telling kids: “The most important person in the whole wide world is you, and you hardly even know you!”
The self-esteem craze was just part of the cocktail of educational fads. Other ingredients included multiculturalism, the anti-bullying crusade, and, of course, that broad phenomenon known as “political correctness.” Combined, they’ve produced a generation that rejects the old adage “sticks and stones can break my bones but words can never harm me” in favor of the notion that “words hurt.” What we call political correctness has been on college campuses for decades. But it lacked a critical mass of young people who were sufficiently receptive to it to make it a fully successful ideology. The campus commissars welcomed the new “snowflakes” with open arms; truly, these are the ones we’ve been waiting for.
“Words hurt” is a fashionable concept in psychology today. (See Psychology Today: “Why Words Can Hurt at Least as Much as Sticks and Stones.”) But it’s actually a much older idea than the “sticks and stones” aphorism. For most of human history, it was a crime to say insulting or “injurious” things about aristocrats, rulers, the Church, etc. That tendency didn’t evaporate with the Divine Right of Kings. Jonathan Haidt has written at book length about our natural capacity to create zones of sanctity, immune from reason.
And that is the threat free speech faces today. Those who inveigh against “hate speech” are in reality fighting “heresy speech”—ideas that do “violence” to sacred notions of self-esteem, racial or gender equality, climate change, and so on. Put whatever label you want on it, contemporary “social justice” progressivism acts as a religion, and it has no patience for blasphemy.
When Napoleon’s forces converted churches into stables, the clergy did not object on the grounds that regulations regarding the proper care and feeding of animals had been violated. They complained of sacrilege and blasphemy. When Charles Murray or Christina Hoff Sommers visits college campuses, the protestors are behaving like the zealous acolytes of St. Jerome. Appeals to the First Amendment have as much power over the “antifa” fanatics as appeals to Odin did to champions of the New Faith.
That is the real threat to free speech today.
Jonah Goldberg is a senior editor at National Review and a fellow at the American Enterprise Institute.
KC Johnson
In early May, the Washington Post urged universities to make clear that “racist signs, symbols, and speech are off-limits.” Given the extraordinarily broad definition of what constitutes “racist” speech at most institutions of higher education, this demand would single out most right-of-center (and, in some cases, even centrist and liberal) discourse on issues of race or ethnicity. The editorial provided the highest-profile example of how hostility to free speech, once confined to the ideological fringe on campus, has migrated to the liberal mainstream.
The last few years have seen periodic college protests—featuring claims that significant amounts of political speech constitute “violence,” thereby justifying censorship—followed by even more troubling attempts to appease the protesters. After the mob scene that greeted Charles Murray upon his visit to Middlebury College, for instance, the student government criticized any punishment for the protesters, and several student leaders wanted to require that future speakers conform to the college’s “community standard” on issues of race, gender, and ethnicity. In the last few months, similar attempts to stifle the free exchange of ideas in the name of promoting diversity occurred at Wesleyan, Claremont McKenna, and Duke. Offering an extreme interpretation of this point of view, one CUNY professor recently dismissed dialogue as “inherently conservative,” since it reinforced the “relations of power that presently exist.”
It’s easy, of course, to dismiss campus hostility to free speech as affecting only a small segment of American public life—albeit one that trains the next generation of judges, legislators, and voters. But, as Jonathan Chait observed in 2015, denying “the legitimacy of political pluralism on issues of race and gender” has broad appeal on the left. It is only most apparent on campus because “the academy is one of the few bastions of American life where the political left can muster the strength to impose its political hegemony upon others.” During his time in office, Barack Obama generally urged fellow liberals to support open intellectual debate. But the current campus environment previews the position of free speech in a post-Obama Democratic Party, increasingly oriented around identity politics.
Waning support on one end of the ideological spectrum for this bedrock American principle should provide a political opening for the other side. The Trump administration, however, seems poorly suited to make the case. Throughout his public career, Trump has rarely supported free speech, even in the abstract, and has periodically embraced legal changes to facilitate libel lawsuits. Moreover, the right-wing populism that motivates Trump’s base has a long tradition of ideological hostility to civil liberties of all types. Even in campus contexts, conservatives have defended free speech inconsistently, as seen in recent calls that CUNY disinvite anti-Zionist fanatic Linda Sarsour as a commencement speaker.
In a sharply polarized political environment, awash in dubiously sourced information, free speech is all the more important. Yet this same environment has seen both sides, most blatantly elements of the left on campuses, demand restrictions on their ideological foes’ free speech in the name of promoting a greater good.
KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.
Laura Kipnis
I find myself with a strange-bedfellows problem lately. Here I am, a left-wing feminist professor invited onto the pages of Commentary—though I’d be thrilled if it were still 1959—while fielding speaking requests from right-wing think tanks and libertarians who oppose child-labor laws.
Somehow I’ve ended up in the middle of the free-speech-on-campus debate. My initial crime was publishing a somewhat contentious essay about campus sexual paranoia that put me on the receiving end of Title IX complaints. Apparently I’d created a “hostile environment” at my university. I was investigated (for 72 days). Then I wrote up what I’d learned about these campus inquisitions in a second essay. Then I wrote about it all some more, in a book exposing the kangaroo-court elements of the Title IX process—and the extra-legal gag orders imposed on everyone caught in its widening snare.
I can’t really comment on whether more charges have been filed against me over the book. I’ll just say that writing about being a Title IX respondent could easily become a life’s work. I learned, shortly after writing this piece, that I and my publisher were being sued for defamation, among other things.
Is free speech under threat on American campuses? Yes. We know all about student activists who wish to shut down talks by people with opposing views. I got smeared with a bit of that myself, after a speaking invitation at Wellesley—some students made a video protesting my visit before I arrived. The talk went fine, though a group of concerned faculty circulated an open letter afterward also protesting the invitation: My views on sexual politics were too heretical, and might have offended students.
I didn’t take any of this too seriously, even as right-wing pundits crowed, with Wellesley as their latest outrage bait. It was another opportunity to mock student activists, and the fact that I was myself a feminist rather than a Charles Murray or a Milo Yiannopoulos made them positively gleeful.
I do find myself wondering where all my new free-speech pals were when another left-wing professor, Steven Salaita, was fired (or if you prefer euphemism, “his job offer was withdrawn”) from the University of Illinois after he tweeted criticism of Israel’s Gaza policy. Sure, the tweets were hyperbolic, but hyperbole and strong opinions are protected speech, too.
I guess free speech is easy to celebrate until it actually challenges something. Funny, I haven’t seen Milo around lately—so beloved by my new friends when he was bashing minorities and transgender kids. Then he mistakenly said something authentic (who knew he was capable of it!), reminiscing about an experience a lot of gay men have shared: teenage sex with older men. He tried walking it back—no, no, he’d been a victim, not a participant—but his fan base was shrieking about pedophilia and fleeing in droves. Gee, they were all so against “political correctness” a few minutes before.
It’s easy to be a free-speech fan when your feathers aren’t being ruffled. No doubt what makes me palatable to the anti-PC crowd is having thus far failed to ruffle them enough. I’m just going to have to work harder.
Laura Kipnis’s latest book is Unwanted Advances: Sexual Paranoia Comes to Campus.
Eugene Kontorovich
The free and open exchange of views—especially politically conservative or traditionally religious ones—is being challenged. This is taking place not just at college campuses but throughout our public spaces and cultural institutions. James Watson was fired from the lab he had led since 1968 and could not speak at New York University because of petty, censorious students who would not know DNA from LSD. Our nation’s founders and heroes are being “disappeared” from public commemoration, like Trotsky from a photograph of Soviet rulers.
These attacks on “free speech” are not the result of government action. They are not what the First Amendment protects against. The current methods—professional and social shaming, exclusion, and employment termination—are more inchoate, and their effects are multiplied by self-censorship. A young conservative legal scholar might find himself thinking: “If the late Justice Antonin Scalia can posthumously be deemed a ‘bigot’ by many academics, what chance have I?”
Ironically, artists and intellectuals have long prided themselves on being the first defenders of free speech. Today, it is the institutions of both popular and high culture that are the censors. Is there one poet in the country who would speak out for Ann Coulter?
The inhibition of speech at universities is part of a broader social phenomenon of making longstanding, traditional views and practices sinful overnight. Conservatives have not put up much resistance to this. To paraphrase Martin Niemöller’s famous dictum: “First they came for Robert E. Lee, and I said nothing, because Robert E. Lee meant nothing to me.”
The situation with respect to Israel and expressions of support for it deserves separate discussion. Even as university administrators give political power to favored ideologies by letting them create “safe spaces” (safe from opposing views), Jews find themselves and their state at the receiving end of claims of apartheid—modern-day blood libels. It is not surprising if Jewish students react by demanding that they get a safe space of their own. It is even less surprising if their parents, paying $65,000 a year, want their children to have a nicer time of it. One hears Jewish groups frequently express concern about Jewish students feeling increasingly isolated and uncomfortable on campus.
But demanding selective protection from the new ideological commissars is unlikely to bring the desired results. First, this new ideology, even if it can be harnessed momentarily to give respite to harassed Jews on campus, is ultimately illiberal and will be controlled by “progressive” forces. Second, it is not so terrible for Jews in the Diaspora to feel a bit uncomfortable. It has been the common condition of Jews throughout the millennia. The social awkwardness that Jews at liberal arts schools might feel in being associated with Israel is of course one of the primary justifications for the Jewish State. Facing the snowflakes incapable of hearing a dissonant view—but who nonetheless, in the grip of intersectional ecstasy, revile Jewish self-determination—Jewish students should toughen up.
Eugene Kontorovich teaches constitutional law at Northwestern University and heads the international law department of the Kohelet Policy Forum in Jerusalem.
Nicholas Lemann
There’s an old Tom Wolfe essay in which he describes being on a panel discussion at Princeton in 1965 and provoking the other panelists by announcing that America, rather than being in crisis, was in the middle of a “happiness explosion.” He was arguing that the mass effects of 20 years of post–World War II prosperity made for a larger phenomenon than the Vietnam War, the racial crisis, and the other primary concerns of intellectuals at the time.
In the same spirit, I’d say that we are in the middle of a free-speech explosion, because of 20-plus years of the Internet and 10-plus years of social media. If one understands speech as disseminated individual opinion, then surely we live in the free-speech-est society in the history of the world. Anybody with access to the unimpeded World Wide Web can say anything to a global audience, and anybody can hear anything, too. All threats to free speech should be understood in the context of this overwhelming reality.
It is a comforting fantasy that a genuine free-speech regime will empower mainly “good,” but previously repressed, speech. Conversely, repressive regimes that are candid enough to explain their anti-free-speech policies usually say that they’re not against free speech, just “bad” speech. We have to accept that more free speech probably means, in the aggregate, more bad speech, and also a weakening of the power, authority, and economic support for information professionals such as journalists. Welcome to the United States in 2017.
I am lucky enough to live and work on the campus of a university, Columbia, that has been blessedly free of successful attempts to repress free speech. Just in the last few weeks, Charles Murray and Dinesh D’Souza have spoken here without incident. But, yes, the evidently growing popularity of the idea that “hate speech” shouldn’t be permitted on campuses is a problem, especially, it seems, at small private liberal-arts colleges. We should all do our part, and I do, by frequently and publicly endorsing free-speech principles. Opposing the BDS movement falls squarely into that category.
It’s not just on campuses that free-speech vigilance is needed, though. The number-one threat to free speech, to my mind, is that the wide-open Web has been replaced by privately owned platforms such as Facebook and Google as the way most people experience the public life of the Internet. These companies are committed to banning “hate speech,” and they are eager to operate freely in countries, like China, that don’t permit free political speech. That makes for a far more consequential constrained environment than any campus’s speech code.
Also, Donald Trump regularly engages in presidentially unprecedented rhetoric demonizing people who disagree with him. He seems to think this is all in good fun, but, as we have already seen at his rallies, not everybody hears it that way. The place where Trumpism will endanger free speech isn’t in the center—the White House press room—but at the periphery, for example in the way that local police handle bumptious protestors and the journalists covering them. This is already happening around the country. If Trump were as disciplined and knowledgeable as Vladimir Putin or Recep Tayyip Erdogan, which so far he seems not to be, then free speech could be in even more serious danger from government, which in most places is its usual main enemy.
Nicholas Lemann is a professor at Columbia Journalism School and a staff writer for the New Yorker.
Michael J. Lewis
Free speech is a right but it is also a habit, and where the habit shrivels so will the right. If free speech today is in headlong retreat—everywhere threatened by regulation, organized harassment, and even violence—it is in part because our political culture allowed the practice of persuasive oratory to atrophy. The process began in 1973, an unforeseen side effect of Roe v. Wade. Legislators were delighted to learn that by relegating this divisive matter of public policy to the Supreme Court and adopting a merely symbolic position, they could sit all the more safely in their safe seats.
Since then, one crucial question of public policy after another has been punted out of the realm of politics and into that of the courts. Issues that might have been debated with all the rhetorical agility of a Lincoln and a Douglas, and then subjected to a process of negotiation, compromise, and voting, have instead been settled by decree: e.g., Chevron, Kelo, Obergefell. The consequences for speech have been pernicious. Since the time of Pericles, deliberative democracy has been predicated on the art of persuasion, which demands the forceful clarity of thought and expression without which no one has ever been persuaded. But a legislature that relegates its authority to judges and regulators will awaken to discover its oratorical culture has been stunted. When politicians, rather than seeking to convince and win over, prefer to project a studied and pleasant vagueness, debate withers into tedious defensive performance. It has been decades since any presidential debate has seen any sustained give and take over a matter of policy. If there is any suspense at all, it is only the possibility that a fatigued or peeved candidate might blurt out that tactless shard of truth known as a gaffe.
A generation accustomed to hearing platitudes smoothly dispensed from behind a teleprompter will find the speech of a fearless extemporaneous speaker to be startling, even disquieting; unfamiliar ideas always are. Unhappily, they have been taught to interpret that disquiet as an injury done to them, rather than as a premise offered to them to consider. All this would not have happened—certainly not to this extent—had not our deliberative democracy decided a generation ago that it preferred the security of incumbency to the risks of unshackled debate. The compulsory contraction of free speech on college campuses is but the logical extension of the voluntary contraction of free speech in our political culture.
Michael J. Lewis’s new book is City of Refuge: Separatists and Utopian Town Planning (Princeton University Press).
Heather Mac Donald
The answer to the symposium question depends on how powerful the transmission belt is between academia and the rest of the country. On college campuses, violence and brute force are silencing speakers who challenge left-wing campus orthodoxies. These totalitarian outbreaks have been met with listless denunciations by college presidents, followed by . . . virtually nothing. As of mid-May, the only discipline imposed for 2017’s mass attacks on free speech at UC Berkeley, Middlebury, and Claremont McKenna College was a letter of reprimand inserted—sometimes only temporarily—into the files of several dozen Middlebury students, accompanied by a brief period of probation. Previous outbreaks of narcissistic incivility, such as the screaming-girl fit at Yale and the assaults on attendees of Yale’s Buckley program, were discreetly ignored by college administrators.
Meanwhile, the professoriate unapologetically defends censorship and violence. After the February 1 riot in Berkeley to prevent Milo Yiannopoulos from speaking, Déborah Blocker, associate professor of French at UC Berkeley, praised the rioters. They were “very well-organized and very efficient,” Blocker reported admiringly to her fellow professors. “They attacked property but they attacked it very sparingly, destroying just enough University property to obtain the cancellation order for the MY event and making sure no one in the crowd got hurt” (emphasis in original). (In fact, perceived Milo and Donald Trump supporters were sucker-punched and maced; businesses downtown were torched and vandalized.) New York University’s vice provost for faculty, arts, humanities, and diversity, Ulrich Baer, displayed Orwellian logic by claiming in a New York Times op-ed that shutting down speech “should be understood as an attempt to ensure the conditions of free speech for a greater group of people.”
Will non-academic institutions take up this zeal for outright censorship? Other ideological products of the left-wing academy have been fully absorbed and operationalized. Racial victimology, which drives much of the campus censorship, is now standard in government and business. Corporate diversity trainers counsel that bias is responsible for any lack of proportional racial representation in the corporate ranks. Racial disparities in school discipline and incarceration are universally attributed to racism rather than to behavior. Public figures have lost jobs for violating politically correct taboos.
Yet Americans possess an instinctive commitment to the First Amendment. Federal judges, hardly an extension of the Federalist Society, have overwhelmingly struck down campus speech codes. It is hard to imagine that they would be any more tolerant of the hate-speech legislation so prevalent in Europe. So the question becomes: At what point does the pressure to conform to the elite worldview curtail freedom of thought and expression, even without explicit bans on speech?
Social stigma against conservative viewpoints is not the same as actual censorship. But the line can blur. The Obama administration used regulatory power to impose a behavioral conformity on public and private entities. School administrators may have technically still possessed the right to dissent from novel theories of gender, but they had to behave as if they were fully on board with the transgender revolution when it came to allowing boys to use girls’ bathrooms and locker rooms.
Had Hillary Clinton been elected president, the federal bureaucracy would have mimicked campus diversocrats with even greater zeal. That threat, at least, has been avoided. Heresies against left-wing dogma may still enter the public arena, if only by the back door. The mainstream media have lurched even further left in the Trump era, but the conservative media, however mocked and marginalized, are expanding (though Twitter and Facebook’s censorship of conservative speakers could be a harbinger of more official silencing).
Outside the academy, free speech is still legally protected, but its exercise requires ever greater determination.
Heather Mac Donald is a fellow at the Manhattan Institute and the author of The War on Cops.
John McWhorter
There is a certain mendacity, as Brick put it in Cat on a Hot Tin Roof, in our discussion of free speech on college campuses. Namely, none of us genuinely wish that absolutely all issues be aired in the name of education and open-mindedness. To insist so is to pretend that civilized humanity makes nothing we could call advancement in philosophical consensus.
I doubt we need “free speech” on issues such as whether slavery and genocide are okay, whether it has been a mistake to view women as men’s equals, or whether we were wrong to banish as antique the idea that whites are a master race while other peoples represent a lower rung on the Darwinian scale. With all due reverence for John Stuart Mill’s advocacy of the regular airing of even noxious views in order to reinforce clarity on why they were rejected, we are also human beings with limited time. A commitment to the Enlightenment justifiably will decree that certain views are, indeed, no longer in need of discussion.
However, our modern social-justice warriors are claiming that this no-fly zone of discussion is vaster than any conception of logic or morality justifies. We are being told that questions regarding the modern proposals about cultural appropriation, about whether even passing infelicitous statements constitute racism in the way that formalized segregation and racist disparagement did, or about whether social disparities can be due to cultural legacies rather than structural impediments, are as indisputably egregious, backwards, and abusive as the benighted views of the increasingly distant past.
That is, the new idea is not only that discrimination and inequality still exist, but that to even question the left’s utopian expectation on such matters justifies the same furious, sloganistic and even physically violent resistance that was once levelled against those designated heretics by a Christian hegemony.
Of course the protesters in question do not recognize themselves in a portrait as opponents of something called heresy. They suppose that Galileo’s opponents were clearly wrong but that they, today, are actually correct in a way that no intellectual or moral argument could coherently deny.
As such, we have students permitted to decree college campuses “racist” when they are the least racist spaces on the planet—because they are, predictably given the imperfection of humans, not perfectly free of passingly unsavory interactions. Thinkers from the right rather than the left, invited to talk for a portion of an hour, have dinner with a few people, and fly home, are treated as if they were reanimated Hitlers. The student of color who hears a few white students venture polite questions about the leftist orthodoxy is supported in fashioning those questions as “racist” rhetoric.
The people on college campuses who openly and aggressively spout this new version of Christian (or even Islamist) crusading—ironically justifying it as a barricade against “fascist” muzzling of freedom when the term applies ominously well to the regime they are fostering—are a minority. However, the spinning sawmill blade of their rhetoric has succeeded in rendering opposition as risky as espousing pedophilia, such that only those natively open to violent criticism dare speak out. The latter group is small. The campus consensus thereby becomes, if only at moralistic gunpoint à la the ISIS victim video, a strangled hard-leftism.
Hence freedom of speech is indeed threatened on today’s college campuses. I have lost count of how many of my students, despite being liberal Democrats (many of whom sobbed at Hillary Clinton’s loss last November), have told me that they are afraid to express their opinions about issues that matter, despite the fact that their opinions are ones that any liberal or even leftist person circa 1960 would have considered perfectly acceptable.
Something has shifted of late, and not in a direction we can legitimately consider forwards.
John McWhorter teaches linguistics, philosophy, and music history at Columbia University and is the author of The Language Hoax, Words on the Move, and Talking Back, Talking Black.
Kate Bachelder Odell
It’s 2021, and Harvard Square has devolved into riots: Some 120 people are injured in protests, and the carnage includes fire-consumed cop cars and smashed-in windows. The police discharge canisters of tear gas, and, after apprehending dozens of protesters, enforce a 1:45 A.M. curfew. Anyone roaming the streets after hours is subject to arrest. About 2,000 National Guardsmen are prepared to intervene. Such violence and disorder are also roiling Berkeley and other elite and educated areas.
Oh, that’s 1970. The details are from the Harvard Crimson’s account of “anti-war” riots that spring. The episode is instructive in considering whether free speech is under threat in the United States. Almost daily, there’s a new YouTube installment of students melting down over viewpoints of speakers invited to one campus or another. Even amid speech threats from government—for example, the IRS’s targeting of political opponents—nothing has captured the public’s attention like the end of free expression at America’s institutions of higher learning.
Yet disruption, confusion, and even violence are not new campus phenomena. And it’s hard to imagine that young adults who deployed brute force in the 1960s and ’70s were deeply committed to the open and peaceful exchange of ideas.
There may also be reason for optimism. The rough-and-tumble on campus in the 1960s and ’70s produced a more even-tempered ’80s and ’90s, and colleges are probably heading for another course correction. In covering the ruckuses at Yale, Missouri, and elsewhere, I’ve talked to professors and students who are figuring out how to respond to the illiberalism, even if the reaction is delayed. The University of Chicago put out a set of free-speech principles last year, and other schools such as Princeton and Purdue have endorsed them.
The NARPs—Non-Athletic Regular People, as they are sometimes known on campus—still outnumber the social-justice warriors, who appear to be overplaying their hand. Case in point is the University of Missouri, which experienced a precipitous drop in enrollment after instructor Melissa Click and her ilk stoked racial tensions last spring. The college has closed dorms and trimmed budgets. Which brings us to another silver lining: The economic model of higher education (exorbitant tuition to pay ever more administrators) may blow up traditional college before the fascists can.
Note also that the anti-speech movement is run by rich kids. A Brookings Institution analysis from earlier this year discovered that “the average enrollee at a college where students have attempted to restrict free speech comes from a family with an annual income $32,000 higher than that of the average student in America.” Few rank higher in average income than those at Middlebury College, where students evicted scholar Charles Murray in a particularly ugly scene. (The report notes that Murray was received respectfully at Saint Louis University, “where the median income of students’ families is half Middlebury’s.”) The impulses of over-adulated 20-year-olds may soon be tempered by the tyranny of having to show up for work on a daily basis.
None of this is to suggest that free speech is enjoying some renaissance either on campus or in America. But perhaps as the late Wall Street Journal editorial-page editor Robert Bartley put it in his valedictory address: “Things could be worse. Indeed, they have been worse.”
Kate Bachelder Odell is an editorial writer for the Wall Street Journal.
Jonathan Rauch
Is free speech under threat? The one-syllable answer is “yes.” The three-syllable answer is: “Yes, of course.” Free speech is always under threat, because it is not only the single most successful social idea in all of human history, it is also the single most counterintuitive. “You mean to say that speech that is offensive, untruthful, malicious, seditious, antisocial, blasphemous, heretical, misguided, or all of the above deserves government protection?” That seemingly bizarre proposition is defensible only on the grounds that the marketplace of ideas turns out to be the most powerful engine of knowledge, prosperity, liberty, social peace, and moral advancement that our species has had the good fortune to discover.
Every new generation of free-speech advocates will need to get up every morning and re-explain the case for free speech and open inquiry—today, tomorrow, and forever. That is our lot in life, and we just need to be cheerful about it. At discouraging moments, it is helpful to remember that the country has made great strides toward free speech since 1798, when the Adams administration arrested and jailed its political critics; and since the 1920s, when the U.S. government banned and burned James Joyce’s great novel Ulysses; and since 1954, when the government banned ONE, a pioneering gay journal. (The cover article was a critique of the government’s indecency censors, who censored it.) None of those things could happen today.
I suppose, then, the interesting question is: What kind of threat is free speech under today? In the present age, direct censorship by government bodies is rare. Instead, two more subtle challenges hold sway, especially, although not only, on college campuses. The first is a version of what I called, in my book Kindly Inquisitors, the humanitarian challenge: the idea that speech that is hateful or hurtful (in someone’s estimation) causes pain and thus violates others’ rights, much as physical violence does. The other is a version of what I called the egalitarian challenge: the idea that speech that denigrates minorities (again, in someone’s estimation) perpetuates social inequality and oppression and thus also is a rights violation. Both arguments call upon administrators and other bureaucrats to defend human rights by regulating speech rights.
Both doctrines are flawed to the core. Censorship harms minorities by enforcing conformity and entrenching majority power, and it no more ameliorates hatred and injustice than smashing thermometers ameliorates global warming. If unwelcome words are the equivalent of bludgeons or bullets, then the free exchange of criticism—science, in other words—is a crime. I could go on, but suffice it to say that the current challenges are new variations on ancient themes—and they will be followed, in decades and centuries to come, by many, many other variations. Memo to free-speech advocates: Our work is never done, but the really amazing thing, given the proposition we are tasked to defend, is how well we are doing.
Jonathan Rauch is a senior fellow at the Brookings Institution and the author of Kindly Inquisitors: The New Attacks on Free Thought.
Nicholas Quinn Rosenkranz
Speech is under threat on American campuses as never before. Censorship in various forms is on the rise. And this year, the threat to free speech on campus took an even darker turn, toward actual violence. The prospect of Milo Yiannopoulos speaking at Berkeley provoked riots that caused more than $100,000 worth of property damage on the campus. The prospect of Charles Murray speaking at Middlebury led to a riot that put a liberal professor in the hospital with a concussion. Ann Coulter’s speech at Berkeley was cancelled after the university determined that none of the appropriate venues could be protected from “known security threats” on the date in question.
The free-speech crisis on campus is caused, at least in part, by a more insidious campus pathology: the almost complete lack of intellectual diversity on elite university faculties. At Yale, for example, the number of registered Republicans in the economics department is zero; in the psychology department, there is one. Overall, there are 4,410 faculty members at Yale, and the total number of those who donated to a Republican candidate during the 2016 primaries was three.
So when today’s students purport to feel “unsafe” at the mere prospect of a conservative speaker on campus, it may be easy to mock them as “delicate snowflakes,” but in one sense, their reaction is understandable: If students are shocked at the prospect of a Republican behind a university podium, perhaps it is because many of them have never before laid eyes on one.
To see the connection between free speech and intellectual diversity, consider the recent commencement speech of Harvard President Drew Gilpin Faust:
Universities must be places open to the kind of debate that can change ideas. . . . Silencing ideas or basking in intellectual orthodoxy independent of facts and evidence impedes our access to new and better ideas, and it inhibits a full and considered rejection of bad ones. . . . We must work to ensure that universities do not become bubbles isolated from the concerns and discourse of the society that surrounds them. Universities must model a commitment to the notion that truth cannot simply be claimed, but must be established—established through reasoned argument, assessment, and even sometimes uncomfortable challenges that provide the foundation for truth.
Faust is exactly right. But, alas, her commencement audience might be forgiven a certain skepticism. After all, the number of registered Republicans in several departments at Harvard—e.g., history and psychology—is exactly zero. In those departments, the professors themselves may be “basking in intellectual orthodoxy” without ever facing “uncomfortable challenges.” This may help explain why some students will do everything in their power to keep conservative speakers off campus: They notice that faculty hiring committees seem to do exactly the same thing.
In short, it is a promising sign that true liberal academics like Faust have started speaking eloquently about the crucial importance of civil, reasoned disagreement. But they will be more convincing on this point when they hire a few colleagues with whom they actually disagree.
Nicholas Quinn Rosenkranz is a professor of law at Georgetown. He serves on the executive committee of Heterodox Academy, which he co-founded, on the board of directors of the Federalist Society, and on the board of directors of the Foundation for Individual Rights in Education (FIRE).
Ben Shapiro
In February, I spoke at California State University in Los Angeles. Before my arrival, professors informed students that a white supremacist would be descending on the school to preach hate; threats of violence soon prompted the administration to cancel the event. I vowed to show up anyway. One hour before the event, the administration backed down and promised to guarantee that the event could go forward, but police officers were told not to stop the 300 students, faculty, and outside protesters who blocked and assaulted those who attempted to attend the lecture. We ended up trapped in the auditorium, with the authorities telling students not to leave for fear of physical violence. I was rushed from campus under armed police guard.
Is free speech under assault?
Of course it is.
On campus, free speech is under assault thanks to a perverse ideology of intersectionality that claims victim identity is of primary value and that views are merely a secondary concern. As a corollary, if your views offend someone who outranks you on the intersectional hierarchy, your views are treated as violence—threats to identity itself. On campus, statements that offend an individual’s identity have been treated as “microaggressions”: actual aggressions against another, ostensibly worthy of violence. Words, students have been told, may not break bones, but they will prompt sticks and stones, and rightly so.
Thus, protesters around the country—leftists who see verbiage as violence—have, in turn, used violence in response to ideas they hate. Leftist local authorities then use the threat of violence as an excuse to ideologically discriminate against conservatives. This means public intellectuals like Charles Murray being run off of campus and his leftist professorial cohort viciously assaulted; it means Ann Coulter being targeted for violence at Berkeley; it means universities preemptively banning me and Ayaan Hirsi Ali and Condoleezza Rice and even Jason Riley.
The campus attacks on free speech are merely the most extreme iteration of an ideology that spans from left to right: the notion that your right to free speech ends where my feelings begin. Even Democrats who say that Ann Coulter should be allowed to speak at Berkeley say that nobody should be allowed to contribute to a super PAC (unless you’re a union member, naturally).
Meanwhile, on the right, the president’s attacks on the press have convinced many Republicans that restrictions on the press wouldn’t be altogether bad. A Vanity Fair/60 Minutes poll in late April found that 36 percent of Americans thought freedom of the press “does more harm than good.” Undoubtedly, some of that is due to the media’s obvious bias. CNN’s Jeff Zucker has targeted the Trump administration for supposedly quashing journalism, but he was silent when the Obama administration’s Department of Justice cracked down on reporters from the Associated Press and Fox News, and when hacks like Deputy National Security Adviser Ben Rhodes openly sold lies regarding Iran. But for some on the right, the response to press falsities hasn’t been to call for truth, but to instead echo Trumpian falsehoods in the hopes of damaging the media. Free speech is only important when people seek the truth. Leftists traded truth for tribalism long ago; in response, many on the right seem willing to do the same. Until we return to a common standard under which facts matter, free speech will continue to rest on tenuous grounds.
Ben Shapiro is the editor in chief of The Daily Wire and the host of The Ben Shapiro Show.
Judith Shulevitz
It’s tempting to blame college and university administrators for the decline of free speech in America, and for years I did just that. If the guardians of higher education won’t inculcate the habits of mind required for serious thinking, I thought, who will? The unfettered but civil exchange of ideas is the basic operation of education, just as addition is the basic operation of arithmetic. And universities have to teach both the unfettered part and the civil part, because arguing in a respectful manner isn’t something anyone does instinctively.
So why change my mind now? Schools still cling to speech codes, and there still aren’t enough deans like the one at the University of Chicago who declared his school a safe-space-free zone. My alma mater just handed out prizes for “enhancing race and/or ethnic relations” to two students caught on video harassing the dean of their residential college, one screaming at him that he’d created “a space for violence to happen,” the other placing his face inches away from the dean’s and demanding, “Look at me.” All this because they deemed a thoughtful if ill-timed letter about Halloween costumes written by the dean’s wife to be an act of racist aggression. Yale should discipline students who behave like that, even if they’re right on the merits (I don’t think they were, but that’s not the point). They certainly don’t deserve awards. I can’t believe I had to write that sentence.
But in abdicating their responsibilities, the universities have enabled something even worse than an attack on free speech. They’ve unleashed an assault on themselves. There’s plenty of free speech around; we know that because so much bad speech—low-minded nonsense—tests our constitutional tolerance daily, and that’s holding up pretty well. (As Nicholas Lemann observes elsewhere in this symposium, Facebook and Google represent bigger threats to free speech than students and administrators.) What’s endangered is good speech.
Universities were setting themselves up to be used. Provocateurs exploit the atmosphere on campus to goad overwrought students, then gleefully trash the most important bastion of our crumbling civil society. Higher education and everything it stands for—logical argument, the scientific method, epistemological rigor—start to look illegitimate. Voters perceive tenure and research and higher education itself as hopelessly partisan and unworthy of taxpayers’ money.
The press is a secondary victim of this process of delegitimization. If serious inquiry can be waved off as ideology, then facts won’t be facts and reporting can’t be trusted. All journalism will be equal to all other journalism, and all journalists will be reduced to pests you can slam to the ground with near impunity. Politicians will be able to say anything and do just about anything and there will be no countervailing authority to challenge them. I’m pretty sure that that way lies Putinism and Erdoganism. And when we get to that point, I’m going to start worrying about free speech again.
Judith Shulevitz is a critic in New York.
Harvey Silverglate
Free speech is, and has always been, threatened. The title of Nat Hentoff’s book Free Speech for Me—But Not for Thee is no less true today than at any time, even as the Supreme Court has accorded free speech a more absolute degree of protection than in any previous era.
Since the 1980s, the high court has decided most major free-speech cases in favor of speech, with most of the major decisions being unanimous or nearly so.
Women’s-rights advocates were turned back by the high court in 1986 when they sought to ban the sale of printed materials that, because deemed pornographic by some, were alleged to promote violence against women. Censorship in the name of gender-based protection thus failed to gain traction.
Despite the demands of civil-rights activists, the Supreme Court in 1992 declared cross-burning to be a protected form of expression in R.A.V. v. City of St. Paul, a decision later refined to strengthen a narrow exception for when cross-burning occurs primarily as a physical threat rather than merely an expression of hatred.
Other attempts at First Amendment circumvention have been met with equally decisive rebuff. When the Reverend Jerry Falwell sued Hustler magazine publisher Larry Flynt for defamation growing out of a parody depicting Falwell’s first sexual encounter as a drunken tryst with his mother in an outhouse, a unanimous Supreme Court lectured on the history of parody as a constitutionally protected, even if cruel, form of social and political criticism.
When the South Boston Allied War Veterans, sponsor of Boston’s Saint Patrick’s Day parade, sought to exclude a gay veterans’ group from marching under its own banner, the high court unanimously held that as a private entity, even though marching in public streets, the Veterans could exclude any group marching under a banner conflicting with the parade’s socially conservative message, notwithstanding public-accommodations laws. The gay group could have its own parade but could not rain on that of the conservatives.
Despite such legal clarity, today’s most potent attacks on speech are coming, ironically, from liberal-arts colleges. Ubiquitous “speech codes” limit speech that might insult, embarrass, or “harass,” in particular, members of “historically disadvantaged” groups. “Safe spaces” and “trigger warnings” protect purportedly vulnerable students from hearing words and ideas they might find upsetting. Student demonstrators and threats of violence have forced the cancellation of controversial speakers, left and right.
It remains unclear how much campus censorship results from politically correct faculty, control-obsessed student-life administrators, or students socialized and indoctrinated into intolerance. My experience suggests that the bureaucrats are primarily, although not entirely, to blame. When sued, colleges either lose or settle, pay a modest amount, and then return to their censorious ways.
This trend threatens the heart and soul of liberal education. Eventually it could infect the entire society as these students graduate and assume influential positions. Whether a resulting flood of censorship ultimately overcomes legal protections and weakens democracy remains to be seen.
Harvey Silverglate, a Boston-based lawyer and writer, is the co-author of The Shadow University: The Betrayal of Liberty on America’s Campuses (Free Press, 1998). He co-founded the Foundation for Individual Rights in Education in 1999 and is on FIRE’s board of directors. He spent some three decades on the board of the ACLU of Massachusetts, two of those years as chairman. Silverglate taught at Harvard Law School for a semester during a sabbatical he took in the mid-1980s.
Christina Hoff Sommers
When Heather Mac Donald’s “blue lives matter” talk was shut down by a mob at Claremont McKenna College, the president of neighboring Pomona College sent out an email defending free speech. Twenty-five students shot back a response: “Heather Mac Donald is a fascist, a white supremacist . . . classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live.”
Some blame the new campus intolerance on hypersensitive, over-trophied millennials. But the students who signed that letter don’t appear to be fragile. Nor do those who recently shut down lectures at Berkeley, Middlebury, DePaul, and Cal State LA. What they are is impassioned. And their passion is driven by a theory known as intersectionality.
Intersectionality is the source of the new preoccupation with microaggressions, cultural appropriation, and privilege-checking. It’s the reason more than 200 colleges and universities have set up Bias Response Teams. Students who overhear potentially “otherizing” comments or jokes are encouraged to make anonymous reports to their campus BRTs. A growing number of professors and administrators have built their careers around intersectionality. What is it exactly?
Intersectionality is a neo-Marxist doctrine that views racism, sexism, ableism, heterosexism, and all forms of “oppression” as interconnected and mutually reinforcing. Together these “isms” form a complex arrangement of advantages and burdens. A white woman is disadvantaged by her gender but advantaged by her race. A Latino is burdened by his ethnicity but privileged by his gender. According to intersectionality, American society is a “matrix of domination,” with affluent white males in control. Not only do they enjoy most of the advantages, they also determine what counts as “truth” and “knowledge.”
But marginalized identities are not without resources. According to one of intersectionality’s leading theorists, Patricia Hill Collins (former president of the American Sociological Association), disadvantaged groups have access to deeper, more liberating truths. To find their voice, and to enlighten others to the true nature of reality, they require a safe space—free of microaggressive put-downs and imperious cultural appropriations. Here they may speak openly about their “lived experience.” Lived experience, according to intersectional theory, is a better guide to the truth than self-serving Western and masculine styles of thinking. So don’t try to refute intersectionality with logic or evidence: That only proves that you are part of the problem it seeks to overcome.
How could comfortably ensconced college students be open to a convoluted theory that describes their world as a matrix of misery? Don’t they flinch when they hear intersectional scholars like bell hooks refer to the U.S. as an “imperialist, white-supremacist, capitalist patriarchy”? Most take it in stride because such views are now commonplace in high-school history and social studies texts. And the idea that knowledge comes from lived experience rather than painstaking study and argument is catnip to many undergrads.
Silencing speech and forbidding debate is not an unfortunate by-product of intersectionality—it is a primary goal. How else do you dismantle a lethal system of oppression? As the protesting students at Claremont McKenna explained in their letter: “Free speech . . . has given those who seek to perpetuate systems of domination a platform to project their bigotry.” To the student activists, thinkers like Heather Mac Donald and Charles Murray are agents of the dominant narrative, and their speech is “a form of violence.”
It is hard to know how our institutions of higher learning will find their way back to academic freedom, open inquiry, and mutual understanding. But as long as intersectional theory goes unchallenged, campus fanaticism will intensify.
Christina Hoff Sommers is a resident scholar at the American Enterprise Institute. She is the author of several books, including Who Stole Feminism? and The War Against Boys. She also hosts The Factual Feminist, a video blog. @Chsommers
John Stossel
Yes, some college students do insane things. Some called police when they saw “Trump 2016” chalked on sidewalks. The vandals at Berkeley and the thugs who assaulted Charles Murray are disgusting. But they are a minority. And these days people fight back.
Someone usually videotapes the craziness. Yale’s “Halloween costume incident” drove away two sensible instructors, but videos mocking Yale’s snowflakes, like “Silence U,” make such abuse less likely. Groups like Young America’s Foundation (YAF) publicize censorship, and the Foundation for Individual Rights in Education (FIRE) sues schools that restrict speech.
Consciousness has been raised. On campus, the worst is over. Free speech has always been fragile. I once took cameras to Seton Hall law school right after a professor gave a lecture on free speech. Students seemed to get the concept. Sean, now a lawyer, said, “Protect freedom for thought we hate; otherwise you never have a society where ideas clash, and we come up with the best idea.” So I asked, “Should there be any limits?” Students listed “fighting words,” “shouting fire in a theater,” malicious libel, etc.—reasonable court-approved exceptions. But then they went further. Several wanted bans on “hate” speech. “No value comes out of hate speech,” said Javier. “It inevitably leads to violence.”
No it doesn’t, I argued. “Also, doesn’t hate speech bring ideas into the open, so you can better argue about them, bringing you to the truth?”
“No,” replied Floyd. “With hate speech, more speech is just violence.”
So I pulled out a big copy of the First Amendment and wrote, “exception: hate speech.”
Two students wanted a ban on flag desecration “to respect those who died to protect it.”
One wanted bans on blasphemy:
“Look at the gravity of the harm versus the value in blasphemy—the harm outweighs the value.”
Several wanted a ban on political speech by corporations because of “the potential for large corporations to improperly influence politicians.”
Finally, Jillian, also now a lawyer, wanted hunting videos banned.
“It encourages harm down the road.”
I asked her, incredulously, “You’re comfortable locking up people who make a hunting film?”
“Oh, yeah,” she said. “It’s unnecessary cruelty to feeling and sentient beings.”
So, I picked up my copy of the Bill of Rights again. After “no law . . . abridging freedom of speech,” I added: “Except hate speech, flag burning, blasphemy, corporate political speech, depictions of hunting . . . ”
That embarrassed them. “We may have gone too far,” said Sean. Others agreed. One said, “Cross out the exceptions.” Free speech survived, but it was a close call. Respect for unpleasant speech will always be thin. Then-Senator Hillary Clinton wanted violent video games banned. John McCain and Russ Feingold tried to ban political speech. Donald Trump wants new libel laws, and if you burn a flag, he tweeted, consequences might be “loss of citizenship or a year in jail!” Courts or popular opinion killed those bad ideas.
Free speech will survive, assuming those of us who appreciate it use it to fight those who would smother it.
John Stossel is a FOX News/FOX Business Network Contributor.
Warren Treadgold
Even citizens of dictatorships are free to praise the regime and to talk about the weather. The only speech likely to be threatened anywhere is the sort that offends an important and intolerant group. What is new in America today is a leftist ideology that threatens speech precisely because it offends certain important and intolerant groups: feminists and supposedly oppressed minorities.
So far this new ideology is clearly dominant only in colleges and universities, where it has become so strong that most controversies concern outside speakers invited by students, not faculty speakers or speakers invited by administrators. Most academic administrators and professors are either leftists or have learned not to oppose leftism; otherwise they would probably never have been hired. Administrators treat even violent leftist protestors with respect and are ready to prevent conservative and moderate outsiders from speaking rather than provoke protests. Most professors who defend conservative or moderate speakers argue that the speakers’ views are indeed noxious but say that students should be exposed to them to learn how to refute them. This is very different from encouraging a free exchange of ideas.
Although the new ideology began on campuses in the ’60s, it gained authority outside them largely by means of several majority decisions of the Supreme Court, from Roe (1973) to Obergefell (2015). The Supreme Court decisions that endanger free speech are based on a presumed consensus of enlightened opinion that certain rights favored by activists have the same legitimacy as rights explicitly guaranteed by the Constitution—or even more legitimacy, because the rights favored by activists are assumed to be so fundamental that they need no grounding in specific constitutional language. The Court majorities found restricting abortion rights or homosexual marriage, as large numbers of Americans wish to do, to be constitutionally equivalent to restricting black voting rights or interracial marriage. Any denial of such equivalence therefore opposes fundamental constitutional rights and can be considered hate speech, advocating psychological and possibly physical harm to groups like women seeking abortions or homosexuals seeking approval. Such speech may still be constitutionally protected, but acting upon it is not.
This ideology of forbidding allegedly offensive speech has spread to most of the Democratic Party and the progressive movement. Rather than seeing themselves as taking one side in a free debate, progressives increasingly argue (for example) that opposing abortion is offensive to women and supporting the police is offensive to blacks. Some politicians object so strongly to such speech that despite their interest in winning votes, they attack voters who disagree with them as racists or sexists. Expressing views that allegedly discriminate against women, blacks, homosexuals, and various other minorities can now be grounds for a lawsuit.
Speech that supposedly offends women or minorities has already cost some people their careers, their businesses, and their opportunities to deliver or hear speeches. Such intimidation is the intended result of an ideology that threatens free speech.
Warren Treadgold is a professor of history at Saint Louis University.
Matt Welch
Like a sullen zoo elephant rocking back and forth from leg to leg, there is an oversized paradox we’d prefer not to see standing smack in the sightlines of most of our policy debates. Day by day, even minute by minute, America simultaneously gets less free in the laboratory, but more free in the field. Individuals are constantly expanding the limits and applications of their own autonomy, even as government transcends prior restraints on how far it can reach into our intimate business.
So it is that the Internal Revenue Service can charge foreign banks with collecting taxes on U.S. citizens (therefore causing global financial institutions to shun many of the estimated 6 million-plus Americans who live abroad), even while blockchain virtuosos make illegal transactions wholly undetectable to authorities. It has never been easier for Americans to travel abroad, and it’s never been harder to enter the U.S. without showing passports, fingerprints, retinal scans, and even social-media passwords.
What’s true for banking and tourism is doubly true for free speech. Social media has given everyone not just a platform but a megaphone (as unreadable as our Facebook timelines have all become since last November). At the same time, the federal government during this unhappy 21st century has continuously ratcheted up prosecutorial pressure against leakers, whistleblowers, investigative reporters, and technology companies.
A hopeful bulwark against government encroachment unique to the free-speech field is the Supreme Court’s very strong First Amendment jurisprudence in the past decade or two. Donald Trump, like Hillary Clinton before him, may prattle on about locking up flag-burners, but Antonin Scalia and the rest of SCOTUS protected such expression back in 1990. Barack Obama and John McCain (and Hillary Clinton—she’s as bad as any recent national politician on free speech) may lament the Citizens United decision, but it’s now firmly legal to broadcast unfriendly documentaries about politicians without fear of punishment, no matter the electoral calendar.
But in this very strength lies what might be the First Amendment’s most worrying vulnerability. Barry Friedman, in his 2009 book The Will of the People, made the persuasive argument that the Supreme Court typically ratifies, post facto, where public opinion has already shifted. Today’s culture of free speech could be tomorrow’s legal framework. If so, we’re in trouble.
For evidence of free-speech slippage, just read around you. When both major-party presidential nominees react to terrorist attacks by calling to shut down corners of the Internet, and when their respective supporters are actually debating the propriety of sucker punching protesters they disagree with, it’s hard to escape the conclusion that our increasingly shrill partisan sorting is turning the very foundation of post-1800 global prosperity into just another club to be swung in our national street fight.
In the eternal cat-and-mouse game between private initiative and government control, the former is always advantaged by the latter’s fundamental incompetence. But what if the public willingly hands government the power to muzzle? It may take a counter-cultural reformation to protect this most noble of American experiments.
Matt Welch is the editor at large of Reason.
Adam J. White
Free speech is indeed under threat on our university campuses, but the threat did not begin there and it will not end there. Rather, the campus free-speech crisis is a particularly visible symptom of a much more fundamental crisis in American culture.
The problem is not that some students, teachers, and administrators reject traditional American values and institutions, or even that they are willing to menace or censor others who defend those values and institutions. Such critics have always existed, and they can be expected to use the tools and weapons at their disposal. The problem is that our country seems to produce too few students, teachers, and administrators who are willing or able to respond to them.
American families produce children who arrive on campus unprepared for, or uninterested in, defending our values and institutions. For our students who are focused primarily on their career prospects (if on anything at all), “[c]ollege is just one step on the continual stairway of advancement,” as David Brooks observed 16 years ago. “They’re not trying to buck the system; they’re trying to climb it, and they are streamlined for ascent. Hence they are not a disputatious group.”
Meanwhile, parents bear incomprehensible financial burdens to get their kids through college, without a clear sense of precisely what their kids will get out of these institutions in terms of character formation or civic virtue. With so much money at stake, few can afford for their kids to pursue more than career prospects.
Those problems are not created on campus, but they are exacerbated there, as too few college professors and administrators see their institutions as cultivators of American culture and republicanism. Confronted with activists’ rage, they offer no competing vision of higher education—let alone a compelling one.
Ironically, we might borrow a solution from the Left. Where progressives would leverage state power in service of their health-care agenda, we could do the same for education. State legislatures and governors, recognizing the present crisis, should begin to reform and renegotiate the fundamental nature of state universities. By making state universities more affordable, more productive, and more reflective of mainstream American values, they will attract students—and create incentives for competing private universities to follow suit.
Let’s hope they do it soon, for what’s at stake is much more than just free speech on campus, or even free speech writ large. In our time, as in Tocqueville’s, “the instruction of the people powerfully contributes to the support of a democratic republic,” especially “where instruction which awakens the understanding is not separated from moral education which amends the heart.” We need our colleges to cultivate—not cut down—civic virtue and our capacity for self-government. “Republican government presupposes the existence of these qualities in a higher degree than any other form,” Madison wrote in Federalist 55. If “there is not sufficient virtue among men for self-government,” then “nothing less than the chains of despotism” can restrain us “from destroying and devouring one another.”
Adam J. White is a research fellow at the Hoover Institution.
Cathy Young
A writer gets expelled from the World Science Fiction Convention for criticizing the sci-fi community’s preoccupation with racial and gender “inclusivity” while moderating a panel. An assault on free speech, or an exercise of free association? How about when students demand the disinvitation of a speaker—or disrupt the speech? When a critic of feminism gets banned from a social-media platform for unspecified “abuse”?
Such questions are at the heart of many recent free-speech controversies. There is no censorship by government; but how concerned should we be when private actors effectively suppress unpopular speech? Even in the freest society, some speech will—and should—be considered odious and banished to unsavory fringes. No one weeps for ostracized Holocaust deniers or pedophilia apologists.
But shunned speech needs to remain a narrow exception—or acceptable speech will inexorably shrink. As current Federal Communications Commission chairman Ajit Pai cautioned last year, First Amendment protections will be hollowed out unless undergirded by cultural values that support a free marketplace of ideas.
Sometimes, attacks on speech come from the right. In 2003, an Iraq War critic, reporter Chris Hedges, was silenced at Rockford College in Illinois by hecklers who unplugged the microphone and rushed the stage; some conservative pundits defended this as robust protest. Yet the current climate on the left—in universities, on social media, in “progressive” journalism, in intellectual circles—is particularly hostile to free expression. The identity-politics left, fixated on subtle oppressions embedded in everyday attitudes and language, sees speech-policing as the solution.
Is hostility to free-speech values on the rise? New York magazine columnist Jesse Singal argues that support for restrictions on public speech offensive to minorities has remained steady, and fairly high, since the 1970s. Perhaps. But the range of what qualifies as offensive—and which groups are to be shielded—has expanded dramatically. In our time, a leading liberal magazine, the New Republic, can defend calls to destroy a painting of lynching victim Emmett Till because the artist is white and guilty of “cultural appropriation,” and a feminist academic journal can be bullied into apologizing for an article on transgender issues that dares to mention “male genitalia.”
There is also a distinct trend of “bad” speech being squelched by coercion, not just disapproval. That includes the incidents at Middlebury College in Vermont and at Claremont McKenna in California, where mobs not only prevented conservative speakers—Charles Murray and Heather Mac Donald—from addressing audiences but physically threatened them as well. It also includes the use of civil-rights legislation to enforce goodthink in the workplace: Businesses may face stiff fines if they don’t force employees to call a “non-binary” co-worker by the singular “they,” even when talking among themselves.
These trends make a mockery of liberalism and enable the kind of backlash we have seen with Donald Trump’s election. But the backlash can bring its own brand of authoritarianism. It’s time to start rebuilding the culture of free speech across political divisions—a project that demands, above all, genuine openness and intellectual consistency. Otherwise it will remain, as the late, great Nat Hentoff put it, a call for “free speech for me, but not for thee.”
Cathy Young is a contributing editor at Reason.
Robert J. Zimmer

Free speech is not a natural feature of human society. Many people are comfortable with free expression for views they agree with but would withhold this privilege from those whose views they deem offensive. People justify such restrictions by various means: appeals to moral certainty, political agendas, demands for change, opposition to change, the retention of power, resistance to authority, or, more recently, the desire not to feel uncomfortable. Moral certainty about one’s views, or a willingness to indulge one’s emotions, makes it easy to assert that others are doing true damage or creating unacceptable offense simply by presenting a fundamentally different perspective.
The resulting challenges to free expression may come in the form of laws, threats, pressure (whether societal, group, or organizational), or self-censorship in the face of a prevailing consensus. Specific forms of challenge may be more or less pronounced as circumstances vary. But the widespread temptation to consider the silencing of “objectionable” viewpoints as acceptable implies that the challenge to free expression is always present.
The United States today is no exception. We benefit from the First Amendment, which asserts that the government shall make no law abridging the freedom of speech. But fostering a society that supports free expression involves matters far beyond the law. The ongoing and increasing demonization of one group by another creates a political and social environment conducive to suppressing speech. Even violent acts opposing speech can come to be accepted or encouraged. Such behavior is evident at both political rallies and university events. Our greatest current threat to free expression is the emergence of a national culture that accepts the legitimacy of suppressing speech deemed objectionable by a segment of the population.
University and college campuses present a particularly vivid instance of this cultural shift. There have been many well-publicized episodes of speakers being disinvited or prevented from speaking because of their views. However, the problem is much deeper, as there is significant self-censorship on many campuses. Both faculty and students sometimes find themselves silenced by social and institutional pressures to conform to “acceptable” views. Ironically, the very mission of universities and colleges to provide a powerful and deeply enriching education for their students demands that they embrace and protect free expression and open discourse. Failing to do so significantly diminishes the quality of the education they provide.
My own institution, the University of Chicago, through the words and actions of its faculty and leaders since its founding, has asserted the importance of free expression and its essential role in embracing intellectual challenge. We continue to do so today as articulated by the Chicago Principles, which strongly affirm that “the University’s fundamental commitment is to the principle that debate or deliberation may not be suppressed because the ideas put forth are thought by some or even by most members of the University community to be offensive, unwise, immoral, or wrong-headed.” It is only in such an environment that universities can fulfill their own highest aspirations and provide leadership by demonstrating the value of free speech within society more broadly. A number of universities have joined us in reinforcing these values. But it remains to be seen whether the faculty and leaders of many institutions will truly stand up for these values, and in doing so provide a model for society as a whole.
Robert J. Zimmer is the president of the University of Chicago.