To commemorate Commentary’s fiftieth anniversary, the editors addressed the following statement and questions to a group of American intellectuals:
In the eyes of many observers, the United States, which in 1945 entered upon the postwar era confident in its democratic purposes and serene in the possession of a common culture, is now, fifty years later, moving toward balkanization or even breakdown. Pointing to different sorts of evidence—multiculturalism and/or racial polarization; the effects of unchecked immigration; increased economic and social stratification; distrust of authority; the dissolution of shared moral and religious values—such observers conclude in their various ways that our national project is unraveling.
Do you agree with this conclusion, in whole or in part? Has your own thinking changed in recent years on the question of the basic stability of American institutions?
We are now in the midst of a conservative resurgence, social and cultural as much as political, which arguably arose in response to the trends described above. In your view, is it making any headway toward arresting or reversing them? How would you assess its promise, in both the near and the longer term?
The responses—seventy-two in all—appear below in alphabetical order.
The test America faces in the coming decades is whether our democratic institutions are powerful enough to resist and reverse the destructive policies still being championed by a strange brew of elites.
Those elites are principally a mixture of liberal/Left politicians, members of the media and the academy, with reinforcements from the liberal churches, black leaders, the American Jewish establishment, and (intermittently) the judiciary. In their long march toward victory in remaking American culture, their successes have been great. The amazing proliferation of quota systems in employment and education, the advent of multiculturalism, and the terrible coarsening of social life in only 30 years all give testimony to what they have wrought. Even now, when it is clear to almost everyone that our welfare system and much of our public-education system are spectacular failures, when soaring illegitimacy and urban crime rates are a widely acknowledged scourge, no serious changes in the policies that contribute to these results have yet been achieved.
Pessimism about American society is therefore understandable. And yet, the other side of the coin is brighter: even after being told repeatedly since the mid-1960’s that their traditional views of American society are ridiculous, repressive, and outmoded, most Americans still are not persuaded. While religion has for two decades been pushed increasingly to the margins of public and academic life and ridiculed in most of the media, America remains a country where the great majority of citizens attend church and believe in God. While the judiciary has, for the most part, lent itself to the assault on traditional values, Americans still believe that prayer is a good thing and that employment quotas and, for that matter, abortion on demand are bad ones. There is a genuine counterrevolution under way, fueled by the manifest failings of 50 years of liberal rule and by the generalized sense that public policy is weakening rather than strengthening the society.
Two cultures are now struggling for supremacy, but the rival pairs are neither C. P. Snow’s scientific culture vs. literary culture nor the “Anglo” cultural tradition vs. a new multicultural and primarily Hispanic wave. The battle is between those who believe the government must tutor and discipline—must, indeed, reform and civilize—a society that is instinctively nativist, racist, and unenlightened, and those who believe current policies are corrupting the virtues that reside in the American people and the system of limited government under which the nation lived until the New Deal. What is most striking about those who belong to the first group is not their view of the national government as a tool or even their willingness to use government power as an instrument of coercion in their desire to remake American society. More significant is their belief that the underlying society is so terribly flawed as to require this radical reconstruction.
Since the Great Depression and with the enormous prestige of victory in World War II, the party of government has been more or less in charge, controlling Congress and the organs of our culture. It was not seriously tested until 1980, and even so it then survived the Reagan years almost intact. President Reagan refused to challenge it in critical areas—federal spending grew rapidly—and liberal control of Congress continued. The Bush and Clinton administrations saw a return to liberal normalcy, and it was argued that the Reagan elections had been anomalies or merely personal victories for the old actor. It turns out, of course, that the Reagan victories reflected a broad popular revulsion at the liberal critique of America and the attitudes and program associated with it. This is why George Bush won when he appeared to be Reagan III but lost when he seemed to desert the cause, and why every incumbent defeated in 1994 was a Democrat.
The reason for optimism is that 50 years of lecturing by their supposed betters have not persuaded the American people that the eternal verities of yesteryear—family, work, and faith, and the greatness of America—are instruments of repression. There is a new consensus forming that recognizes how much is lost when the government subsidizes illegitimacy, restricts religious activity, promotes radical views of male-female relationships, divides citizens according to race, or vigorously attempts to undermine the sense of a common history and culture. Both the size and power of government, and the ends it so often seeks, now meet resistance and criticism unthinkable 25 years ago.
What we are seeing is an end to the disjunction between the citizen acting as voter and the citizen acting in private life. As parents or children, neighbors or colleagues, employers or employees, Americans never lost respect for the “old” virtues. If the citizen as voter cast his ballot for liberal candidates, it was to deploy a safety net, expand opportunity, or fight injustice. But liberal government grew beyond these limited goals decades ago, and the gap between the virtues the citizen celebrated in private life and the goals he supported with his vote began to grow. Now it is too large, and the voter is using his ballot to insist that government reinforce rather than subvert the virtues he cherishes in private life.
Regaining lost ground will be most difficult, for it is much easier to damage society than to repair it. Even the strongest consensus in society and the largest majority in Washington cannot quickly fix broken families or schools, reduce urban crime, or lower the abortion rate, when it took decades to break down the restraints and undermine the social and moral standards that once prevented the spread of those pathologies. There will be many more Bill Clintons: candidates who prove that La Rochefoucauld was right in calling hypocrisy the tribute vice pays to virtue. But now that American society has begun to reassert its belief in the existence of vice and virtue alike, and in the worth of its own values, traditions, and achievements, there is reason to believe that the prospect for the country is a good one.
Let me warn the reader not to expect much in the way of prescience. I have a miserable record in foreseeing the future even in areas where I would seem to have the necessary knowledge and experience. One example is my myopia about the emergence of the drug culture, though I was in a sense present at the creation.
I had been friends with Timothy Leary in graduate school and had kept in touch with him in the years that followed. We met at a psychological convention in the early 1960’s and in the course of a long conversation I learned that he was planning to go to Mexico to find and sample mushrooms which could induce hallucinations. He had experimented with them and felt they might be the path to spiritual salvation. Even then he imagined recruiting others to their use, filling their unmet religious needs through the communion with higher powers that these substances could provide. I nodded indulgently, smiled intently—another of Tim’s enthusiasms, I thought, here today and gone tomorrow. I did not for a moment imagine that we were on the threshold of a leap into the widespread use of hallucinogens and other drugs.
Nor did I anticipate any of the other changes which were to come later in that decade. My academic specialty was adolescence, yet I did not foresee the arrival of the youth culture. I had taught at Bennington College, which attracted just those well-to-do, socially conscious, politically sophisticated young women who would a few years later help launch modern feminism, yet I could not at the time conceive that they would ever venture from their roles as wives and mothers, very narrowly defined.
Throughout this period I did intensive psychotherapy with clients who were to be the seedbed of the counterculture, yet I had not a clue that it would soon develop. Despite a lifelong association with higher education, I did not sense the imminent capture of the American university by the Left, even as it took place before my eyes; I preferred to believe that the faculties of science and engineering and medicine and business would resist the seizure and despoliation of the campus. And despite the fact that I wrote on and around the topic, I did not foresee the continuing decline of the nuclear family, or the growth of fatherlessness that David Blankenhorn has recently discussed, or the deficiencies in impulse control and superego function which would ensue.
You will note in these examples a perhaps incurable optimism; more precisely, an inability to imagine poor outcomes. The error that troubles me most was my misreading of the racial division in our country. The passage of the 1964 Civil Rights Act led me to believe the country would soon settle into an acceptable degree of racial harmony. The racial turbulence that followed, as in Watts, was, I thought, a temporary matter: we would move ahead to the assimilation of blacks in a typically American way. I had in mind the pattern followed by earlier immigrant groups, above all, the Italians and Jews of my own childhood. The path would not necessarily be smooth—it had not been for us—but it would be steady. I simply could not imagine otherwise.
The first extra funds I had earned in my life were invested in a project for interracial housing. For reasons never quite made clear, it did not prosper, and was absorbed, at a considerable loss, into a more conventional real-estate investment trust. That incident now strikes me as emblematic of the belief that, given money, opportunity, and good will, most social problems could be corrected. It is emblematic, too, uncomfortably so, in the picture it provides of the consummate liberal gesture, of benevolence de haut en bas.
Both the belief and the gesture address themselves to traditional racism—hating or despising or fearing blacks and other disadvantaged groups. Our current dilemma involves instead a tangle of deceptions and self-deceptions which corrupt public discourse.
Picture the scene: a group of university professors so committed to rigid intellectual honesty that they would not hesitate to destroy anyone misreporting or even substantially shading research data. They are gathered to choose the very best among the applicants to their highly selective graduate program. The files before them are a treasure trove of potential academic stars—GRE scores in the 98th percentile and above, grade-point averages near perfection, ga-ga letters of recommendation. Disputes arise about the niceties—yes, that 3.9 is a good GPA, but what about the B+ in organic chemistry?
Somewhat late in the day, the committee finds itself examining a group of applicants with passing but mediocre test scores and grades somewhat better than mediocre but far from outstanding, though with equally euphoric letters. They choose the best among these, though in only one or two cases is the best even close to those chosen initially. There is nevertheless a certain jauntiness in this discussion, a Panglossian happiness, shall we say, and reassurances are given that all those chosen will do quite well. But all will not do well. A few will, some will founder and disappear, and others will merely get by.
Most American institutions now have in place a system which violates the legitimate interests of the majority. Because it is immoral, we proclaim on all possible occasions that it is moral indeed. That was the theme of the latest public address given by my university’s president, a man who was once an able professor of engineering and who has now become a speaker of just such platitudes. My president does not take the next step, though others do: if racial preferences are quintessentially moral, any effort against them is eo ipso evil, hence allowing any and all forms of retribution.
If you think this far-fetched, consider the following, reported in an article by Harold Johnson in National Review. A Democratic political leader says he is looking into the private lives of the two academics who wrote California’s proposed amendment against racial and other preferences. He will find out if they have cheated on taxes, touched students inappropriately, and the like. “These two professors may have white shirts on now, but by the time we’re done with them, they’ll be pretty dirtied up.” An utterly shameful statement, uttered shamelessly, but when racial virtues must be protected, so it goes.
Robert L. Bartley
In taking the temper of the current American prospect, my starting point is the observation that U.S. political history is marked by critical elections, broad and deep changes in partisan alignments that set the political tone for a generation. The last clear realigning election was in 1932, and since the 1960’s we have been overdue for another. The missing beat in the cycle led many students of this phenomenon to conclude that realigning elections are a thing of the past, swept away by the media age.
The leading student of these cycles, Walter Dean Burnham, once toyed with the “dealignment” idea, but adopted a new position after the 1994 “earthquake”: this, he concluded, was the realigning election. Whatever the name or even the fortune of the 1996 GOP presidential nominee, “the shape of American politics will very probably never be the same again.”
In short, the Gingrich Republicans, or at least the ideas they represent, will grow in strength and dominate the first quarter of the dawning century. We can expect to implement much of the Contract With America, and indeed some even more sweeping changes Republicans were too timid to include in that document. Certainly we can expect a smaller, less intrusive, and more decentralized government. We should expect a federal budget balanced by some definition or another.
But at the same time, we should also expect tax reduction—most likely, I think, the “flat tax” that would cap high marginal rates and exempt investment income. Combined with legislation sponsored by Senator Connie Mack to direct the Federal Reserve to take responsibility solely for the price level, this would lead to a resurgence of economic growth. And growth will in turn renew the traditional “can-do” American optimism.
With such a psychological change, many other things would start to look possible. The reform of a parasitical tort-liability system, for openers. A voucher system for Medicare, for example, likely to start in embryonic form this year. And, over time, some form of voucher/privatization for Social Security itself, and certainly for education—currently the biggest blight on the American prospect.
In foreign policy, we could certainly expect a missile-defense system to allow us freedom of action against minor bullies. And unmatched stealth aircraft for projection of power. These capabilities would help us implement a policy of aggressive unilateralism, what Teddy Roosevelt called the big stick. This will not mean military intervention in every corner of the globe, simply because, with an increasingly confident America, that will not be necessary.
That all of these things will happen is only my opinion, of course, but to steal from Alexander Woollcott, you didn’t ask for my hair clippings. While the real world never precisely acts out a thousand-word blueprint, I definitely believe the above sketch captures what will prove to be the direction of events. Indeed, the trends now culminating have been simmering just below the surface for a generation.
Since at least Vietnam, our elites have been sick. But the body politic has been sound, and is now reasserting its authority. Academics and journalists are the last to get the word. Finding they no longer hold moral sway, but still controlling society’s megaphones, the outgoing elites have to yell louder and louder. The sounds you hear are not the unraveling of society but the protests of an establishment being displaced.
My expectations are not based merely on deterministic cycles but on a reading of a broader change in the temper of the American public. The elections are merely the expression of an underlying change that taps the moral force that reaches back to the founding of this city on a hill. We are undergoing what the Founding Fathers would recognize as a Great Awakening. To be sure, it comes in a form shaped, or perhaps diluted, by a far more secular and pluralistic society. It does have a specifically religious dimension in the Christian Coalition, but also includes the nonsectarian revivalism of the Promise Keepers as well as many purely secular vibrations. What is being questioned is not the efficiency of the New Deal or the competence of the establishment, but the moral legitimacy of both.
Now, the Jewish community has historical reasons to be uncomfortable when moralism surges. In the Old World there was the Inquisition, and a kind of pagan moralism led to the Holocaust. The evangelical Protestant moralism that has dominated the American awakenings helped free the slaves, but also gave us the “noble experiment” of Prohibition. While political figures such as Pat Buchanan seem to be echoing a nativism that was historically associated with bigotry, the dominant strain of the religious Right is by now more sophisticated and worldly.
In any event, the prevailing danger in a media-drenched age is not too much moralism but too little. The change in public temper has come so quickly that even the most trendy have been caught off base. Suddenly society is telling Calvin Klein to clean up his act. Time Warner is now embarrassed by gangsta rap. The sexual habits of politicians have become decisive, at least when the victims are congressional aides rather than trailer-park women in the South. Yale finds its alumni not only asking questions but in one case yanking back a $20-million gift.
While we have been habituated to look to government, this change in public temper is more important than anything that happens inside the Beltway. Its effects will be pervasive and far-reaching. Already, for reasons the experts find baffling, the crime rate is plunging.
So in assessing the American future, I extrapolate the trends that have developed in the last few years. Hillary Clinton’s health-care task force was the last gasp of a dying era, and a new one is being born. The new trend is to empower the individual and hold him or her responsible; it fits both the American tradition and the new reality of an electronic and interdependent age.
Take all the caveats and expect the usual quota of disappointments, but in the main the national prospect looks brighter than it has for a generation.
On the surface, two of the three questions we are asked by the editors to discuss seem a little absurd: is our national project unraveling, and what about the basic stability of American institutions?
After all, the U.S. and its allies did win the cold war and, more recently, the Gulf war. Russia is a storm-tossed fragment of its former self. The threat of nuclear war has been removed. In fact, the threat of global war, with alliance pitted against alliance, can be said to have disappeared. Most of the world’s would-be immigrants want to come to the U.S. The world, more and more, is moving toward democracy, says Samuel Huntington, in “an almost irresistible global tide.” Socialism is bye-bye. Competitive economies are in.
As for the question concerning the conservative resurgence, Irving Kristol told us in 1993 that “liberalism today is at the end of its intellectual tether.” He was right, as usual. I would say that Bill Clinton, like Jimmy Carter before him, is already a lame-duck President.
Even more unhappily for the declinist school, for the Marcusean doomsayers, and for the anti-American American Left, the U.S., according to Michael Boskin, “remains the world’s largest, richest, and most productive economy.” With less than 5 percent of the world’s population, the U.S. produces about a quarter of the world’s total output of goods and services. The American standard of living exceeds that of any other industrialized country. The U.S. is not deindustrializing, nor is it losing its overall competitive edge.
Yet something is bugging us. It is the horrible things that are being said, written, published, intoned daily and hourly about this country. As Meg Greenfield of the Washington Post has written:
A Martian reading about [the U.S. as depicted in the media] might in fact suppose America to be composed entirely of abused minorities living in squalid and sadistically run mental hospitals, except for a small elite of venal businessmen and county commissioners who are profiting from the unfortunates’ misery.
A quarter-century ago, the South African novelist Alan Paton spoke moving words to America at a Harvard commencement exercise:
Your tribulations are known to the whole world. Some of us in the outside world derive satisfaction from them. . . . It is foolish of us to gloat when you appear to fail to solve them, for are we any better, any worse than you? Therefore you must regard yourselves as the testing ground of the world, and of the human race. If you fail, it will not be America that fails, but all of us.
So generous an appraisal of the national project is not to be found among the liberal Left. Here is Martin Walker, at present the Washington correspondent of the British daily, the Guardian, in a recent book:
The similarities between Moscow in the early 1980’s and Washington in the early 1990’s became eerily acute to one who had lived through both. The contrast between the former Soviet Union’s release of its prisoners and the way that the USA had over one million of its citizens incarcerated, summoned the bizarre, dismaying thought of an American gulag.
You really have to loathe our huge, blundering democracy to talk about an “American gulag”—or perhaps you just need to be a member of the marxisant Left. And that raises a puzzle. The great Polish émigré philosopher Leszek Kolakowski once said that Marxism has been “the greatest fantasy of our century.” On the one hand, the authority Marxism commanded for a century and a half is gone. Without it, there is no epistemological foundation for the hostile, extirpationist analysis of capitalism, no longer any “scientific” alternative structure. The Open Society, with no scientific pretensions whatever, has triumphed over its enemies.
So why, on the other hand, is so much Marxism and Left liberalism still to be found in the halls of academe? Why, when anti-Americanism has diminished in most of the world, and Marxism has become a relic of a miserable past, is there so much academic anti-Americanism here at home? Why is the United States the world capital of political correctness, affirmative action, job quotas, multiculturalism, gay rights, radical feminism, Afrocentrism?
A close observer of the American scene, John Gray of Oxford University, has marveled at this phenomenon—and at the fact that while in most parts of the world liberalism, having lost its moral cachet, has moved to the Center and even to the Right, American liberalism is moving further and further to the Left. In an article in National Review, Gray asks:
What is it in American culture that renders it uniquely vulnerable to such pathologies? Are we to suppose that the unparalleled strength of these radical movements in America is merely accidental? Or does the fact that America must now have the most leftist political—and popular—culture on earth call for an explanation?
Gray’s explanation is this: for the American Left, America is not a nation
grounded in the contingencies of language and cultural affinity, but an ideological construction whose identity derives from universalist principles. For [the Left] America is not a nation but a civil religion, and loyalty to it is a matter not of sentiment but of ideological commitment. It is only to be expected that attachment to America as a civil religion should come to express itself as hatred of the values and institutions that are most definitive of America as an historic nationality.
I suppose I could be taxed for not writing about such problems as drugs, homosexuality, pornography, crime, jobs, racism, homelessness, AIDS, Oklahoma City, welfare, the deficit, and all the domestic difficulties which, to some people, look as if they are going to overwhelm this country and destroy the national project. My response is simple: the American people—black and white—will deal with these problems as they have dealt with other problems in the past. Our national project is not unraveling, and the country is not fragmenting. It is the liberal Left that is unraveling—its stridency is the best sign of that—and that is a good thing, too.
William J. Bennett
There is no longer a serious question about whether much of our national project is unraveling. There is, in fact, overwhelming evidence that it is. Commentary readers are by now familiar with the social pathologies which have become a (seemingly) permanent feature of late-20th-century America. There is no need here to present all the empirical evidence.
Suffice it to say that we have experienced an astonishing degree of social regression. At the midpoint of this century, America was the preeminent military and moral power in the world. At the close of this American Century, the United States is still the undisputed military leader. But morally, the past 50 years have been a very steep slide. America now finds itself at or near the top of the industrialized world in rates of murder, rape, drug use, divorce, abortion, child abuse, and births to unwed mothers. Our elementary- and secondary-education system often places us at the bottom of the industrialized world. Much of our popular culture is vulgar, violent, mindless, and perverse. Many of our character-forming institutions are enfeebled. More difficult to quantify but no less real is the coarseness and incivility of much of the public square. All of these things, together, have shattered America’s traditional confidence about itself, its mission, its place in the world.
Our social crisis is most often discussed with reference to, and focus on, the problems of the underclass. It is true enough that our modern-day tangle of pathologies is concentrated in urban centers and inner cities. That is where the fire burns hottest, where the pathologies are most obvious, most intense, most intractable. But there is trouble in River City, Main Street, and in the Hamptons, too. And while the problems there are somewhat different in nature (e.g., prolific divorce instead of illegitimacy), they pose no less a threat to the nation’s long-term prospects. A free society depends ultimately on the beliefs, behavior, and standards of the average citizen. What makes our situation today different from previous periods in American history—and fundamentally more serious—is the “de-moralization” of much of middle- and upper-middle-class life.
The causes are varied and complicated (my list would include, but not be limited to, modernity itself, affluence, spiritual acedia, intellectual trends, movies and television, advertising, and flawed government programs). But we are reluctant to admit that much of what has gone wrong has not been done to us; we have done it to ourselves. It is self-delusion to think that the American people have been unwittingly and reluctantly drawn into a culture of permissiveness.
My former teacher, John Silber, used to speak of an “invitation to mutual corruption.” We moderns are hesitant to impose upon ourselves a common moral code because we want our own exemptions. “If it feels good, do it” has a wider appeal than we like to admit. The result is that large segments of America are characterized by moral confusion, indolence, indifference, distraction, self-pity, self-absorption.
Writing just a few years before Commentary published its first issue, the Scottish author and statesman John Buchan (an ardent Zionist, by the way, though also not free of the social prejudices of his day) described a “nightmare world” in which
everyone would have leisure. But everyone would be restless, for there would be no spiritual discipline in life. . . . It would be a feverish, bustling world, self-satisfied and yet malcontent, and under the mask of a riotous life there would be death at the heart. In the perpetual hurry of life there would be no chance of quiet for the soul.
In too many places, for too many people, Buchan’s nightmare world is reality.
I suspect that many Americans know it—and they are increasingly troubled and angered by what has occurred. All of which leads to some good news. Cultural issues now have a central place in our national-political conversation. We are achieving a bipartisan consensus on social issues which were once considered divisive. (For example, President Clinton and Secretary Donna Shalala of Health and Human Services have both said that having a child out of wedlock is morally wrong.) We see a renewed interest in character education. Religious and civic movements like the Promise Keepers, the National Fatherhood Initiative, and Best Friends are encouraging signs. And of course there are the results of the 1994 elections, which put (at least for the moment) an electoral stake through the heart of contemporary liberalism. What we are seeing, I think, are social antibodies reacting against a 30-year cultural virus.
An analogy, borrowed from a previous job I held in government, helps make the broader point. The recognition of a drug problem is the first step toward an addict’s recovery. But much more is required. The addict still needs to act. This requires a willingness to change and to persevere.
Together, we need to correct a mistake in philosophy. Many of us act as if we have reduced the entire Declaration of Independence to a single phrase, “the pursuit of happiness.” We would do well to refamiliarize ourselves with some of its other concepts—to take just two, divine providence and our sacred honor.
Rebuilding the national project depends, finally, on individual citizens living better and more decent lives. It does not require sainthood, moral perfection, or even moral excellence among citizens. It does require that we take seriously what too many Americans have come to neglect: our basic commitments as parents, spouses, neighbors, citizens, and people of faith. To accomplish these things, it would be no small help, as Aleksandr Solzhenitsyn and others have urged us, to remember God.
We were asked to reflect on the question of whether (to speak plainly) the country is going to the dogs. Those who think it is can point to the crime rate, the seemingly intractable drug and racial problems, illegitimacy and the collapse of the family, the disintegration of the party system, or to the fact that our leading national newspaper editorializes “in praise of the counterculture,” or that, on the occasion of his death, “we, the people of the United States” celebrated the life of a guitar-strumming heroin addict; some even find reason to be concerned about the condition of the economy. Forty-two years a professor but now safely and happily retired, I am going to focus on the university. The following anecdote speaks volumes about its political condition.
On June 4, 1990, Mikhail Gorbachev spoke at Stanford University before an audience of professors and their students. In that address, he announced the end of the cold war, then said: “And let us not wrangle over who won it.” Instead of wondering what there was to wrangle, or argue, about—as if there were any doubt about who won the war—that university audience responded with a thunderous, ecstatic roar of approval.
What were these academics cheering, or what did they understand him to be saying? That it would be impolite of Americans to acknowledge that the Western liberal democracies had won the war? That, at that juncture, it did not matter who won it? Or, more likely, that it never mattered who won it? That the cold war was an unnecessary war, a phony war, a war with nothing at stake? In short, were they cheering the fact that here, finally, was a world leader—a Nobel laureate!—with the courage to say publicly what some professors had been saying all along, namely, that between “socialism” and liberal democracy, which they label “bourgeois democracy,” there was, to say the least, nothing to choose?
Whatever accounts for that singular response, Gorbachev surely intended to elicit it, and just as surely he knew he was more likely to elicit it on a university campus than at a meeting of steelworkers in Pittsburgh (to say nothing of a meeting of shipyard workers in Gdansk) or at any gathering of Americans with family ties to Prague, Bucharest, Budapest, Warsaw, or Vilnius. He was fond of quoting the Russian proverb, “the fish rots from the head down,” and he knew that (metaphorically speaking) the head of the American “fish” was in the universities.
It is not that we are ruled by professors rather than by a church, a class, a party, or (pace Coleridge and John Stuart Mill) a “clerisy,” but that we are peculiarly dependent on the university—and not only because we attend it in such great numbers. Our governing principles, formally declared to the world in 1776 and embodied in the Constitution in 1787, are the product of a political science, or a body of philosophic thought, and the university is the home, and in our case the only home, of such thought. We, therefore, depend on it to expound those principles, so that they might be impressed firmly on the minds of our judges, politicians, New York Times editors, and, since ultimately we are a self-governing people, citizens generally. So, in effect, said George Washington in his repeated calls for the establishment of a national university where the youth, from all parts of the country, would receive instruction in “the science of government”; and so said Thomas Jefferson who, in his plans for the University of Virginia, recognized John Locke as the author of “the general principles of liberty and the rights of man,” and required its law school to teach the Declaration of Independence and The Federalist Papers, the best guides to “the distinctive principles of the government of the United States.”
Washington and Jefferson would be hard-pressed to find a faculty to teach that curriculum today. Especially in the prestigious universities, there are more Marxists than Lockeans (not because there are many Marxists, but because there are few Lockeans), more postmodernists than constitutionalists, and more deconstructionists and new-historicists than old-fashioned humanists. Such a faculty is not likely to extol the virtues of our principles of government. Locke (“America’s philosopher”) took his bearings from nature and the rights pertaining to it, but Marx denigrated the idea of nature and spoke contemptuously of the “so-called rights of man.” We declared our independence by appealing to “nature’s God,” but Nietzsche, the granddaddy of postmodernism, proclaimed that “God is dead.” Jefferson said that our “possession of a written Constitution is [our] peculiar security,” a point reiterated by Madison when he said the legitimacy of our government depends on adherence to the Constitution’s written text, but, according to Paul de Man, who brought deconstructionism to America (and eventually to its law schools), “the distinctive curse of all language” is its inability to convey meaning in any objective sense; in effect, there is no meaning in a deconstructed text (or, as Gertrude Stein said of Oakland, California, “there’s no there there”), there is only interpretation.
According to a distinguished professor of constitutional law, this is how the Constitution is understood today in the leading law schools. If there is no “there” in its text, it can be read to say anything—or as he puts it, “there is nothing that is unsayable in the language of the Constitution”—and if it can be read to say anything, there is no Constitution and no possibility of constitutional government. He does not hesitate to draw this conclusion: “The death of ‘constitutionalism’ may be the central event of our time, just as the death of God was that of the past century (and for much the same reason).”
Are we, then, going to the dogs? Perhaps; every regime needs supporters stronger than its opponents (to quote a friend, who, in turn, was quoting Aristotle)—which, in our case, means being able to meet the arguments of the postmodernists. Not only that, but to be a supporter of liberal democracy means to be a supporter of—surprise!—bourgeois democracy, and doing that in a university community carries the risk of being ostracized. Professors do not like to be reminded of their bourgeois roots or, with their jobs secured by academic freedom and tenure and their futures by the world’s largest pension fund, that they must be numbered among the chief beneficiaries of the “system,” as they call it. Thus, as reported in a recent issue of the American Scholar, professors are annoyed when a contentious colleague flies the American flag on patriotic holidays.
Robert H. Bork
That American culture is unraveling and its institutions becoming ever more fragile is so widely accepted that it does not require discussion. It is difficult to think of any area of the culture or any institution—from universities to popular music—that is not in significantly worse shape today than it was in the 1950’s.
The more interesting topic is whether the situation is retrievable, whether the conservative resurgence can arrest or reverse the trends. The prospects for that are dimmed by the realization that what we see today has been coming on for a long time, well before the 1960’s. This is not a sudden development but a continuation of a trend of many years that was temporarily suppressed by the Depression and World War II. The confidence and serenity of 1945 were wholly misplaced, though we did not know it then. Liberalism has always been hostile to constraints on the individual. For a time, that was very beneficial since many constraints increasingly served no legitimate purpose. But in the modern era the constraints under attack are those essential to a stable social and moral order: religion, morality, law, the family, and the lines of authority characteristic of a bourgeois culture.
The weakening of constraints entails the loss of moral consensus and hence the weakening of community. The same disintegrating effect is produced by the passion for equality, which, by 1945, had intensified into egalitarianism, a conspicuous passion of the generation that savaged the universities in the 1960’s and now runs those institutions. That passion produced multiculturalism and affirmative action and a further fracturing of community into hostile groups claiming the individual’s primary allegiance.
Technology exacerbates these developments. It not only makes life easier so that there is more time and energy for entertaining oneself, it also brings entertainment into the home so that no one needs others for recreation. Every new development—for example, the words and pictures of the Internet and the soon-to-be-realized ability to call up digital films on home computers—isolates the individual further and makes less likely a communal agreement on values. The ability to send and receive messages does not create a community. More likely, it creates small, often angry groups who further distance themselves from the larger community.
It would require a very robust optimism to suppose that a conservative revival can deflect us from the path on which we have been traveling for so long. There have been conservative reactions in the past that ended periods of social and cultural dissolution, but dissolution eventually returned, often in more aggravated form. That certainly seems to be true of America, which suggests that if there is a pendulum effect, it serves only to mask temporarily long-term trends that move us always toward moral and cultural divisiveness and weakened institutions.
What, by the way, is the evidence of a conservative resurgence? The evidence consists largely of the 1994 elections and assorted shots taken at Time Warner and Calvin Klein. All to the good, but perhaps not enough to raise hopes very high. Republicans promised major changes, some of which would influence the culture in desirable ways, but, so far, they are having difficulty delivering. Either there are not as many conservatives among Republicans as advertised or some of them have been intimidated by the Democrats’ and the media’s class-warfare demagoguery. Welfare reform and the abolition of affirmative action have not occurred and, as time passes, the prospects for significant change may diminish. We have been through apparent periods of dramatic change before, of course. The Reagan revolution was to be a sharp break with liberal trends, but Reagan lost the Senate in 1986 and he was followed by Bush and Clinton. Today, more is wrong with our culture than was the case in 1980.
The public expresses disgust with mass culture, but it remains popular, and people go on watching, and making profitable, the movies and television they say are debased. Programs like multiculturalism and affirmative action, on the other hand, are products of an elite liberal culture that shows no signs of being affected by conservative political victories. The institutions of that culture, those whose business is the dissemination of ideas, attitudes, and symbols—universities, the press, Hollywood, many mainline churches, public-interest organizations, foundations, much of the federal and state judiciaries—are generally on the cultural and political Left, some overwhelmingly, some only predominantly. They will not change if conservatives control the government. If their performance after Reagan’s election is any guide, they are likely only to intensify their efforts.
If the conservative political revival persists and gathers strength, we may see a permanent and antagonistic standoff between the political nation and the cultural elites. If one or the other is to prevail, however, I would until recently have placed my money on the elite culture. That culture can recruit the brightest and most ambitious of the young because it has the prizes of prestige and influence to confer. It also has the attractive rhetoric of liberty and equality at its command as it produces the programs that dissolve community and moral consensus.
What has given me cause for hope is the rise of an energetic, optimistic, and politically sophisticated religious conservatism, which may be a far more powerful force than mere political conservatism. It can help elect conservatives to national office, as it was instrumental in doing in 1994. But more important, the new religious-conservative movement can alter the culture both by electing local officials and school boards, which have greater effects on culture, and by setting a moral tone capable of overcoming today’s relativism. It may be that we are witnessing a religious revival, another awakening. I hope so, for that will be our best chance to reverse the trends that threaten social chaos.
The American project has been under assault for some time. Meanwhile, for more than two decades, the argument that common values, common ideals, and, yes, a common language are critical components of America’s national culture has been all but absent from mainstream discourse. Thus, a longstanding consensus—grounded in the premise that a nation of immigrants requires a common culture—has virtually been shattered: this is just one of the unhappy realities of the last quarter-century. Insofar as the conservative resurgence is animated, in part, by a quest to affirm ties that bind, the GOP’s 1994 off-year electoral landslide represents a bright spot on the horizon.
True, there is a measure of irony here; most Americans never abandoned their sense that this country’s fortunes turn on the survival of a common culture defined by shared values. But President Nixon had it right when he spoke of traditionalists as a “silent majority.” The dominant voices in American life still belong to the liberal elite; commentators who dwell in the precincts of the Left continue to exercise disproportionate influence over the nature of public discourse. And this element is decidedly hostile to the notion of America as a civilization, and manifestly sympathetic to representations of the United States as a country rife with bigotry and inequity.
The contemporary conservative resurgence can actually be understood as the continuation of a process that began, if not with Richard Nixon’s victories in 1968 and 1972, then certainly with the 1980 triumph of Ronald Reagan. While the issues have changed over the course of the last fifteen years, Reagan’s candidacy was defined by many of the concerns that animate proponents of the Contract With America.
Sadly, the failure of the Bush administration to move with certainty from victory in the international arena—where Reagan had set the stage for global Communism’s demise—to rigorous implementation of a conservative domestic agenda eventually facilitated a political aberration: Bill Clinton.
In a telling comment on the temper of the times, however, even President Clinton, three years ago, deemed it prudent to campaign for the White House as a centrist. Admittedly, Clinton, just after his election, engaged in the most ludicrous celebration of PC “diversity” in recent memory—appointing a cabinet that “looks like America.” But the manner in which he campaigned remains worth recalling.
Clinton’s entire presidency, to be sure, has been informed by the Left-liberal ideology that dominates his party. And the popular energy harnessed by Newt Gingrich in the fall of last year is easily understood as an angry backlash—a backlash provoked, in good measure, by state-sanctioned multiculturalism, the abandonment of merit-based criteria, the use of education as a means of enhancing racial and ethnic self-esteem, the pernicious rhetoric of class warfare, and the unending depiction of America as a bastion of racism, greed, repression, and militarism.
At its essence, the conservative resurgence turns on a determination to stand up to the liberal elites with respect to these very issues—the cluster of policies that threatens the American project. The continuing struggle to dismantle the failed programs of the Great Society, and the attendant refusal to allow charges of “racism” to chill the debate and stymie the conservative resurgence, have already been crowned with significant success. Major changes in welfare regulations will soon be a reality—both in the federal system and in many states. Intergenerational cycles of dependency—“welfare as we know it,” to quote candidate Bill Clinton—will eventually be consigned to the dustbin of history, thanks to Capitol Hill conservatives.
Immigration reform likewise appears inevitable. The need to change the criteria for legal entry in order to end the phenomenon of “chain immigration” from the third world means that the principle of family reunification has to be revisited and redefined. This task cannot be undertaken lightly—the existing system has well-entrenched proponents. Still, immigration reform has earned itself a place on the national agenda—a triumph in itself with respect to securing the American project and ending the national slide toward balkanization.
The same can be said of the battle to ensure the primacy of English in American life. Passage of explicit “official-English” legislation may not be an imminent prospect. But the mere fact that the language question can be discussed in polite company—such discussions are invariably informed by consensus on the failure of bilingual education—marks a major step forward.
Again, however, the tenacity of those who mean to fight change in this realm—especially activists engaged in a conscious campaign to heighten social disharmony—should not be underestimated.
It would also be unwise to ignore the unhappy tendency on the part of some who actually identify with the conservative resurgence to embrace only its fiscal component. Such folk fail to recognize that the conservative social agenda complements the movement’s economic concerns. They fail to see that promoting deregulation and securing the dominance of the free market will have a relatively minimal impact on many of this society’s most debilitating ailments: the demise of the family as a social institution; the rapid growth of an urban underclass permanently incapable of entering the American mainstream; the rise of multilingualism. The last phenomenon, it is well to note, threatens the ability of millions to assimilate and undermines the evolution of a common culture.
The libertarian sensibility—an important factor in the conservative resurgence—does not always acknowledge the centrality of these concerns. But unless social conservatives and fiscal conservatives forge a firm alliance, the ability of the Gingrich revolution to prevent the unraveling of the American project will remain limited. In short, champions of the free market and soldiers in the battle to eradicate affirmative action need to know that they are on the same side.
“The struggle of reason against authority has ended in what appears now to be a decisive and permanent victory for liberty,” wrote the historian J.B. Bury in A History of Freedom of Thought. That was in 1913. Today, in the wake of the collapse of the Soviet Union, not just the U.S. but apparently the entire world is going through a burst of free-market, classical liberal triumphalism very similar to the 1890’s. You have to be uneasily aware that the previous burst ended, totally unexpectedly, in World War I and the terrible first half of the 20th century. And that is even apart from the signs of fraying in the American national project to which the editors allude.
I believe, however, that there is an objective basis for much of this free-market triumphalism in the U.S. All levels of American government now consume well over a third of economic output. The Reagan years contained but did not significantly reverse this government grab. Under Bush, it started up again. So government is now vastly more intrusive, its flaws are much more apparent, and it has many more enemies than before the New Deal, when it consumed—incredibly—not much more than a fortieth of economic output.
I think this means the pendulum will continue to swing against statism for a very long time. Similarly, in Victorian Britain it took virtually the entire 19th century to get the government’s share down from its surprisingly high levels at the end of the Napoleonic war to below a tenth of economic output in 1890. And, just as in Victorian England, there will be decades of reform and decades of reaction. Government has friends as well as enemies. They will fight. But the underlying trend will be clear.
And this will have an effect, not controlling but influential, on the intellectual and cultural superstructure. Simply put, the spontaneous and private will have the intellectual and moral edge over the engineered, public, and politicized.
This perspective also causes me to be somewhat more relaxed about the conservative resurgence represented by the Republican Congress and the Contract With America. I expect it to fail. But I expect it to resurge again. Similarly the Reagan revolution “failed,” but confounded predictions and reinvented itself as the Gingrich revolution. None of this means that key individuals involved are not responsible, and culpable, for progress, or the lack thereof. But they are more ephemeral, transitional figures than they may appear to contemporary observers (or to themselves).
I am not at all relaxed, however, about problems posed to the American national project by what the editors call, quite accurately, “unchecked immigration.” The facts here are compelling. But they are not widely understood because of the romantic haze, intellectual inertia, and downright dishonesty that surrounds the subject.
The 1965 Immigration Act triggered an influx of historically high proportions, particularly compared to current U.S. birth rates. Thus the Census Bureau projects that Americans, left to themselves, are stabilizing their population around 250 to 260 million. But the government is in effect second-guessing them through immigration policy. If present trends continue, the U.S. population will reach 390 million by 2050. More than 130 million will be post-1970 immigrants and their descendants. Because the 1965 Act arbitrarily choked off immigration from Europe, this influx has been almost all from the third world. So by 2050, whites, who were 90 percent of the population as recently as 1960, will be on the verge of becoming a minority.
This is a demographic transformation without precedent in the history of the world. It is incumbent on those who favor it to explain what makes them think it is going to work—and why they want to transform the American nation as it had evolved by 1965.
Because the new arrangements are clearly not working at the moment. The 1990 Census revealed that native-born Americans, both black and white, were fleeing the immigrant-favored areas, where they were being replaced on an almost one-for-one basis by immigrants, and going to entirely separate sections of the country—whites to the white heartland of the Midwest, the Pacific Northwest, and so on; blacks to the black areas of the South, Atlanta, Washington, D.C., and so on.
The country is coming apart ethnically under the impact of the enormous influx. This must ultimately raise what might be called the National Question: is America still that interlacing of ethnicity and culture that we call a nation—and can the American nation-state, the political expression of that nation, survive?
All of the unraveling that the editors instance—multiculturalism, dissolution of shared values, increased stratification—is exacerbated, at the very least, by immigration. This is not to say that immigration necessarily caused these policies, a point immigration enthusiasts invariably miss. “The fault, dear Peter, lies not in our immigrants but in ourselves,” New York Post columnist Maggie Gallagher wrote in what was one of the nicer reactions to my arguments. But here’s the rub: if there is a rainstorm when you have a cold, you stay indoors.
Unless there is another pause for assimilation, as there have been many times in the past, immigration will add to America’s latent sectionalism and ultimately break the country up like the late Roman empire—a crisis as utterly unexpected as World War I by the American political elite, both Left and Right.
Illegal immigration should be ended with a second Operation Wetback, as the Eisenhower administration ended the similar illegal-immigration crisis of the 1950’s: seal the borders, deport the illegals already here. Legal immigration should be halted with a five- or ten-year moratorium: no net immigration, with admissions for hardship cases or needed skills balancing the 200,000 legal residents who leave each year. During that moratorium, there should be a debate in which Americans would be asked what they want—as they have not yet been. Immigration might then be resumed, at moderate levels, with an emphasis on skills, and on evidence of cultural compatibility such as speaking English.
As a contributor and long-time subscriber to COMMENTARY, I may say it is a reproach that this position has been abandoned to presidential candidate Pat Buchanan.
The two related forces threatening the national project are the politics of need and the culture of gratification. They obviously aim to transform an America in which self-reliance and self-restraint were once uncontested rhetorical terms. But left to themselves, the politics of need and the culture of gratification will do more: they will split America into mutually hostile groups and castes.
This effect, always logically possible, is making itself felt in practice. If the purpose of politics is to gratify needs, a coalition of the needy can hang together for a generation. But as the needs of each group push to infinity, even the common purpose of plunder vanishes.
Quotas are the last attempt of the leaders of the raiding party to impose order on their troops. So it is with the passions when they are left free to gratify themselves; hence some feminists have discovered that sexual liberation may not liberate both sexes equally. These forces have been at work for a long time; Franklin D. Roosevelt, the ratifier of the politics of need, died the year COMMENTARY was founded.
But the solvent forces have not been left unchecked. They had to erode a substantial bank of social and moral capital, which they have not entirely done, and their successes have provoked a powerful, if inchoate, popular response. One manifestation of that response is the distrust of the state that Americans called into being to gratify their needs.
This distrust shades into antisocial or paranoid extremes, such as home schooling or the militia movements, but these are not necessarily fatal symptoms: there was a lot of paranoia in the American Revolution, too.
A second manifestation of backlash is the religious revival of the last 25 years, amounting to a Fourth Awakening. Terms like awakening and revival can be deceptive when applied to American religion, since, as Garry Wills has noted, religion in America “does not shift or waver”; only “the attention of its observers does.” Say instead that American religion—mostly, but not exclusively, conservative Protestantism—has again been observed, even in the streets of the secularized city of New York.
If these “reactionary” forces are to amount to anything, they need leadership. Leaders need models of success, examples of greatness that are both practical and inspiring. Over the last few years I have grown more confident that Americans can recover such examples, because I have been reading the history of the revolutionary period. If three million provincials could pull that escapade off, there should be hope for their legatees. Indeed, the task is easier now, since Americans have their history to look back on.
The main barrier between heroic example and modern practice is ignorance. I talked with Newt Gingrich about this shortly after he became Speaker. He said that the reason American intellectuals do not honor George Washington is that they “despise” him. Gingrich has been a professor, so he has first-hand experience. But I wonder if he was not exaggerating.
Some intellectuals despise the Founders; others think they do. For the most part, though, they simply have not heard about them. Once their story is presented in a compelling way, surely it will prevail. Patrick Henry is more interesting than RuPaul.
The great failure of the conservative movement is a failure of imaginative presentation. The movement has produced a glut of media spokesmen and political tacticians. The Left whines that the Right dominates political commentary now, and that is so. The same holds for electoral nuts and bolts: why else did Bill Clinton run as a relatively conservative Democrat? Why is Bob Dole running as a conservative Republican? The Right has also produced a handful of theorists in the last two decades. George Gilder has been the most wide-ranging; in the realm of law, Walter Olson has made a subtle fusion of two unmixable liquids, libertarianism and tradition. But we have had a shortage of bards—whom Plato wrongly dismissed as rhapsodes. William J. Bennett’s Book of Virtues has been the one conservative foray in this direction. Its success—70-plus weeks on the best-seller list—and its limitations—as Digby Anderson has pointed out, it has no ripping yarns—should encourage competition.
In the 1940’s, when some nationalist students in India asked George Bernard Shaw what Indians could do to drive the British out of their country, Shaw said they could do the work of the British better. George Orwell thought this was a frivolous answer, and perhaps for colonial politics at mid-century it was.
But it is the right answer for cultural restoration in début-de-siècle America. Conservatives need to produce fewer policy wonks and pundits, and more hacks. If we produce more hacks, we will produce more poets. The best stories are ours. We have to tell them.
Last spring, William Baker, the president of Channel 13, the New York PBS affiliate, stopped by the editorial board of the Wall Street Journal, where I was then working, to make the case for continued federal funding of public television. At our meeting I made the point that if all PBS programs were as politically unbiased as the MacNeil-Lehrer NewsHour, the corporation probably would not be in such hot water. To illustrate PBS’s tilt, I mentioned a few programs from the leftish Frontline series.
Baker and his program manager could not understand my point. With evident sincerity they said they did not see how anybody could perceive a liberal tilt in these programs. (Well, fish don’t know they’re wet.) Baker added that they had consulted dozens of people about their programming, and none had found anything partisan; those they consulted ranged from the Left all the way over to people on the Right like Steven Rattner.
Now, Steven Rattner is a successful investment banker at Lazard Frères, and an active Democrat, centrist but not remotely conservative. If Steven Rattner is the rightward edge of Baker’s political frame, then Baker is living in a hermetically sealed political culture.
Indeed, we now have two political cultures in this country, one headquartered in Washington that is political and conservative, another headquartered in New York and on university campuses that is cultural and aesthetic and, in a loose sense, liberal. These cultures do not share common assumptions. Each has its own criteria for success, and those criteria are mutually exclusive. Someone who succeeds in William Baker’s cultural establishment of New York cannot expect a warm reception in Congress. Someone like Newt Gingrich who succeeds in the new political establishment in Washington cannot expect favorable treatment from Vanity Fair.
The institutions caught between these two establishments have trouble. Public broadcasting operates in the New York cultural space, but must win funding on Capitol Hill. Conservative intellectuals may win praise from Dick Armey, but they also need to be published in or get reviewed by the New York cultural institutions. To win favor in both cultures is not impossible, but it is very difficult.
Political scientists now talk about the two-humped camel. They mean that the political diagram of Congress is no longer shaped like a bell curve, with a great mass in the center and thinning out as you get to the extremes. Now, looking at the legislatures, we have a valley in the center and two masses of people forming the humps on Right and Left. The Republican party has become more conservative and the Democratic party more liberal.
This same pattern applies to the large numbers of professionals in what has come to be known as the New Class. Even some who now identify themselves as being beyond Left and Right on policy issues are, as a matter of sensibility, aligned with the New York culture. That is to say, they find Newt Gingrich creepy and believe that the typical member of the Christian Coalition is (in the now-notorious words of the Washington Post) poor, uneducated, and easy to command.
This polarization is not necessarily unhealthy. It is the sign of a transitional moment, with 60 years of liberal dominance of both realms teetering. Nonetheless, the situation has its drawbacks. For one thing, there is the loss of civility. In the back of right-wing magazines you can see small ads with headlines like “Offend a Liberal, Wear This T-shirt,” as if offending liberals were some great blow for justice. On the other side, when I moved to Washington and my neighbors found out I was a conservative, a few made unpleasant comments or reacted with uncomprehending disbelief, as if I had invaded from a different civilization.
For another, more important, thing, it cannot help that our political polarization overlaps with America’s racial polarization. Blacks are mostly on the Left, and many of them interpret attacks on liberalism as assaults on black people. But the biggest challenge posed by polarization is that nothing remains uncontroversial. Few behavioral standards are above dispute. Few codes of conduct are so widely accepted that people conform to them reflexively, without having to think. Marriage, how to educate first-graders, even sexual techniques are now subjects of political controversy. The New Left said that the personal is political, and this is true today as well. Morality, as Nietzsche predicted, has become a problem.
The crucial questions now concern the correlation of forces. Will conservatives be able to use the strength of political institutions to turn the culture in a conservative direction? Or will liberals be able to use the strength of the culture to liberalize the polity? Or will the fight go on forever?
As for national collapse, each side loves its enemies too much ever to break apart as a nation. Over the past quarter-century, tension between Left and Right has made America the most dynamic nation on earth. It is a creative conflict. We are not calm, we are not well-mannered, and out of our chaos grow real problems, but we are the nation that sends its vibrations across the globe. Nations with calm political cultures do not do that. On the whole, I would rather live in brawling, buzzing America than in (for example) staid and coherent Germany.
Is our national project unraveling? No, if the words breakdown or balkanization are to be taken literally. The United States will neither collapse nor fragment along ethnic lines. It will remain for quite some time an ongoing social and political system as well as the globe’s only superpower. However, I do believe that our society is experiencing a debilitating cultural collision between its popular grass-roots culture and its celebrity-obsessed, TV-driven, style-setting culture. This is contributing to the continuing political polarization and cultural antagonism.
During much of America’s history, the society’s elitist and prescriptive culture could be labeled a North Sea culture because of its origin, character, and norms. There was, to be sure, a great deal of hypocrisy involved in the elite’s professed commitment to the primacy of Protestant religious-ethical values, in the proclaimed emphasis on self-discipline, tradition, and personal probity—but even hypocrisy is a bow to virtue.
On the grass-roots level, this dominant and style-setting North Sea culture was partaken of by growing numbers of immigrants who were neither Nordic nor Anglo-Saxon: the Slavs, the Jews, the Italians, etc. It served as the standard for imitation and assimilation. The family, the schools, and the churches were the primary instruments for the cultural induction of the new immigrants into a society that was open to change but which was guided, at least formally, by a moral code.
Moreover, the style-setting culture of the established elite did not clash fundamentally with the religious beliefs and social traditions of the non-Nordic masses. Assimilation upward did not require a dramatic rupture in personal values and social conduct. That made it easier to adapt oneself and yet to retain some vague links with the past. Cultural compromise was the socially viable result.
In recent years, the collapse of the Wasp elite and the replacement of the traditional instruments for inculcating values by the TV-Hollywood-Mass-Media cartel have produced in America a new dominant and style-setting culture. It can be called a Mediterranean Sea culture in order to underline its contrast to the North Sea ethic. It stresses self-enjoyment, entertainment, sexual promiscuity, and the almost explicit repudiation of any social norms.
The newly dominant Mediterranean culture collides with the more traditional grass-roots values of America. Yesterday’s cultural compromise is thus being shattered. Controlled by a cartel that is driven exclusively by material self-interest, TV has replaced the schools, churches, and even the family as the principal mechanism for the transmission of values.
In the past, undeniably, the family (especially the rich ones), the churches, and the schools were likewise—as social institutions—motivated to some extent by self-interest. However, they were also the explicit exponents of moral or religious values that were not determined by greed alone. By contrast, today’s purveyors of the style-setting culture are concerned entirely with profit, and huge personal profits at that. They thus cater to, and deliberately exploit, the more perverse human instincts: they compete in the dissemination of cultural pornography, on the principle of Gresham’s Law that bad currency is more appealing than good.
The cartel also quite deliberately promotes the worship of celebrities as a substitute for the role previously played by the established elite and moral leaders. These celebrities, through their highly publicized conduct, by and large foster the values of greed and encourage the illusion of a permissive cornucopia as the ideal definition of social reality. To make matters worse, some of our top political leaders are happy to play the role of supporting cast in this demoralizing social deception.
The impact of all this on the latest immigrants is not very likely to promote a cohesive society of shared values. In that respect, increasing multiculturalism—since it is not being subjected to a binding ethical code—might become quite disruptive, making a viable cultural compromise less attainable.
Has my thinking regarding the stability of American institutions changed in recent years? No, since I do think that they are basically stable, though some of them are stalemated and even discredited; that deplorable condition can last a long time.
In my last book, Out of Control, I listed twenty economic, social, and cultural challenges to contemporary America. The last third of the twenty are cultural-moral in character, and it will take the longest time to mitigate or correct them, in part because of the collapse of the cultural compromise. That is why it is likely that America will experience a prolonged period of philosophical and political confusion.
Despite my very real concerns, however, I do not accept an apocalyptic vision as the inevitable future for the United States. America is already showing an impressive capacity to respond to its foreign economic rivals, and with better political leadership it could continue to exercise effective global leadership. Indeed, even a reversal of America’s cultural-moral degeneration is feasible, especially if the potential for such a reversal were to be intelligently tapped.
Is the conservative resurgence arresting or reversing some of the negative trends? Yes, but so far only in part, and in many respects in a rather simplistic and confused fashion. Certainly, the emotionally charged rhetoric of the so-called Christian Right and the extremist manifestations in the Republican party can hardly be considered relevant guides for the future. (In that respect, the extreme Right and the liberal Left—with the latter’s worship of social deviance—are progressively marginalizing themselves.)
The instinctive repudiation by America’s grass roots of the TV-Hollywood-Mass-Media cartel is so far being expressed less through intellectual or philosophical argumentation than through challenges to the material interests of the cartel’s moguls. The beginning of such a counterattack against these interests may be seen in the defeat of the otherwise culturally innocuous (and vacuous) Disneyland in Virginia by a coalition of the old elite and the grass roots. That coalition’s success might serve as a precedent for what more generally could—and should—happen to the advertising sponsors of cultural profanity, especially as the public becomes more generally aroused. (The outrage provoked by the tawdry Calvin Klein ads is another case in point.)
In time, we may even see a renewal of shared ethical consensus in American society, a renewal driven more broadly by such new social or philosophical concerns as the ecological dimensions of human survival or the meaning of life in the scientific age. That would make America morally a more appealing and socially a more cohesive society.
William F. Buckley, Jr.
In reading the galleys of Richard Gid Powers’s Not Without Honor: The History of American Anticommunism, I was reminded that in 1978 George F. Kennan lowered the defenses of the United States. He spoke to those high tables where the elite foregather to pool their most recent moral conclusions and said that he no longer believed that the United States had anything to teach the Soviet Union. This was an early venture in equivalence—What is the difference, after all, between them and us? In America we were making no progress in removing our slums, eliminating poverty, containing pornography, restoring civility, nurturing the environment, reducing crime, raising the level of literacy. Under the circumstances, Kennan wanted to know, why do U.S. leaders speak condescendingly to their counterparts in Moscow? What are the differences between them and us?
Let us finesse the temptation to rub the nose of a gifted and industrious scholar in the differences between Soviet life as described by, say, Aleksandr Solzhenitsyn and life elsewhere. Instead, the mood (my mood, perhaps COMMENTARY’s also) is to focus on Kennan’s indictment and wonder why he had not simply confined himself to criticizing America (leaving the Soviet Union out of it). For he was right, in 1978, that there were manifest shortcomings in our society, and since then they have grown graver, in the light of what appears to be comparative immobility. Worse, the graphs since then point, mostly, in the wrong direction. Crime is up since Kennan spoke. So is illiteracy. Pornography is available with the touch of a remote-control unit. And the illegitimacy rate, in which so many other concerns are subsumed, is in virtual free fall.
The editors of COMMENTARY wonder whether these problems foretell the end of the American dream or whether, given America’s penchant to succeed, we can anticipate a happy, or at least bearable, ending to it all. Yes, it is a season of foreboding, even if we agree to nod our heads in hazy acquiescence when the Gipper addresses us with his wonderful buoyancies and transfigures the data with his magic words and irrepressible thoughts.
To every one of our problems there are approaches that commend themselves to non-ideological thinking. Education?—Encourage the voucher-system substitute. Crime?—More police, stricter courts. Pornography?—Revisit the First Amendment on the shoulders of those who devised it, in place of those who have misinterpreted it. And so on.
But although Charles Murray has advanced one approach to the problem of illegitimate births, no one is truly convinced that an end to the provision of welfare would bring on anything like an end to unguarded sexual promiscuity. Something more is required. We need to accost not transient folkways but institutional mores. It is one thing to require a student to put on a tie, another to civilize him.
I wonder whether we suffer from a failure to exploit those aspects of our social arrangements that we mostly tend to hide, or to ameliorate, or to ignore. The gravamen of the case against America made by Left critics has to do with the sharp edges of life in a free society. What about those who do not learn to read and write? What about those who are abused as children? Those who drift toward, and are caught by, drug addiction?
Maybe the cost of life in a free society should not be made more agreeable for those who fail to accept socialization, but less so. To say it in so many words, what is gained by the public indifference to the serial marriages of Elizabeth Taylor? Is there not a point at which society should turn against those who mock marriage? What favor do we confer on the student who refuses to learn to read by letting him, assuming he does not suffer from congenital disability, continue in illiteracy? Why are we so resolute in seeking to “understand” those whose behavior is antisocial, whether by disrupting classrooms or giving no thought to the creation of a child to whom no attention will be given after insemination? Or mocking the responsibilities of marriage? Or giving way to booze and drugs?
It is a democratic habit to resist any crystallization of status. Does it really make sense to avoid any public distinction of the citizen who accepts responsibilities in contrast to the citizen who does not? The upwardly mobile society properly rejects strata but only when they are the creatures of prejudice. A free society needs to be hospitable to virtue but should be inhospitable to dereliction.
In fact there really are what one might loosely call first-class citizens, and there are second-class citizens. In a book published a few years ago (Gratitude: Reflections on What We Owe to Our Country), I advanced universal voluntary service as a corporate ideal, a corporate objective. After their year of service, the young men and women who gave their time would be distinguished in formal circumstances from those who did not. First-class citizens and second-class citizens. Not a scarlet letter—these are indelible. Some might wait until they were sixty to remit their debt to their country by giving a year of their time to the care of the old, or to environmental enterprises, or to teaching, or maintaining the peace. But for as long as they put it off, they would be second-class citizens.
It is for another essay to devise suitable rewards and acceptable tribulations. Let it rest that as long as the behavior of Elizabeth Taylor merely amuses, we are unable—or unwilling—to generate stigma. And without the capacity to stigmatize, a society loses its capacity to exert pressure for reform. People no longer (in general) inveigh against Jews or segregate blacks because they know now that it is wrong to do so, and those who behave wrongly in these matters will suffer palpable consequences for their misbehavior.
A second-class citizen fathers a child he proceeds to ignore. That is the central, the overwhelming, problem. Briefly: if the birth rate of single-parent children were reduced to the level of 1965, what problems would we be left with? I mean, problems about which we could not, with modest confidence, reassure George Kennan?
The most striking development in American political life since 1945 has been the growth of congressional activity. Today Congress spends more money than ever, passes lengthier laws, hires more staff, and has vastly increased the cost of running the institution. It would seem that the conservative Republican takeover of Congress, with an agenda of smaller, more limited government, could not have arrived at a better time.
In the next few years we can certainly expect the new Republican majorities in Congress to make some impact on controlling our metastasizing government. But conservative partisans delude themselves if they believe that merely a cutback in federal spending will be sufficient to revive Congress as a revered institution of our government. Despite all the “less-government” boilerplate generated by the 104th Congress, our chief problem with the institution is not the raw size of the bureaucracy it endows (although that cannot be easily ignored). The problem with Congress is its persistent inability to act in a limited way. The unfocused conduct and boundless scope of its legislative activity have contributed, more than anything else, to the breakdown in our national conversation about how a democracy should best deal with its problems.
Look, for example, at the core issues that have dominated public concern for more than a quarter-century: crime, poverty, and the economy. In each case, there is broad public support to halt or even reverse decades of liberal congressional activism. Yet few in Washington seem content to slow the congressional pulse. Instead, they try to institute a newly activist conservative agenda, which, though preferable, is no less a threat to genuinely limited government.
Let’s begin with crime. Although the federal government can exercise very little influence on urban street crime, that has not prevented the Congress from considering a massive omnibus crime bill every other year. For some time, John J. DiIulio, Jr. of Princeton has persuasively argued that the single most important crime-control measure Congress could pass would be one that stopped federal judges from interfering in state and local prisons, forcing corrections officials to release repeat offenders. This eminently sensible step has won broad support among Republican and Democratic prosecutors across the country. Yet the Congress seems incapable of simply passing this limited, focused reform and moving on. Instead, it must add layers of less urgent legislation (a federal provision for car-jacking, for example, or another aimed at “crimes against women”). The final package quickly becomes a massive, incoherent bill that has little to do with the actual problems we confront.
We see the same tendency in welfare policy. Earlier this year, one Senator proposed ending the entitlement status of the country’s major welfare programs and returning control of the programs to the states. Initial legislation to accomplish this straightforward goal ran to some 25 pages. Yet before long, the simplicity of the proposal (which nevertheless provoked howls of opposition from activist groups) was overrun by more ambitious efforts to create an immense overhaul of welfare, touching everything from immigration policy to “elder care.” When finally introduced, the leading Republican welfare bill weighed in at more than 800 pages.
On the economic-policy front, meanwhile, ever more demonstrative acts of legislative prowess have become the standard. For better or worse, the Republicans in Congress have committed themselves to balance the federal budget in seven years. (One GOP presidential candidate even promises to step down if he fails to bring the deficit to zero in four years.) Make no mistake: this reform is an immense congressional undertaking. While there is certainly political support for it, its immediate effect has been to forestall other incremental steps (a cut in the capital-gains tax, a less rigid IRA system, a repeal of the Clinton tax hikes) that might actually be better for the economy, even if they failed to advance the cause of balanced budgeting.
Even the current tax-reform debate is now mired in competing utopian visions, one side pushing to rewrite the entire tax code, the other suggesting that the Sixteenth Amendment itself be repealed. There is no patience, it seems, for moving ahead on just a handful of discrete (albeit less sexy) measures that would spur economic growth and reduce government control of the economy. The congressional impulse is strictly for reform on the grandest scale.
Other examples abound. During last year’s health-care debate, the so-called conservative, free-market alternatives to the Clinton health-care reform were no less sweeping in their attempts to remake American medical practice. This year Congress has spent months debating a telecommunications bill unprecedented in its scope; the Senate version, should anyone bother to read it, runs to 289,000 words. Even the exuberant rhetoric of the current Republican leaders suggests a vision of Congress with an unlimited appetite for action. Newt Gingrich, the most voluble and dynamic Republican leader in 50 years, speaks of nothing less than “renewing American civilization.”
Some may argue that with a Congress so calcified after 40 years of Democratic rule, nothing but bold strokes will do. Yet a sharp retreat from liberal ideology should not require a series of omnibus bills from the Right. What is needed in its place is what William Kristol once termed “principled incrementalism.” The boldness that would best distinguish a truly conservative era would be the boldness of restraint: a Congress that did fewer things and focused its attention more narrowly.
At the start of the Reagan era fifteen years ago, the late Irving Younger offered in these pages a somewhat whimsical set of rules which, if acted upon, would have fundamentally changed the character of our legislative process and restored a sense of seriousness to American law-making (“Socrates and Us,” December 1980). Among his suggestions were that no bill could become law unless the members who voted for it had actually read it. Another was that no bill could become law unless it was written in language comprehensible to ordinary Americans. A third suggestion was that legislation must have a clear and achievable purpose.
Today, at the start of the post-Reagan era, there is more wisdom than whimsy in those suggestions. Perhaps they should form the basis of a truly reformed, limited-government Congress that would once again make the art of deliberation and reflection into the hallmark of our democracy.
Of all the intractable problems facing America—crime, illegitimacy, a breakdown of community—the issue that provokes the most apocalyptic warnings is, oddly, immigration. Across the political and philosophical spectrum, from Barbara Jordan to Pat Buchanan, from James Fallows to Peter Brimelow, the alarm goes out: we are being inundated by a flood of nonwhite immigrants who will transform America into, in Brimelow’s memorable phrase, an alien nation.
On the liberal Left, even the establishment Atlantic Monthly has grown hysterical. The tag line of a cover story in the magazine last December by Matthew Connelly and Paul Kennedy ominously asks, “. . . will the wretched of the earth overwhelm the Western paradise?” Invoking horrific images of colored hordes headed West in a third-world armada, in a scenario taken from the pages of Jean Raspail’s thoroughly racist jeremiad, Camp of the Saints (1975), Connelly and Kennedy muse whether the ill-fated voyage of the Golden Venture, which went aground off New York City in 1993 with some 300 illegal Chinese immigrants aboard, is not a portent of future Western collapse.
To stave off the possibility of the West’s demise, Connelly and Kennedy offer a lengthy series of prescriptions, including: more development aid to poor nations; more contraceptives; a redeployed army of scientists and engineers from the former Soviet empire dispatched to rescue Asia and Africa; a beefed-up, permanent UN rapid-deployment military force to quell civil wars and rebellions; and, incredibly, a binding international agreement, on a par with the Universal Declaration of Human Rights, recognizing “cultural diversity, both within countries and between technologically dominant cultures and the rest of the globe.”
As confused as the Left is on the issue, however, the anti-immigrant Right is hardly more consistent or rational. Two things worry conservatives about immigrants: that immigrants will become a dependent class producing a huge infusion of new clients for the welfare state; and that Asian and, especially, Latin culture will supplant our Anglo-American heritage, leaving us a polyglot, balkanized people with no common culture.
I share concerns about the effect on American society of both the welfare state and multiculturalism; I just do not believe that either problem is caused, or even much exacerbated, by immigration. Immigrants, for the most part, are more likely to be in the labor force and less likely to be dependent on welfare than the native-born. Refugees and elderly immigrants are exceptions to this general rule, but that is largely because federal refugee policy actually encourages welfare dependence and because the government rarely enforces rules requiring sponsors to bear financial responsibility for immigrants who become indigent.
Rather than attacking these specific policies, however, some conservatives are hell-bent on halting immigration altogether, or at least drastically reducing it. The new, more conservative Congress apparently agrees, and is about to enact drastic cuts in the number of legal immigrants, including highly skilled ones. Dozens of high-tech company executives recently lobbied on Capitol Hill against such proposals, to no avail. One vice president of Sun Microsystems, a Silicon Valley computer maker, complained that immigration restriction “is going to kill us. We will not be able to compete.” Why indeed are we closing our doors to immigrants who are twice as likely as natives to hold Ph.D.’s? This kind of fear-driven medicine is worse than the disease it was intended to cure.
On the culture front, too, the solution—limiting immigration—is ill-suited to the problem. Immigrants are not demanding multicultural textbooks or, for that matter, bilingual education. In fitting irony, the real challenge to bilingual programs is now being mounted precisely by Hispanic immigrant parents, who know full well that their children must learn English quickly if they are to succeed in America, and that teaching them in Spanish for most of the school day is not the way to accomplish it.
So long as American-born, English-speaking Hispanics made up the bulk of students in bilingual classes (they still comprise more than half of the students in such programs), bilingual education remained merely a wasteful and expensive sop to ethnic pride—a Hispanic version of the self-esteem movement that spawned Afrocentric curricula across the country. The influx of tens of thousands of immigrant children into bilingual classes, however, has produced a genuine crisis. School districts cannot possibly afford to teach all these children in their native languages, and Hispanic immigrants have begun to balk at native-language instruction for their children alone, especially as they watch Korean, Vietnamese, and Russian immigrant children learn English more quickly in English-immersion or English-as-a-second-language classes. A recent lawsuit against the New York State Commissioner of Education by a group of 150 Brooklyn parents, most of them Mexican immigrants, is the first of its kind alleging that bilingual education deprives students of the right to be mainstreamed into English classes. Immigration may well prove the undoing of bilingual education.
Americans have never had more reason to be confident than we do today. We won the cold war. Free-market capitalism has spread to every continent on the globe. Our own economy is the most productive in the world. We still lead the world in technological and scientific innovation. Our popular culture—for better or worse—holds hegemonic sway everywhere. America remains one of the few places in the world where hard work and perseverance are enough to make a person, if not always rich, at least comfortable, no matter how humble his origins. Immigrants, perhaps more than those who were born here, know and appreciate this, which is why a million people a year give up home, family, and friends to come to these shores. This fact is a sign of American strength, not of incipient cultural weakness as the immigration restrictionists would have us believe.
Eliot A. Cohen
If the United States really is in such a bad way as the questions posed in this symposium suggest, the prospect is dreadful, not for this country alone but for the globe. If the United States really is corrupt, decadent, and disintegrating, it will not long exercise international leadership or use its tremendous power to shape international relations for the good. Nothing indicates that mankind overall has become kinder or wiser over the course of this last bloody century. To think that our current prosperity and peace rest on something nobler and more durable than power well-understood and properly wielded is to misunderstand our world gravely.
In truth, I think the signs are better than that. There is, of course, the sudden stunning victory of the Republican party in the last set of congressional elections, and, more importantly, the paralysis and decay of a liberal establishment that had grown infirm, corrupt, and in some parts degenerate. But the fundamental health of our politics rests less on this than on an effervescence from below, on the ability of city and state governments to depart from destructive social policies.
Politics, however, is not our fundamental problem; rather, some of the undesirable offshoots of capitalism unrestrained by morality are. The corruption of our culture by some of the market’s more loathsome products—from Beavis and Butt-head to gangsta rap—is the matter of gravest urgency. Here simple Republican principles of governance may not be all that much help.
The high culture is, in some way, protected: there will always be wealthy patrons of opera and symphonies, and enough of a discriminating book-buying public to reward great writers. It is the middle level of culture that is in a truly parlous state. The steady deterioration of the fare our children watch and to which they listen is something for which we have no ready remedies. The libertarian mood of the moment is, in this respect, almost as destructive as the unashamed avarice of the media that are such a powerful force in our society. The austere virtues are, if not driven from the field, confined to a corner. But the forces of the market in entertainment can work their effects only on a society that welcomes them. When enough people are outraged or disgusted, the media barons yield, for their commitment to their own free speech rests on profit, not dearly held principle.
Tocqueville was altogether right when he declared that mores preserve liberal democracy to a greater degree than do laws. There is not much the government, and the federal government in particular, can do to repair the fabric of public morality, although the statesmanlike thing is to prevent government from doing much harm—no mean task.
What counts more, however, is the struggle to shelter the family from the corrosive effects of both untrammeled market and ill-conceived law, and to nurture the revival in religious awareness that is a feature of the last decade. Public pronunciamentos on such matters, however, will have only a limited effect, and in any case open up those who make them to charges of insincerity unless they are reflected in private action. As the saying goes, it’s not enough to talk the talk—conservatives have to walk the walk. Never have concrete actions—playing a leading role in local community institutions such as churches, synagogues, and school boards, setting an example in our homes of the principles we proclaim in the public square—been more important.
It is already clear that the new politics may require the development of some strange new alliances and even the severance of some old ones. Conservatives will, on some points, have more in common with feminists, environmental activists, and, in general, the tender-hearted than with exponents of minimal government and free enterprise pur et dur. There is already a split between those conservatives who sympathize with the Christian Right and those who fear it. What is more important, however, is explaining first to ourselves, and then to others, why we are not, in fact, libertarians, and how we wish to balance personal freedom and the social cohesion that is indispensable to civilized life.
For decades now neoconservatives have been in opposition—deploring proponents of a flaccid diplomacy in the cold war, arguing against inane academic fads in the humanities and social sciences, ridiculing grandiose uses of government power in such fields as medical care or welfare. The more difficult but altogether healthier challenge of creating a positive program is upon us. We shall succeed in doing so, however, only if we do on a small scale what we urge upon our fellow citizens in national life.
A generational change has begun to occur in the leadership of the conservative or, if one wishes, the neoconservative movement. Norman Podhoretz’s retirement as editor of this magazine and William Kristol’s founding of a new publication, the Weekly Standard, remind us that a new generation is coming into its own. That generation, of which I am part, did not have the formative experiences of its elders—World War II, military service, the struggle against a Left that was compelling in the society at large and not merely on college campuses, the deadly seriousness of the cold war at its height. Most of us did not fight in our war, which was Vietnam, although our formative political experience included the furor on the campuses during the 1960’s. Our President was Ronald Reagan, not Roosevelt or Truman—and with all of his virtues, Reagan was neither a Roosevelt nor a Truman. To be blunt, my generation has had neither the toughening nor the inspiration that its elders experienced.
All the more need, then, for us to temper our characters in the quieter forms of discipline and service, as we confront the challenges that lie before us.
Werner J. Dannhauser
Is the national project unraveling? I think so. All I have learned in my life from both Athens and Jerusalem inclines me to suspect that the decline and disappearance of human things must occur sooner or later. The tradition of political philosophy stemming from Athens knows all about the frailties of mortal man: ancient Thucydides knew that his history of the Peloponnesian War would outlive Athens, and modern Tocqueville contemplates the coming of a time when there will be no United States of America.
To be sure, a countervailing wind of hope blows from Jerusalem. God made a covenant with us, and I have faith that He will not destroy humanity in general or the Jews in particular—although He grants us the freedom to destroy ourselves. And what we Jews are doing to ourselves informs the sadness of my views. I speak mainly, but not exclusively, of the self-destructive policies pursued by the state of Israel these days. This symposium, of course, does not focus on the national prospect of Israel, but Milton Himmelfarb was right when he once wrote in these pages that American Jews cannot make it without Israel. It is true that American Jews are only a small part of the American prospect, but it is also true that a world in which Israel was, God forbid, annihilated would be a world in which all decent people should be ashamed to live.
Limiting my gaze to these shores, I detect all too many ominous signs. The general coarsening of culture—TV talk shows, movies that combine technical virtuosity with utter emptiness, the debasement of the English language, New York Times editorials, you name it—has continued throughout my adult life (and I am no longer young), but that amounts to no more than a non-fatal lingering infection when compared to other factors.
Our failure to come to terms with the problems of race relations threatens to undo us before other fatal diseases have a chance to take their toll. Racial bigotry runs deeper and uglier than conservatives are wont to admit, and our inability to be severely just so long as justice is color-blind is much more glaring than liberals are able to admit. I see no end to racial quotas in my time. For a while, there was genuine progress, a change for the better, but I realize with some dismay that I have never had a close black friend. My predictions almost always turn out to be wrong, so it may be a good sign for the nation when I prophesy that the consequences of the O.J. Simpson trial will be horrifying.
The ravages brought on by radical feminism pose a deeper though perhaps less immediate threat. The nuclear family, and there is no other, heads toward obsolescence and we seem powerless to do anything about it. In the 60’s in COMMENTARY, Leslie Farber feared for the viability of sex, and he was right to do so (“I’m Sorry, Dear,” November 1964). People still “do it,” but it is less meaningful today, and generally less fun, as love has lost its savor and runs the danger of losing its soul. Outrage against sexual harassment is alive and thriving, though it is a crime that far outstrips malingering and being a public nuisance in its refusal to be defined precisely, no matter how copiously we are graced by the rhetoric of Senator Boxer, Senator Mikulski, and their spiritual kin. Instead of morality defined by Rousseau as the condition of being strict with oneself and lenient with others, we find a moralism that consists of being lenient only with one’s self. And when we men are berated for “not getting it,” we are meant to feel shame and contrition for the way God and nature made us.
In my capacity as a teacher, I am troubled not only because our problems are insoluble but because they are becoming undiscussable. The universities are indisputably less free today than they were 50 years ago. The dogmatic politics of relativism and radical egalitarianism pollutes our discourse. It is already a badge of honor to be labeled an elitist. We professors, and I do not exclude myself, watch and modify what we say, though what is unsayable becomes unthinkable for most human beings, and there was a time when universities prided themselves on being places where nothing was unthinkable.
The state of affairs I have tried to sketch out has provoked a response, but I am not sure yet that it amounts to a conservative resurgence. Nevertheless, as the old song has it, “last November there was held a big election,” and I delight in the results. Newt Gingrich is no doubt a political genius who brought about results I thought impossible. (Remember, however, that I am a political scientist.) To my regret, I must at once add that my enthusiasm for this impressive Republican victory has already been dampened, and not only because Speaker Gingrich speaks too much. The Right obviously has evils of its own to combat. I worry about the resurgence of old-fashioned Republican mean-spiritedness, as can be seen in the animosity toward immigrants, of whom I was one more than 50 years ago. Moreover, I am disturbed by the repeated pressure to amend the Constitution in various ways, and I find all proposed amendments on the public agenda today to be either silly or dangerous or both.
On the whole, though, the Republicans are fighting the good fight today. Long-range decline may be inevitable but short-range rejuvenation remains possible. As G.K. Chesterton said, if it’s worth doing it’s worth doing badly. I am pleased as well as proud to be a soldier in the ranks of conservatism. I wish I could feel the cheerfulness that makes for effectiveness in American politics. I wish I could stop feeling that today we can do little more than buy time, precious little time, a little precious time.
In 1980, the late Leopold Labedz quipped, “The Soviet Union and the West are in a race to decadence. So far at least the West just might be losing.”
Judging from the condition in which the Communists left the new Russia, the West, particularly the United States, was never even seriously in the running. Still, the conclusion of the 45-year demand on us to remain at least minimally pulled together that went by the name of cold war has left us exposed in a most curious condition. It is this condition, I think, rather than the question of the sturdiness of American institutions that we must begin to consider in responding to COMMENTARY’s query. For the struggle against Communism (along with the equally fierce and costly internal struggle against those who have in one way or another opposed the struggle against Communism) served to mask a predicament that is unprecedented in the whole history of human experience, from the expulsion from Eden on down. Call this new predicament the second American challenge.
The first American challenge was that best articulated by Abraham Lincoln—in a sense now being echoed by COMMENTARY’s symposium—when he declared the Civil War a test to determine whether “a nation so conceived and so dedicated can long endure.” As we have even better reason than Lincoln to know, much in the great wide world beyond this society’s own welfare has hinged upon America’s response to his question. And it seems to me that the 100 years of rich and various and often bitter history stretching from the end of the Civil War to, say, the Free Speech Movement in Berkeley have left not a moment’s doubt about the solidity and staying power of the country’s political institutions. They have not only lasted, but lasted longer than those of any other single form of government on earth.
Those of us who are currently engaged in the battle to wrest some saving cultural power from the soiled and grasping hands of the liberal Left often speak as if the very existence of the polity as we know it is at stake in the outcome of our labors. But this we do only in the heat and hastiness of warfare; the Constitution, however bloodied, still stands—the glory of the world—for all to see.
It is nonetheless true that our common life as a society is in a parlous condition. The battle I have mentioned might be called the domestic cold war, one in which there is even less possibility of arriving at understandings and settlements than there was in its now-defunct international namesake. And in which, to take the parallel a little further, ever-greater numbers of people show signs of growing restless and rebellious at their entrapment within the airless prison of liberal piety. I am afraid, however, that just as America’s triumph in the cold war has proved to be far from enough to bring about the full-scale liberation of the Russians from their Communist jailers, so even a whole raft of conservative victories in the domestic cold war—from bringing down the welfare system to restoring serious education to reducing the reach of government to restricting abortion to you-name-it—will not by themselves make us safe from the single most serious threat that hangs over us. I am speaking of infectious nihilism.
Which brings me to what I have called the second American challenge. As Lincoln could not know what would in a century’s time be the full, gorgeous consequence of his having determined so bloodily to preserve the union, so it may take a century to know the outcome of the adventure on which we are, willy-nilly, now embarked. I mean by this our having been granted the possibility to live under physical conditions so benign as to beggar the uses of mere tradition in helping us to deal with them. Consider just a few of the conditions I am referring to. The words “hard labor” have virtually lost all meaning to Americans facing the 21st century. Nor for most people does work to support oneself any longer take the whole of one’s waking day.
From which it follows that the country abounds in ever-increasing forms of, and facilities for, recreational pleasure. All commonplace childhood diseases have virtually disappeared. Being required to bury one’s children—surely the most wrenching of all human sorrows, and once far from an uncommon experience—has become a rarity (at least for those whose children are not involved in such voluntary forms of suicide as serious drug use and/or gang warfare). People are able to take for granted that they will live long and no longer suffer the encroaching debilities of old age until well past their allotment of threescore years and ten. Technology has provided nearly universal means for assuaging extremes of heat and cold, while medicines and disciplines have been found to preempt virtually every pain and every ill-feeling, whether physical, psychic, or spiritual.
Lately it has become obvious just what effect all this unprecedented good fortune is having on us: it is simply making us crazy. Our good health, for example, has become a disease: surely never have so many people paid so much attention to the properties of what they put into their mouths or what they take in through their nostrils or which of their muscles requires what amount of special attention; never have they medicated themselves so heavily or been so pharmacologically learned. For every discomfort, physical or emotional, it is assumed that only ill will—the government’s, the system’s—can be interfering with instantaneous succor. Moreover, freedom from what were once the most taken-for-granted of life’s necessities has driven a whole generation of young women into frenzies of rage and bewilderment, feeling that they must somehow have been cheated but unable to determine exactly what it is they have been cheated of. Young men, no longer required to do many difficult things, such as marry the girls they get pregnant or serve their country in war, now find it attractive and self-testing to do such things as fall out of airplanes or jump off high places with their legs attached to an elastic cord.
This list of craziness could go on and on; the state of California, which is, after all, geographically speaking, pretty close to paradise, could all by itself provide volumes of examples. One can sum up the condition by saying that Americans are a people simultaneously sated and starving. Many conservatives, the ones known as cultural conservatives, are groping around in the neighborhood of this problem, but so far only God knows how to help his wayward American children turn their blessings into blessings.
America’s social crisis is partly an intellectual crisis. The nation’s public policy is in important respects guided by the assumptions of cultural relativism, which remains the central foundation of liberal anti-racism. Cultural relativism arose in the early part of this century to challenge the old racism, which hierarchically ranked groups in three stages: savagery, barbarism, and civilization. By asserting the equality of all cultures, and the adaptive value of all behavioral norms, relativism helped to undermine 19th-century racism.
But the solution to an old problem has become the source of a new one. Cultural relativism now prevents liberals from recognizing a civilizational breakdown that is national in scope but whose effects are disproportionately felt by poor blacks. This breakdown is characterized by extremely high crime rates, the normalization of illegitimacy, an excessive reliance on government provision, and a contempt for the virtues of civility, discipline, and deferred gratification. If these trends persist and metastasize, then the American Century, which really began in 1945, will prematurely come to an end.
There is some evidence that Americans increasingly oppose the two central policy expressions of cultural relativism: the doctrine of group equality or proportional representation, which is the foundation of our civil-rights laws; and the doctrine of multiculturalism, which offers a basis for group identity and an educational program for our schools and universities. Public resistance, as reflected in the battles over political correctness and now affirmative action, is widespread but inarticulate. Even though proportional representation and multiculturalism have been largely discredited, they continue to be promoted by institutionalized interests and are likely to survive, in scaled-back form, into the 21st century.
Immigration is not the problem. The challenges faced by newcomers, such as what language to speak, how to gain access to credit, and a feeling of cultural displacement and isolation, are precisely the same as those faced by earlier generations of immigrants. Moreover, nativism against nonwhite immigrants is confined to states like Florida, Texas, and California, which are experiencing regional indigestion, and is considerably weaker than that which greeted the turn-of-the-century waves of Irish, Italian, and Jewish immigrants. Multiculturalism, which seeks to unite the cause of nonwhite immigrants and African-Americans, offers a rainbow diversion from the specific problems of native-born blacks.
What we need is a cultural restoration based upon a revival of the ancient distinction between civilization and barbarism. In practice, this means that the liberal conception of rights unaccompanied by responsibilities or duties is fundamentally unsound. Our social policies, which for a generation have asked only what would be the redistributive effect of this or that program, should now be based upon an entirely different question: what is the likely effect of this policy upon the civic behavior of American citizens? Moreover, black leadership needs to redirect its efforts away from wresting political and financial concessions from whites and toward the neglected project of rebuilding broken families, reducing crime, and strengthening the entrepreneurial base of the inner city.
The conservative-intellectual renaissance and Republican political gains are based largely on the fact that these two groups are the only ones that recognize the extent of American cultural breakdown. Yet the Republican and conservative communities have not found themselves exempt from the social debris of the 1960’s. Many conservatives have become accustomed, in their own families, to coping with social pathologies like serial divorce and teenage pregnancy. While many of them in principle support reform, it remains unclear whether they are willing to invest the resources and effort needed to achieve it, either in the public sphere or in their private lives.
The cultural pathologies that have been subsidized by the government for a generation have now assumed a life of their own. It is questionable whether the social fabric that the state has helped to rend can now be mended again, even with sensible government policies. The problem is not primarily one of a failure of leadership in Washington. Rather, it is a profound weakening of the Judeo-Christian ethic that, as Tocqueville observed, establishes a necessary moral framework for liberal institutions such as free speech and free markets. Barring a massive religious revival—an unlikely but not impossible prospect—it is hard for me to foresee America recovering its civic moorings. Rather, we are likely to see escalating rhetorical bombast, combined with reforms in the areas of tax and welfare that only moderate, rather than reverse, current dysfunctional trends. As Irving Kristol recently observed, Western civilization is decaying, but decline comes slowly, so that the best we can do is to live well in the meantime.
Jean Bethke Elshtain
Perhaps our confidence in the American purpose was overdone, lodged at least as much in our overweening ability to project our power after World War II as in the resiliency and robustness of our basic institutions. Looking back a half-century, one detects a certain fragility in the triumph; one sees previously unnoticed cracks in the pane. That there was a mid-century epiphany, I have little doubt. That many were left behind in the march of progress is a story too well known to bear retelling. I have in mind something else. A form of insouciance, perhaps? Or overconfidence, given our power and our prosperity, that our basic institutions—family, school, church, government—were not only secure but nigh invulnerable.
One does not want to overdo any of this. I recall Freud’s mordant observation, in his 1915 essay, “Thoughts for the Times on War and Death,” that the radical disillusionment spurred by the bloodletting of the Great War was taking people by surprise because they had been lulled into an illusory view of humanity’s flower-strewn parade toward a glorious future. Perhaps, he suggested, humanity had not fallen so low as many believed because it had never risen so high as many proclaimed. But this sort of observation is alien to the American temperament, Henry Adams and a handful of others excepted.
Allowing then for an earlier unwarranted optimism and a current radical pessimism, there are nonetheless grounds for deep foreboding. The problem is not so much one of searching for an overarching purpose that will lock us all into a single aim, but the loss of a political and civic language in and through which we can search for, and forge, commonalities even as our distinctiveness remains intact. It is difficult to lift up commonality as a worthy democratic dream if you are convinced that the entire history of American democracy is one of hegemonic imposition, “nothing but” a story of inegalitarianism or naked power hidden beneath the shimmering folds of Lady Liberty’s generous garb. To the extent that our belief in ourselves as a constitutional republic and a democratic civil society (our national project) is unraveling, what seems poised to take its place is not healthy skepticism but sour cynicism; not a more generous articulation of our possibilities but a harsh condemnation of any project that offers a civic identity not reducible to the terms of racial, ethnic, or gender identity.
So: my own thinking has changed. I am now convinced that unless we can rebuild basic authoritative institutions, institutions required to shape, form, and mediate democratic passions and interests, we will continue to be beset by various panics that pit us against one another as strangers, not friends; as enemies, not opponents.
Alas, I see little evidence at this point that the conservative resurgence is arresting or reversing the trends here noted. Perhaps because the conservative project itself does not know whether to bury certain tendencies or to praise them. More and better consumerism? I do not think so. Yet the market panacea for all our woes seems to be the central theme of much conservative thinking. Devolution is more hopeful. But devolve—to what? To often strapped and discredited state governments? These, too, are reeling from public mistrust, cynicism, and the loss of legitimate authority. Finally, it really does not work to try to put together the Tofflers with Tocqueville, as the current Speaker of the House would have it. The former undermine the latter, whose emphasis falls on habits of the heart, on associational life, on democratic dispositions and purposes.
If conservatives, and not conservatives alone, would revitalize our institutions, and hence offer a fighting chance to restore some civic confidence and hope more generally, they will not find answers in techno-enthusiasms (the third wave and all that), or in the market alone, or in the sort of econometric cost-benefit analyses that seem to have taken hold in much of the budget-cutting effort. They will find them, rather, in a Lincolnian awareness that holds out promise for the goodness of the nation’s soul rather than the greatness of the American state. This leads to a more modest statement of purpose, no doubt, one that emphasizes civic peace, neighborhoods in which children may play safely, schools in which children learn and teachers really teach, workplaces and jobs that honor the fact that men and women are also parents and citizens.
Mid-century greatness and triumph are behind us. I fear it corrupts us to long for the restoration of that precise moment and a grand articulation of our prospects. Something humbler, yet no less demanding in its own right, is called for at present. It is unclear to me who is really answering this call.
More than 50 years ago Henry Luce, the founder and editor-in-chief of Time, predicted that this would be what he called, with a journalist’s hyperbolic touch and an ad man’s panache, the American Century. Had the Communists triumphed, Luce’s prediction would today look ridiculous. But—here’s a late news flash—Communism sputtered, faltered, and ended up in that same dustbin of history to which Trotsky, Stalin, and the rest of that grim and barbarous crew were always consigning the capitalist West. This put the seal on it: the century has indeed been an American one, with, at century’s end, American power and influence evident around the world.
America has not merely triumphed over Communism, it has also triumphed culturally over Europe. As an aspiring intellectual with cultural interests who came of age in the 1950’s, I remember thinking America existed in a state of distinct and seemingly permanent cultural inferiority to Europe. Next to England, France, Italy, even recently defeated Germany, our own culture seemed thin, dim, and parochial, where not vulgar. The best living writers, painters, musicians were all Europeans. No longer. While Henry James and T.S. Eliot, Ezra Pound and Ernest Hemingway and others once felt the need to leave America for the richer culture Europe provided, today the traffic, it is plain, is running the other way. Culturally, for better and worse, the United States is where the action is.
So here we are, living in the American Century, the beneficiaries of a bloodless victory that has put us indisputably at the very top of the world, and not feeling all that good about it. Quite the reverse. Most of us live with that decline-and-fall feeling all but explicit in the editors’ opening statement, waiting for things not so much to unravel as to explode, wary of dancing atop what we sense is a grumbling volcano.
Is it possible that we are suffering a case of the fin-de-siècle blues? The end of a century is notable for encouraging gloomy thoughts. Eugen Weber, the excellent historian of France, in Fin de Siècle, notes that “the notion of end, somehow, goes with thoughts of diminution and decay.” Part of the story Weber has to tell in his book is about the vast discrepancy in France at the close of the 19th century between impressive technological progress and the depressed mood of thoughtful observers. History, it sometimes seems, is not merely just one damn thing after another but often the same damn thing after the same damn other thing.
If one attends carefully to the news, things in America just now seem sufficiently gloomy for the end of two centuries. But what do the people who gather news know? I, for one, am not ready to concede them all that much. I was recently impressed by a remark of Irving Kristol, the tutelary saint of neoconservatism, who said he was opposed to any politics that makes its adherents gloomy. (There is a lot to this; one does not become a tutelary saint for nothing.) Perspective of the kind required for judging something so large as our national prospect is never easily attained, and the kind of gloominess that comes so easily to intellectuals does not do much to foster it.
As an intellectual, and hence someone most comfortable when complaining, let me begin by featuring some of the things that, over the past quarter-century, have made the national prospect seem bleaker. The universities are much poorer now than when I was a student, at least in their humanities and social-science departments. Families are a mess, in the upper-middle as well as in the lower classes. I used to ask my own students what their parents did, until I grew too saddened by the reply, “Which ones?” The public schools in our big cities have fallen to so low an estate that anyone who can afford private schools for his children is certain to send them there. Finally, there is crime, chiefly crime committed by the young, which has brought the dispiriting notes of desperation and paranoia to everyday urban life.
Bad schools, the breakdown of families, crime committed by the young, these items are clearly related, and the linchpin required to put things back together is the family. We need to develop ways to encourage the strengthening of families: two-parent families with sufficient income, coherence, and authority not to accept shoddy schools for their children, or to let them run wild, or to accept squalid conditions in private or public life. Any politics that conduces to helping bring about stronger families is, at present, the politics I favor.
I took great pleasure in the rout of the Democratic party in the 1994 election because its politics, with its reliance on hopeless federal bureaucracies, its misplaced tolerance for intolerable behavior, its inability to rethink old and baleful policies, and finally its refusal to admit that anything is really wrong in the first place, seemed to me to encourage a continued weakening of families and thus of the quality of American life. I hope the Democrats come to understand the true message behind their defeat: that most Americans are not all that impressed by demonstrations of false virtue combined with policies that led to genuine misery. I hope so if only because I hate to think I shall never vote Democratic again before going to my grave.
It is still too early to judge whether the Republicans are going to be any better at strengthening the family in America. (It is far from clear that cutting welfare and otherwise leaving it alone is a real solution.) I do not myself happen to think that Newt Gingrich is our domestic Winston Churchill, even though I am glad he is on the scene. Left-wing or right-, conservative or liberal, politicians remain politicians, and, in my happily jaded view, only an idiot puts anything like full confidence in any of them. Yet the Republican victory did seem to signal a sense that Americans urgently want a calmer, more decent, less agitated life than the United States has known since the mid-60’s, a period during which Democrats have been chiefly responsible for setting legislative agendas.
In the distant future, one hopes a scholar will write a history of American civilization that will take account of the American Century in an even-handed way. Like all other civilizations, ours will no doubt be found to describe a trajectory of rise and peak, decline and fall. Whether we shall be found to have fallen from within or from without will inevitably be argued in this history. So, too, will the uses to which we put our power and the quality of our morality as well as our pridefulness and our inability to recognize the larger, crucial issues that were playing out under our exquisitely ignorant noses. The chapter of that history we are currently living through ought perhaps to be titled, in the smart-alec journalistic style of our day, “The Empire Has No Clothes.” I believe—I ardently hope—we may be coming to the end of that chapter now, and am myself intensely curious to see what shape the next chapter will take.
I once heard this from a college lecturer, who told me that he in turn was recalling something J. Robert Oppenheimer said in the 1950’s. The quotation now strikes me as very un-Oppenheimerlike, so I have reason to question its provenance. Anyway, it has stayed with me for twenty years. “The country is quite obviously going to hell,” said whoever it was, “and the only thing that can save us is if nobody does anything about it.”
I like this quotation very much, and think of it often as I write about the busy men and women of Washington, D.C. I like it for several reasons. First, it serves to remind me that intelligent people were sure the country was going to hell in the 1950’s—a decade most of today’s hell-bent reformers consider a golden age. It is hard, in fact, to find any age in our country’s history, golden or otherwise, during which a large portion of intelligent people were not convinced the country was going to hell. And so it is today. The more things change. . . .
Second, the statement expresses, with pleasing irony, a certain skepticism about the very enterprise of reform. How many of the evidences of national unraveling listed by COMMENTARY’s editors—racial polarization, the dissolution of shared values, and so on—cannot in some measure be traced back, perversely, to one or another grand scheme to make things better? Perhaps the schemes were imperfectly executed; they usually are. Perhaps they were incorrectly conceived; also likely. We cannot be reminded often enough, it seems to me, that the impulse to uplift has its costs, and they are seldom foreseen, and they are quite often, if not most often, severe.
This is not fatalism; every day in every way we should all try to make things better and better. It is merely the recognition that the best-laid plans tend to go awry. The wiser course, then, would be to keep our plans and projects small and humbly circumscribed—personal, I mean, and not political (for by now all of us have learned that the two are not in fact the same). Alas, calls to national panic, such as we are hearing from the Right today, and as we have heard from the Left perpetually, do not encourage humility. Yet humility is what we require most of all when we set about to improve the lot of vast numbers of people, or even whole countries.
Liberalism perished of pride. The vanguard of the conservative resurgence the editors mention would do well to consider the fate, and consequences, of a political philosophy that proves overweening when it gets its chance to strut the stage.
Thus this quotation (and here is my third reason for liking it) is curiously and fundamentally optimistic. It shows faith that common wisdom can reassert itself, if the uplifters and experts are kept at a safe distance. The country will right itself when the people are allowed to rediscover that now as always the fundamental things apply. The genius (my quotation assumes) lies in the demos—and this, after all, is the hypothesis that the American experiment set out to prove.
Chester E. Finn, Jr.
I am still glum about our prospects, primarily because the major institutions that shape the thinking of the next generation are still either run by balkanizers (as we find in the media, mainline churches, schools, and universities) or too weakened to have much traction (the family). Such positive signs as we see are mostly found in the interstices. That is not enough.
Consider how our educational institutions, instead of strengthening the national project, accelerate its unraveling. They do this both by what they teach and by how poorly they teach. At the university level, they transmit relativism, deconstructionism, multiculturalism, victimization, and political correctness. This fissiparous curriculum then trickles down to the schools, where it is aggravated by federal regulations, bad textbooks, ill-prepared teachers, and “ed-school” ideologies that celebrate diversity and scoff at knowledge itself.
That what is taught is taught badly and learned weakly is clear from a thousand sources. Nothing is a better symbol than the College Board’s recent decision to “re-center” the SAT scores because the slippage in average performance was so serious over the past three decades that statistical problems arose even in reporting present results according to the norms of 1941. The Board recalibrated the reporting system so that today’s meager average will henceforth be the center of the curve.
The general ineptitude of the education system confers a perverse benefit: if one has accidentally swallowed poison, one is better off when the body absorbs it slowly. Similarly, a bad curriculum poorly taught is less life-threatening than the same stuff delivered well.
Unfortunately, even while doing a mediocre job of moving poison into the bloodstream, American education is not helping students digest the essential nutrients of a well-functioning society: reading, writing, math, science, history, literature, geography, and civics. Thus, most of our young people emerge into adulthood without the knowledge and skills needed for individual accomplishment and national prosperity. (About a third of high-school seniors in 1994 could barely read, according to the National Assessment of Educational Progress.) They know little about the nation’s past, its culture, its political and economic systems. This generalized weakness makes it harder to survive the poisonous parts of the curriculum, even when those are also badly taught.
The schools, to be sure, help spread an ersatz common culture simply because most use similar textbooks and software, buy their texts from the same publishers, have their students read the same stories and periodicals, and are permeated by the same products of Hollywood, sports, music, and commerce. That is why a fourth-grade classroom in Portland, Maine, is so much like one in Portland, Oregon, and why university applicants from Florida and Minnesota arrive with similar transcripts and attitudes (and deficits). But the common culture thus transmitted is degraded. Indeed, one reason we need a better education system is to counterbalance that culture. If obscene rap lyrics fill the airwaves, Mozart and Copland should fill the classroom. If television celebrates violence, school should be a place of orderly purposefulness. If welfare fosters dependency, education should forge the tools of independence.
Obviously, that is not the education system we have today. Nor have the reform efforts of the past dozen years borne much fruit. With rare exceptions, they have been weak attempts to boost the school’s efficiency rather than rethink its purpose, replace its curriculum, or rewrite its ground rules and power relationships. (As for higher education, far less has been tried there, because most policymakers and business leaders share the widespread illusion that U.S. colleges and universities are fine just as they are—if only the schools would send them better entrants.)
Recently, a few promising changes have appeared on the periphery. These entail cracking the monopoly, allowing different kinds of schools to operate, and transferring power and resources to those who want to create or attend such schools.
The voucher program recently enacted in Milwaukee and a similar venture soon to begin in Cleveland allow poor, mostly minority children to take their portion of education funding to the schools of their choice, including private and parochial schools. The courts are currently weighing the constitutional aspects of these programs.
Private contract management of public schools has begun in a half-dozen communities. And the charter-school movement is growing: nineteen states have enacted enabling legislation and several hundred of these “independent public schools” are operating, with more to follow.
Charter schools can be launched by parents, educators, or entrepreneurs. Freed from most state and local (and union) red tape, they can teach pretty much whatever they like, pretty much however they want to. Funded by the dollars-per-pupil that would otherwise go to conventional schools, they comprise a sort of public-sector voucher system. So long as they produce the results they promise, they can keep operating. Many charter schools have waiting lists. And the evidence suggests that a number of them are fine schools.
But they are no cure-all. Even if their numbers rose tenfold, they would comprise only 2 or 3 percent of U.S. public schools. They face stiff establishment opposition. Moreover, starting a charter school takes a measure of discontent, imagination, and energy on the part of educators or parents—scarce qualities in a nation where most people are generally (if blindly) content with conventional schools.
We must also grimace at the irony of looking to greater diversity of educational offerings to solve the problems of an enterprise that is characterized, above all, by its overfondness for diversity. Some charter schools (and many conventional private schools) are as contemptuous of the national project as is the public-education establishment. Some have flaky curricula; others celebrate ethnic separatism.
What cracking the monopoly offers, then, is simply the right of schools to be different and a chance for families to choose. That means that parents inclined to seek them out will at least be able to find (or create) schools that celebrate the common culture and intend to prepare students to participate in the national project. It does not, however, guarantee that large numbers of such schools will exist, or that many youngsters will attend them. Indeed, we must expect most to remain in schools and colleges that—however inefficiently—continue to poison their brains.
What this says to me is that the conservative resurgence could succeed in reshaping a lot of today’s programs and agencies yet fail in the long run unless we can devise far better means of redirecting the institutions that mold tomorrow’s citizens.
As we face the third millennium, it is hard to doubt that the American family is in disarray. Divorce claims more than half of all marriages and more than one-third of all children are born to single mothers. Signaling the revolution in attitudes toward the importance of traditional family bonds, the words “adultery” and “illegitimacy” have effectively disappeared from our vocabulary. Increasingly children, even those with two resident parents, are left to their own devices, which, the frightening statistics tell us, too frequently lead them to crime, drug addiction, alcoholism, suicide, and violent death. The world of stable, two-parent families and protected childhoods to which Americans turned with such enthusiasm in 1945 seems gone beyond recall.
In retrospect, the serenity of attitudes and the stability of institutions that prevailed in 1945 may be seen more as wish than as reality, for the forces that would shortly transform the United States almost beyond recognition were already pulling at the leash. Nineteen forty-five did not so much inaugurate a return to normality as the first glimmerings of an unprecedented dual revolution in economics and sexuality. That dual revolution radically transformed Americans’ attitudes and expectations about women’s roles and family dynamics, but, radical pronouncements to the contrary notwithstanding, it did not shake most Americans’ commitment to stable families and responsible child-rearing. What it did do was make the realization of both increasingly difficult.
The sexual revolution broke upon popular consciousness in the mid- to late 1960’s and, by the early 1970’s, carried the day. In 1969, two out of every three Americans disapproved of premarital sex. Four short years later, in 1973, only 48 percent disapproved, and 43 percent believed that premarital sex was OK. In retrospect, it seems clear that most Americans supported the sexual revolution because they thought that it freed “nice” girls to have sex before marriage without ruining their reputations. It apparently never occurred to them that this small relaxation in sexual “morality” would permanently sever the link between sex and morality, making it increasingly difficult to censure any sexual behavior at all.
Thus, what initially looked like a minor adjustment in courting conventions rapidly led to open marriage, single motherhood, an explosion of pornography, the celebration of “man-boy” love, X-rated films around the corner and on television, and any other shattering of taboos that human appetites could devise.
Feminism has ridden the crest of the sexual revolution, insisting upon sexual freedom as the bedrock of women’s liberation. For many feminists, the consolidation of this freedom has required not merely the constitutional guarantee of a woman’s “right” to abortion, but also women’s freedom from the control of men through families, specifically no-fault divorce and public acceptance (not to mention financial support) of single motherhood. Thus, even while some feminists deplore pornography as yet another manifestation of men’s brutality against women, few if any have been willing to advocate a curtailment of the sexual revolution in the name of morality.
Not for nothing have feminists insisted that those who invoke morality and family values more often than not favor women’s return to the traditional domestic roles of the immediate postwar era when American families were catching up on the childbearing they had deferred during the Great Depression. Determined to consolidate women’s massive entry into the labor force, not to mention their personal freedom, feminists easily confuse any mention of morality with men’s determination to confine women to the bedroom and the kitchen.
As it happens, however, the economic revolution that intertwined with the sexual revolution of the late 1960’s and early 1970’s has made such a restoration impossible. In the 1990’s, most working women work because they must. And most work throughout much if not all of their childbearing years. Today, the typical working woman is a mother or likely to become one, and the typical mother is a working woman. Ordinary Americans, whose families depend upon women’s earnings, live intimately with that necessity, which they do not confuse with an evasion of moral responsibility.
By the same token, the recognition that most women must work does not lead most Americans to a devil-take-the-hindmost attitude toward morality. To the contrary, almost two-thirds (according to a recent survey reported in the Wall Street Journal) regard the collapse of morality as our most pressing national concern. Yet slightly more than two-thirds do not merely accept the necessity for wives to work, they approve of their doing so. And to complicate the picture further, the majority of American women continue to view marriage and children as essential ingredients in their ideal life, just as they continue strongly to support—and practice—marital fidelity.
These attitudes make it difficult to argue that most American women regard traditional commitments to a husband and children—what most of us would call the bedrock of family values—as just another form of male oppression from which they must be liberated. Many Americans, in other words, are doing their best to practice both family values and morality, which challenges us to explain the widespread perception that both families and morality are in disarray.
Only the arrogant or the stupid could pretend to offer a simple explanation of the gap between practice and perception, but some things are clear. We are all, good intentions to the contrary notwithstanding, failing our children, and we are failing them because neither public nor private solutions to their problems will alone suffice. Most families cannot do the job without some assistance and encouragement, and the public sector demonstrably cannot replace families, or even compensate for the absence of one parent. Well beyond infancy, children require and deserve sustained attention from both a mother and a father.
But if our policies are failing our children, our public pronouncements are failing them even worse. For one major consequence of the sexual revolution’s divorce of sexuality from morality has been the ensuing divorce of reproduction from morality. Willy-nilly, we have conspired to transform the moral work of society, notably responsibility for the next generation, into servants’ work which none should be coerced to perform. Worse, we have beaten an unseemly retreat from the authority of moral obligation, thereby reducing the fulfillment of moral obligation to a matter of personal choice.
Whether we count ourselves among those who would liberate women from responsibility to children or those who would impose that responsibility upon them, we are ending in the same disastrous impasse, namely, the privatization of morality which, if it is to deserve its name, must be both public and binding, and which, if it is to command allegiance, must take account of a world in which most good mothers must work.
Is it really so strange that today’s America falls short of the standards of national cohesion set 50 years ago? Then, the country had newly triumphed in the most colossal war in its history. Upward of thirteen million young men had been conscripted, put into the same uniform, subjected to the same discipline, fed the same food, integrated into units with other men collected randomly from across the continent, paid the same wage, and thrust into the same dangers. For four years, they risked their lives for a shared goal; then, when the war was won, they shared the same glory and came home to the same lavish veterans’ benefits provided by a grateful country.
Americans who did not wear uniforms were gathered together into a collective experience too. More than half the nation’s economic output was taxed or borrowed by government. With sons fighting in Italy or on Iwo Jima, Americans listened to the same news, thrilled to the same victories, despaired at the same reverses. They sank their regional, ethnic, economic, and ideological differences in ways never known since—or before. The year in which COMMENTARY was founded was perhaps the most abnormal in the country’s history, and the two decades immediately afterward, rosy as they look in nostalgic retrospect, were nearly equally unusual.
It is the way we live now—distracted by bitter arguments over what it means to be American—that is normal. Think of the America of the 1930’s, torn by furious economic antagonisms, the leaders of American business denounced by the President as the “forces of organized greed” fated to “meet their master.” Think of the violent strikes, the seething hatred for FDR on the part of the well-to-do, the sly implications of the Republican campaign ad attacking Roosevelt’s ally, the Jewish unionist Sidney Hillman: “It’s your country. Why should Sidney Hillman run it?”
Think of the America of 1910, overrun (as it was then thought) by impossibly alien and unassimilable foreigners. English seemed to be disappearing from the streets of New York. New England, the cradle of American Protestantism—America’s original common culture!—was savagely nicknamed “New Ireland” by horrified Yankee Bostonians.
Or think of the unbridgeable mistrust between country and city in the 1890’s—an era when politicians carefully put a token Southerner in cabinets and on courts in exactly the same condescending way that they now reach for black and Hispanic nominees. Or the sectional hatreds that exploded in war in 1861.
America is a colossal place, a world as much as a country. Cultural unity does not come naturally here—the reason, I think, for the public displays of patriotism and flag-waving that so often seem forced, even hysterical, to foreigners.
And this is especially true because, in a country that so often defines itself by its convictions and ideals, the power to interpret those ideals confers political power too. When we debate the national-history standards proposed by the Clinton administration, we are not—obviously—merely disagreeing about the meaning of events a century or two ago. We are arguing, as Henry Cabot Lodge was arguing when he attacked East European immigration, as William Jennings Bryan was arguing when he denounced the depravity of the big cities, over who should rule, and how.
Of course, the fact that Americans have always fought cultural wars does not detract in the slightest from the importance of the cultural wars being waged now. But perhaps a little perspective might exert a small calming influence. Just look around. For all their fractiousness and failures, conservatives are winning their argument in favor of equality under the law regardless of race, and in favor of the proposition that justice appertains to individuals, not groups. I sense too that conservatives are prevailing in their insistence that America is formed by a European cultural inheritance, not a multicultural hodgepodge. The Clinton attempt to impose a self-hating national-history curriculum was, after all, given a sharp electoral heave-ho.
America’s troubles now are very like the troubles that roiled its past. And its prospects? As always: sunlit!
While I am more pessimistic about the United States than I was ten years ago, I believe that American institutions are basically stable and will weather their current problems. The key to their survival, however, will largely depend on what goes on in American civil society, which in turn is shaped by the culture wars in which we are now engaged.
One of the most insidious changes that has taken place in American life over the past couple of generations is the secular decline in what Tocqueville labeled the American art of association—that is, the ability of Americans to organize their own society in voluntary groups and associations. This falling-off can be measured in a variety of ways: in declining memberships in traditional service organizations like the Red Cross, Elks, or Rotarians; in the decrease between the 1960’s and the present in the number of Americans who, when polled, say they trust “most people” (from two-thirds to one-third); and in the symptoms of fraying community like rising litigation and violent crime.
In other societies with low social capital, this lack is compensated for by strong family ties. Indeed, in many societies there is a trade-off between the strength and stability of families and the strength of voluntary associations outside the family. The United States, unfortunately, has been experiencing a decline in the stability of nuclear families in tandem with the decline in civil society noted above. The social pathologies attendant on this dual decline are obvious.
The depletion of social capital has multiple causes. First and most important is the rights revolution, which has entitled each individual American to an ever-larger sphere of autonomy and led to the undermining of the authority of communities of all sorts, from the family to the workplace to the nation itself.
The legal expansion of rights through the court system is, of course, only a reflection of the runaway individualism present in the culture itself. A recent article in the Wall Street Journal noted that young men and women going through Marine Corps basic training find themselves totally alienated from and contemptuous of the civilian life they see around them when they leave the Corps. That life is characterized by aimless individualism, unstructured by group loyalties or respect for higher authorities. This alienation is a measure less of the isolation of the Marines than of the degree to which the surrounding society has changed: the older officers observed that as little as a couple of generations ago, it would not have been necessary to socialize recruits into the institution so brutally.
Beyond the court system, the state has encouraged the decline of social capital in a number of other ways. As the sphere of state authority has increased since mid-century, it has taken over an increasing number of responsibilities from both families and civil society. The impact of Aid to Families with Dependent Children (AFDC) on the nuclear family is only one example. Every religious charity that accepts federal money to deliver welfare services is coopted and ultimately weakened by the requirements imposed by the government. And the state (often in the guise of local boards of education) has fostered the needless balkanization of American society by promoting bilingualism, multiculturalism, and other policies aimed at raising the self-esteem of minority groups.
The state, however, is like a one-way ratchet that is capable of weakening the family and civil society but rather powerless to build either up again. The problem with the current Republican agenda is that it has only half the solution—the party hopes that if the state is cut back, civil society and the family will reemerge spontaneously to fill the vacuum. There is, however, no guarantee that this will happen. The experience of other societies with civil life damaged by the state suggests that the rebuilding process is a very long one. (In the case of France, it has been going on for nearly 500 years and the clock is still ticking. . . .)
The regeneration of civil society and the family cannot come from the top down. At most, the state can agree to do no further harm. It cannot, however, undertake positive measures to encourage voluntarism and spontaneous association; nor can it resocialize fathers and mothers into the responsibilities of parenthood. The problem with Bill Clinton’s AmeriCorps is that it seeks to encourage voluntarism through a new federal bureaucracy and subsidies, a contradiction in terms.
The restoration of civil society and the family can come about only as a result of our current culture wars over basic social values. Here, I think, there is some room for optimism. There is a growing consensus that stable, two-parent heterosexual families are important, that private voluntary organizations can do a better job dispensing welfare services than a state bureaucracy, and that multiculturalism and political correctness have run amok.
I also detect a greater frankness in talking about certain racial and ethnic issues, as well as a greater tolerance for religion, if not religiosity itself, on the part of people who would have been indifferent or hostile ten or fifteen years ago. All cultural revolutions and counterrevolutions take a long time to accomplish, and this one will be no exception. But sensible ideas about families, community, rights, and education are present in a way they were not a generation ago.
The one area where I have grown much more pessimistic concerns race. For a long time I shared an instinctive belief that our liberal society would eventually solve America’s race problem in a way that had eluded other developed societies in Europe and Asia. Today I am much less sure, in light of the downward spiral of the black underclass and the increasingly sour attitude that middle-class whites and African-Americans have toward one another. This issue raises a cloud over the optimism expressed above, since, unlike a century ago, the problems of American blacks will no longer be isolated within their own community.
I have no particular wisdom on how this nexus of problems might be resolved, but I know that all of the problems we now face—restoring the family, dealing with immigration, fixing the educational system, restoring trust and social capital throughout the society—are all made infinitely more complicated by the race issue.
One advantage of being merely an intermittent visitor to the life of the mind is that you are less likely to catch some of its fevers. The notion that we are becoming a kind of cultural Bleak House is, it seems to me, one of these febrile disorders.
My reasoning is jingoistically simple. Over the years I have seen that when it comes to the things that make life good for most people and even better for the privileged, America continues to do the bulk of them better than the rest of the world. (If you are lucky enough to reach the age at which you become a consumer of complex and sophisticated medical care, as I did recently, this truth will suddenly appear to you with blinding clarity.)
Amid the country’s vast beauties we have our ugly economic and cultural wastelands, our carjackers and schoolyard shooters. But for two centuries we have talked, written, warred, and campaigned to improve conditions for large numbers of Americans; and for all our continued high-volume kvetching, most things keep moving spasmodically ahead. Besides, we easily pass the acid test: where there are pockets of superior talent abroad, large proportions of their inhabitants keep wending their way by hook or crook to our shores.
Indeed, a significant source of the troubles we do have is the fact that this plodding, mildly optimistic reformism is so ordinary and intellectually unsatisfying. It does not lend itself to grand theorizing and is perennially short of novelty. As a result, people whose stock in trade is ideas always face the temptation to make our social and political conditions something more dramatic—almost always something worse—than what they really are. Almost invariably we get Armageddon, not Arcadia.
On the Left, the characteristic sin is what Thomas Sowell has called “the vision of the anointed.” It is the habit of viewing the country as something benighted but almost infinitely malleable, capable of transformation by a well-meaning, all-wise, powerful government and an intellectual elite with allegedly superior insight and abilities.
On the Right, the temptation is to react to the alarums from the Left by responding in overwrought kind, as if the barbarians were always at the gates. Thus the Left’s fondness for government intervention in all kinds of markets has prompted a healthy immune reaction in defense of market mechanisms. But markets are capable of displaying their own inanities and devising their own diseases. Each morning, it seems, we awake to read about communications moguls clawing and lunging to devour one another, like bluefish in a feeding frenzy—so as to be ready, I suppose, when the starting gun unequivocally sounds for the race down the new information highway, though virtually no one has the faintest idea where the road starts or ends or how high the tolls will be. It may be possible to justify many of these moves on efficiency grounds—but there is something greedily megalomaniacal, and heedless of human costs—the wreckage of families and communities—about the way in which these people conduct their daily business. This should not be ignored.
In the same way, there has been a reaction to years of assertions by the Left that the racial attitudes of white Americans have not only been evil in the past but remain just as evil in the present. This charge is simply wrong; in the offices and on the downtown streets of any major city, the painfully wrought, frustratingly slow changes are plain to see. The charge is also deeply pernicious, eroding the good will on which future progress depends. Yet some on the Right feel compelled to answer it with equal extremism, asserting that racism does not exist at all—or that if it does exist, it does so only in the impotent imaginations of a few racists, with virtually no consequence for social and economic conditions in the real world. The sterility of the current affirmative-action debate—as though a jump start were unnecessary in the 70’s and new standards for opening access to opportunity for the sturdy emigrants from the underclass were unnecessary in the 90’s—is testimony to the poverty of most talk and thought on the continuing crisis of race.
Today we have to cope with more than the old and true fact that it is very boring to make political arguments in the muddy middle ground between the poles of a debate. There are new forces pushing social commentators to the dramatic, dangerous edge. To put it cynically (though no more so than the matter deserves), middling positions and careful prose do not sell books or bring coverage in national magazines or lead to talk-show appearances and large lecture fees. An author cannot make a splash through refined logic and observation. He must offer something that stirs the emotions and translates effectively into sound-bite babble. Political writing and ideas, it seems, must enrage to engage.
Such exaggeration and literary hustling have harmful consequences, but I believe these will prove temporary. Rock music and mendacious machine-gun movies have deafened us and encouraged writers to raise their decibel level; but by virtue of this insensibility, most products of the current literary and media hyperactivity have the solidity of paper-thin vermicelli-type Chinese noodles that speed through the American cultural viscera without even being digested. Thus today’s ambivalence-free authors make themselves not only prosperous but happy in their deeply mistaken notion that they exercise a dispositive influence on events. In truth they have next to no influence on the real world, in which the varied permutations and shifting demands of our constantly changing cultural mix will require not rigidity but a principled openness and a talent for improvisation.
In other countries, these thinkers might be plotting revolution; here, they are merely light lunch for the ever-beckoning, ever-voracious media machine. In fact, this is not a bad deal for America’s protean culture—another reason, as Rodgers and Hammerstein would put it in their ordinary, homespun style, for an American to be a cockeyed optimist.
The editors ask whether the country is more balkanized, mistrustful of authority, or morally fragmented than it was 50 years ago, at the end of World War II. It strikes me that what we make of present-day America in these matters depends to some extent on the generational perch from which we view it. For instance, from the time I was sentient until just a few brief years ago, I knew the America of the cold war. Now, it seems, we are in the process of once again becoming the country I recognize from history books.
When it came to cultural diversity, this was almost always a place of two minds violently opposed to one another. On the one hand, the country was famously insular and bigoted: even apart from black slavery, the national history is replete with tales like those of the humiliation of the Chinese in California, the exclusion of the Irish on the East Coast, the cruel treatment of the Indians, the quotas discriminating against Jews in the professions. Protests against immigration to these shores have a pedigree as old as immigration itself.
On the other hand, American anti-immigration sentiment and even ethnic bigotry have, over the long run, been notably ineffectual for all their fervor. Certainly the waves of immigration—sometimes larger, sometimes smaller—have continued despite the campaigns against them. Assimilation took place and continues to take place not because of an absence of prejudice but in the face of it; indeed, this resistance has helped shape American ethnic identity, pride, and politics.
The idea of a real melting pot, in which individuals could form bonds of the deepest sort with one another despite ethnic differences, came only with World War II and its aftermath. We shed sentimental tears over war movies that featured multiethnic platoon roll calls and interethnic acts of heroism. The discovery of Nazi horrors expanded the list of places in which it was no longer permissible to utter religious or racial slurs. Later, the struggle against Soviet Communism on behalf of the free world perpetuated the need and provided a powerful impetus for the idea of a unified national identity transcending parochialism.
That ethos of unity started to weaken in the cultural maelstrom of the 1960’s. The civil-rights struggle, begun as an appeal to an entire nation assumed to be mostly good at heart, became mired in increased antagonism between whites and blacks. American Indians launched a civil-rights struggle of their own, this one separatist rather than integrationist from its beginning. A women’s movement, with a distinctly nonunity view of male oppression, grew out of the anti-Vietnam-war movement just as past women’s movements had grown out of the abolition and temperance movements; but this time, women’s liberation was joined by yet another expression of separate identity, gay liberation. Now (nostalgic sigh) we no longer have the Soviet Union to exercise a modicum of discipline over this process. So it is no surprise that we are more balkanized than we were, say, 40 or 50 years ago. But we are no more balkanized than at many other times in our history.
As for another of the editors’ instances, mistrust of authority, this country has always excelled in the particular type of mistrust appropriate to a free society. Thus, ours has not been the kind of mistrust that leads to individual isolation, sabotage, and the hoarding of potatoes; we have shown ourselves quite capable of trusting one another and cooperating superbly in organizations. But when it comes to government, our heritage is one of regal contempt. Many countries joke about their politicians; few do so with America’s long-running, flamboyant enthusiasm.
The Depression and the federal government’s response to it made inroads into this tradition; the wars that followed, first World War II and then the cold war, did much more. Government’s fighting machine might have been a Rube Goldberg creation, but it was either that or the unthinkable. Government secrets and spying were nasty; but in the face of such a duplicitous enemy as the Soviets, we had no choice but to trust our government with such activities.
The politics of the 1960’s, culminating in Watergate, eroded this small beachhead of trust. But the wave that really knocked it into oblivion was, once again, the end of the cold war. The great imperative was gone, and things returned to something not all that far from normal.
Even our perception of a dissolving consensus on religious and moral values, though gravely worrisome, is not so unusual. It is hard to contemplate today’s illegitimacy rate or violent youth crime without being appalled; it is also hard not to be aware that the fear of moral dissolution has been one of the great constants of our history. The moral struggle entailed in the cold war was large enough to provide some distraction from such concerns. But that epic drama is over now; and when we looked up from the rubble of the battle, we found our old problems and anxieties waiting patiently—no, impatiently—for us.
The current reaction to such worries has also been familiar: a type of religious revival. Those of us who have not seen one before are as mystified as we were when enlightened folk, after several years of sexual liberation, started marching backward (for so their reversal of direction seemed) toward an overriding fear of sexual disease and harassment.
Cold-war elites did not have to take much account, relatively speaking, of conventional American religious piety; no doubt they thought that by defending Western civilization against Soviet materialism they had “given at the office,” and they were allowed to persist in this notion. But that dispensation is gone now.
The country, in short, is engaged once again in its old debates and contests. These do not have quite the excitement of a geopolitical struggle for control of the entire earth. But I suppose I will eventually get used to that.
Eugene D. Genovese
The indictment holds on all counts. If anything, it has become more severe since the South lodged it against Yankeedom 150 years ago. For whatever the sins of the Yankees of old, they did try to hold fast to a vision of national greatness; they could in fact be fairly criticized for inviting excessive chauvinism and a penchant for world domination disguised as a program to promote human rights. Their current heirs, in contrast, have fallen in love with a chimera called the global village and cannot wait to trade our national culture and identity for participation in a homogenized world, wittily packaged as multicultural diversity.
I shall restrict myself to one problem: immigration in relation to national identity. Those who seem determined to beat up on immigrants are missing the point and ruining a good cause. There are no grounds for demanding that the United States open its doors to anyone who wishes to come here, and less justification for the silliness according to which our “traditions” demand it. Despite the claptrap about a haven for huddled masses, America invited immigrants because it needed their skills and labor: it was a question of economic policy. At the same time, Americans had strong faith in the ability of their political institutions to absorb and “Americanize” those who had had little or no previous experience with republicanism.
That confidence was not misplaced, and it need not prove so now. Whether we continue to invite immigrants or, more realistically, how many we invite is a question of policy and prudence, not of abstract right or duty. What happens to those immigrants once they get here is another matter. The nasty part of the current campaign against immigration is the barely veiled assertion that since today’s immigrants are likely to be nonwhite they should be regarded as unassimilable. (How, when, and why Hispanics became nonwhite remains a mystery.) The trouble with the assertion of unassimilability is that it was made against the earlier waves of Europeans, most notably the Irish. Nothing was clearer than that the Irish, like the Southern and Eastern Europeans who followed them, being Catholics, could never adjust to a republican, democratic, and constitutional order.
The danger today lies not with the immigrants but with us. Instead of making clear that anyone welcomed to our shores must agree to submit to our national ethos, to say nothing of learning our language, we announce that we have no national identity to adjust to. On what grounds, then, can we condemn those who accept the invitation to establish their own people’s republics within our frontiers? And on what grounds do we expect such immigrants to retain a shred of respect for us when we proudly announce that we, unlike the Chinese, Japanese, Ghanaians, Haitians, and Dominicans, have no national culture to preserve and therefore no personal identity to claim? Pat Buchanan, among others, would be well advised to end his irrational (and un-Christian) assaults on nonwhite immigrants and aim his fire at the appropriate target—the politicians and academics who are making claims on behalf of immigrants that few immigrants dream of making for themselves until invited to do so by our cultural elite. Meanwhile, and for just that reason, Buchanan’s call for a five-year moratorium during which the problems may be sorted out in a national debate has much to recommend it.
If we get our heads straight on that question, we can settle down to a rational debate on the specific level of immigration we can in fact support, materially and culturally. At that point, we will be able to settle accounts with our racists and make clear that people of any race, nationality, and religion will be welcome so long as they agree to accept our national culture and participate in our constitutionally sanctioned political life as Americans rather than as hyphenates with special privileges. And at that point we can devise policies designed to attract, as in the past, people who come here to work, not people who come to go on welfare rolls.
Correct me if I am wrong, but as one who fled the Cities of the Plain for the land of Canaan, I notice on my happily infrequent visits to New York that it seems to be the Koreans, West Indians, and Africans who are keeping the economy of Sodom afloat, much as our Irish, Jewish, Italian, and other ancestors did a century ago. I refer to those who are working, opening mom-and-pop stores, and the like.
Certainly, Asian, African, West Indian, and Latin American immigrants are a positive force in the economic life of Georgia, where I live, and I have not noticed that they pose a threat to the Republic. Yes, we do face a threat to the Republic, but it is being mounted in Washington. It arises here in Georgia, as elsewhere, largely from the self-hating lily-whites who dominate our campuses and a Democratic party the Lord, in His infinite mercy, seems to be marking for extinction.
Balkanization—as manifested, for example, in the demands to make Spanish and other languages a sanctioned alternative to English—like the multicultural nihilism promoted on our campuses by a guilt-ridden white elite, aims at recognizing the right of African, Asian, and Latin American peoples to national identity while denying that same right to Americans, Europeans, and Israelis. Or so it seems at first blush. But in fact this madness proceeds on the assumption that once the nonwhite peoples are safe and have been paid handsome indemnities for something or other, they will demonstrate their moral superiority to us Honkies by merging into a beautiful and loving unified world that respects diversity while it imposes the patently totalitarian ideology of Western radicalism.
Naturally, that nonwhite world will stand firmly against elitism, hierarchy, oppression, and prejudice of all kinds. Hell, the whole history of the nonwhite peoples shows that a grand egalitarian order committed to individual self-expression, recognition of five genders, equality for women, and the destruction of all oppressive elites is what they had in mind before the white man came along to mess up their heads.
The question remains: how are we to defend our national culture and extraordinary achievement in constitutional government if we surrender the ramparts by ourselves invoking the cynical rhetoric of egalitarianism and radical individualism? (And never mind that the two are irreconcilable: the ideologues of the radical Left and free-market Right assure us that they are one and the same.) And how can we expect to win the cultural struggle if we continue to deny that the common law, which undergirds all our freedoms, emerged from, grew, and depends upon adherence to the Decalogue, especially as manifested in the evolution of Christianity? Begin where you will—with the question of immigration, of the family, of the racial crisis, of law and order—you will end with the question of the religious roots of our national culture and political institutions. The totalitarians and nihilists, who easily merge on our campuses, know as much and know where to aim their hardest blows. Meanwhile, those who are appalled by current affairs proceed as if Pat Robertson and Pat Buchanan were the enemy—proceed, that is, to play the game of those they seek to combat. But these larger matters are for another day.
From Harvard to Hollywood, the intellectual Left has managed to capture and corrupt most of the commanding heights of American culture. Under the guise of third-world, multiculturalist, feminist, and other fashions, bohemian values have come to prevail widely over bourgeois virtue in sexual morals and family roles, arts and letters, bureaucracies and universities, popular culture and public life.
As a result, culture and family life are widely in chaos, cities seethe with venereal plagues, schools and colleges fall to obscurantism and propaganda, the courts are a carnival of pettifoggery, and political leadership is stultified by public-opinion polls shaped by the rear-view queries of establishment pundits.
Mostly escaping control by the intellectual class, however, has been the capitalist economy, necessarily a bastion of bourgeois values. In any practical contest, including the creation of real art, bourgeois values will trump bohemian ones every time. So-called third-world culture, wherever it arises, inexorably falls before the relentless advance of bourgeois entrepreneurialism.
Secular hedonism can prevail only through the capture and suppression of capitalism and the consequent spread of poverty and decadence. Yet this is the goal that animates all the political programs of the intellectual class, and prompts all their endless mock moralism over gaps between the rich and poor, with the poor virtually defined as people who uphold bohemian values.
While most of the distributionists claim to be egalitarian, they chiefly feed on envious fears that the real third world is now rising toward equality by adopting American capitalist values and technologies while the U.S. wastes its moral energies on the environmental and multicultural fantasies of the intellectuals.
A key source of the distributionist confusion is the perverse belief, beginning with Adam Smith and shared by both liberal and conservative intellectuals, that capitalism is based on a system of incentives and rewards. Incomes are a reward for services in the economy and an incentive to continue them. Since no one needs huge rewards as an incentive to produce, progressivity in taxation becomes a plausible cause and opposition to it is seen as an expression of greed.
The essence of bourgeois predominance over the bohemians, however, is not material but moral. It springs from thrift and discipline, patriarchy and sacrifice. In the key practical tests of life, entrepreneurs tend to be both ethically and intellectually superior to the intelligentsia. In general, they work harder, master more difficult disciplines, marry better and longer, bear more children and raise them more responsibly, and spend far less of their income. Thus they are better able to create and reinvest wealth, both social and economic.
Entrepreneurs invest virtually their entire personal fortunes on the basis of their superior knowledge, earned in business and technical pursuits, oriented toward serving the real needs of others. That is the moral source of economic growth. Most intellectuals, on the other hand, lack the slightest idea of what to do with large wealth. In control of old-money foundations, for example, they almost invariably waste funds on bohemian enthusiasms, political perversities, anti-bourgeois arts and indulgences.
Forget incentives and rewards. Capitalism works because the people who have demonstrated their ability to create wealth govern its further investment. The so-called increased social and economic stratification mentioned in the editors’ statement is a statistical chimera registering the global triumph of American entrepreneurs pioneering the new technologies of the age. Not only do American companies now command nearly 50 percent of the profits of the industrial world but they also command between 60 and 100 percent of the market share of most of the prime products of the information age, from leading-edge semiconductors to network software and hardware.
While intellectuals still ululate about the nation’s declining competitiveness, America now commands some three times the computer power per capita of either Europe or Japan and is even more dominant in the Internet technologies central to the new era.
What the multiculturalists reject is the very technological and entrepreneurial culture of bourgeois America that is now sweeping the globe. The rise of computer networks, however, threatens the key remaining bastions of the power of bohemian intellectuals: the universities, Hollywood, the broadcast networks, and the government/social-work complex. Intellectuals like to describe these mostly depraved institutions as the core of the nation’s identity and common purpose, but in fact they serve chiefly as the pork barrels, subsidy mills, and agitprop centers of the bohemian intelligentsia.
The rise of distributed networks of computers, each commanding the creative power of a film studio and the communications power of a broadcaster, dooms the existing mass media and academia to decline and eventual collapse. The culture of Hollywood and TV elites pandering to the lowest common denominators of the masses will give way to a culture of first choices registered on distributed telecomputers on the Internet. Hierarchical universities will give way to educational heterarchies in which the best courses and teachers will command global markets. Rather than the few hundred new titles available in existing video channels, the multimedia and course-ware businesses will resemble the book market, where there are some 80,000 new publications every year and some 145,000 titles in a typical store.
The book culture is not only quantitatively but also morally superior to mass culture. At some $2.5 billion, the sales of religious books, for example, rival the trade-book business. As the Internet begins to offer vistas of choice comparable to the book business, it will blow away television and Hollywood and restore the distinction and moral quality of American culture and education.
The national prospect has, in short, never been better. But it thrives in defiance of the bohemian agenda of secular hedonism, relativism, multiculturalism, gender revolution, environmental panic, nihilist arts and letters, and expropriation of the productive world. Increasingly we see the establishment intellectuals lashing out in envy and disgust at the new sources of American leadership and wealth. Some, like Pat Buchanan and Richard Gephardt, will find allies in large companies threatened by the new forces of change. Others will reach out to Luddites and terrorists in the anti-capitalist backwaters of the third world. To the extent that these anti-entrepreneurial trends prevail in America, the rewards of the new technological regime will be harvested in other places, such as Israel and Asia. As a result of the victories of their ersatz champions, the American poor and middle classes will lose their access to the bonanzas created by American entrepreneurs.
I have been thinking, in pondering whether our national project is unraveling, of the questions raised by David Gelernter’s book, 1939: The Lost World of the Fair. The fair in question is the New York City World’s Fair of 1939, and despite the fact that we were still in a terrible ten-year Depression (though on the upswing out of it), and on the eve of a terrible war, it was a period, compared to our own, of optimism. People responded with enthusiasm to the exhibits projecting the better “world of tomorrow.”
Gelernter says that most of what was promised then has been achieved, and yet we are no happier for it. Better kitchens, cars, roads, plastics—the kind of thing featured in the fair exhibits—do not excite us much today. When we think of science and technology now, we think of atom bombs and AIDS, environmental and population crises rather than of all the good things science can do. How many of us really look forward to the new technology that at an ever-accelerating rate makes the computer we own out of date and useless, and will provide us with 500 television channels when 50 have not really done much to improve our lives?
The world of 1939 was remarkably homogeneous. Men wore hats and ties and jackets when they went to the World’s Fair. What we would now consider our multicultural diversity was in fact greater at that time than it is today (the closing-down of the great European migration was only fifteen years in the past, the black population of the Northern cities was already large), but it was not much in evidence as a phenomenon. Everyone, it seems, had decided to mute differences under the habiliments of a common culture. There was racial and ethnic prejudice and conflict: Jews and Irish Catholics and German Bundists clashed over the Spanish Civil War, Hitler, and Father Coughlin; and anti-Semitism in major private institutions was the norm, not to mention the far more significant prejudice and discrimination against blacks. But what is striking in retrospect is the cultural uniformity, taking “culture” in its anthropological sense. It is my recollection that in the working-class schools I attended in New York City, every child either went home for lunch or brought something from home. There were no lunchrooms in New York City schools, and no free lunches, even though we were all much poorer, by any economic measure, than poor students today. Mothers provided lunch.
This recollection will be attacked as nostalgic fantasy—many mothers worked even then, and were not at home to make lunch for schoolchildren. But the sober census statistics would show they were a small minority, compared to the situation today. Our multiculturalism played no role in our lives, and underneath the large differences a sociologist could have noted among groups, there was a prevailing similarity in family structure, in expectations for children, in accepted moral norms.
We are all aware of the great changes since, whether we celebrate or deplore them, and there is much both to celebrate and to deplore. But one consequence is instability and uncertainty. Neither parents nor teachers nor politicians (I am thinking of Mayor LaGuardia) have the authority they once did to impose conformity to established rules. I do not believe the huge changes we have seen in family structure, in the role of authority in the lives of children, in our sexual mores, owe much to government. Government has been responsive to these changes rather than provoking them, though its responsiveness does indeed expand their reach. Since government has not done much to create the major social trends that have transformed the family, I do not see that government can do much to moderate them.
Stability in the bedrock structure of society was matched for a few decades after World War II with a remarkable stability in the economic expectations of most Americans. The United States was the richest country in the world. It had the best and most accessible educational system. Jobs steadily became better—there was Social Security and unemployment insurance, and private pension plans, and medical insurance came with many jobs. Now we have been through two decades or so of shocks, and of falling incomes for the less skilled and educated and a rising uneasiness about the economic future among many people. I do not think Ronald Reagan had much to do with creating this, and I do not think Bill Clinton and his successor can do much about soothing it. New realities—such as international competition—mean there are simply fewer good blue-collar jobs, and more uneasiness about the future among many white-collar workers. The decline in employment in automobile plants and steel mills has been followed by declines in such icons of modernity as IBM, and we are finally coming to the point where even the stablest forms of employment—namely, in government—are threatened.
These are upsetting developments. Conservatives take pride in the great numbers of jobs we create compared to Europe, but these are often low-paying and insecure. Whatever the soul-destroying characteristics of unemployment insurance, the unemployed of Europe are better off. It is hard to see what we gain from using low-paid service workers, often immigrants, without benefits, as opposed to the pattern in Europe where such workers are well-protected by benefits. We now celebrate the victory of capitalism. It is certainly better than its alternatives, but we are brought up against its unsettling characteristics—steady destruction of what exists for something that is better, as judged by market return, uncertainty and insecurity among both capitalists and workers.
Conservatives tell us this provides great opportunities: the middle manager who loses his job can start a business, and often does. But while people like opportunity, they do not want it thrust upon them unwillingly. As someone who holds a tenured job in a university, I understand their position perfectly.
There are thus good reasons for uneasiness, even after the disintegration of the Soviet Union. But I do not think the overall stability of the American political system is challenged by these changes in society and the economy. For better or worse, it is the only system we know, and so we will muddle through with only modest changes here and there. Whatever our problems and disagreements, we are agreed on one important thing, namely, the public processes that govern how we go about dealing with them.
A nation cannot appropriate for itself with the simple strokes of its policy pens either a sense of direction or of purpose. Although as partisan and conservative as anyone except Pat Buchanan, I take note of the fact that our great divisions are argued along a shifting spectrum, that we sometimes mistake for principle either expediency or fashion, that over time we often trade positions with our opponents, as in the question of isolationism and that of free trade.
We would have to be exceptionally arrogant to imagine that history will not dull the distinctions that now in the heat of passion seem so sharp but that in the future will anesthetize new mutations of graduate student. If it was nearly impossible to tell George Bush from Bill Clinton in 1992, imagine the task 100 years hence, when the best guess as to the identity of Ross Perot might be that it was a kind of dental floss.
And yet if we are to judge the national prospect it must be in the broad perspective of the centuries, as seldom are national prospects shaped within the span of electoral cycles or even decades. The fate of Medicare is important, but it will not delineate the national prospect. The question of individual versus communal identity is essential, but it will not delineate the national prospect. Decisions affecting the level of taxation are crucial, education indispensable, and foreign policy urgent, and yet they will not determine the nation’s success or failure, its drive, cohesiveness, momentum, fortitude, or will.
Policy has but a minor role in forging the great and harmonious periods in the history of a people when it seems to win every race by many a length, when great things become commonplace, and all is caught up in a fume of energy and luck. This is what Ibn Khaldun, writing of the Islamic conquests, called ‘asabiya. It was present in full force during the periods of empire building and European overseas expansion. It was at the base and root of the Renaissance. It found expression in the age of Pericles, the American Revolution, the rebirth of Israel, and in Britain standing alone.
We had it. I remember it—enough at least never to forswear it. My father grew up and lived and died with it. He volunteered and served during World War II, considerably overage. He had been in London during the Blitz, when he did not go into shelters but stood on the rooftops; he was in North Africa, and in Asia; and he told me that even in the days of Pearl Harbor and the Kasserine Pass he had never had the slightest doubt that we would prevail.
To understand what it is that we lack and that we so recently lost, one must take into account the longstanding preeminence of the nation-state, and the mistaken belief that the drive to prosper, the will to defend, and the desire to expand, colonize, or migrate exist for the sake of the nation. For it is precisely the opposite. The nation came into being only because it was the most successful way of bringing these things about. The nation exists for them, not vice versa, and if they are over and done with the nation will wither for lack of purpose. What we perceive as exhaustion, imperial overstretch, the end of history, a dismal cyclical position, or simply inexplicable malaise is nothing of the sort, but merely the fact that we are no longer engaged by a particular set of fundamental challenges.
Before considering their individual relevance to the destiny of the United States, think for a moment of attempted substitutions. Jimmy Carter wanted to make turning down the thermostat the moral equivalent of war. He hadn’t a chance. The Vice President wants to save the earth. Saving the earth is not a fundamental social drive, nor will it ever be. The First Lady wants to homogenize incomes and outcomes. Homogenizing incomes and outcomes is not a fundamental social drive, nor will it ever be. Theology, art, and science are the crowning glories of human achievement, but they are not fundamental social drives and never will be.
What is fundamental? What deep ingrained purposes does a nation serve, what bass notes must it strike? Look back upon the first bands of men, or even to herds of animals on the Serengeti, and without doubt there you will find the drive to prosper, the will to defend, and the desire to roam. These are the imperatives of survival, which is what we are talking about, or should be talking about, when we consider the national prospect.
Materially, however, we have for most of this century been dealing only with matters of excess. If one looks at the material welfare in relative terms there will be no end to accumulation and adjustment, but if one is dealing with what we actually require, the play has long been over. As for defense, the landscape of modern-day America could be turned into that of postwar Germany in about an hour and a half, but we have chosen to live with the delusion that we are in the clear. Even during the cold war the threat to existence was so quick and abstract that the sense of struggle paled in comparison with that of World War II, when, in fact, we were in less danger. And long ago we reached the Pacific, only to bounce back within our closed borders like so many Ann Beatties, elevating self-reference, self-reflection, and self-concern to new heights, as rats in cages and people in countries do when confined with no access to the open.
A reprise of any one of these three fundamental drives would be enough to assure reignition of the fires now held in abeyance. A real war would offer the opportunity to pull together in victory or defiance, but let us hope that such an opportunity does not arise. A deep and destructive depression would offer the chance yet again for nation-building, but neither is that to be desired.
Of the three elemental activities only one can be constructively encouraged, and that is to reopen the frontier, to wander, to shift, to move, to explore, to fulfill a fundamental human need that we neglect only at our peril. As we are not and should not be an imperial nation, the only way left is up, ad astra, at once a basic imperative, an unmatched organizing principle, a point of aim, and in some ways the noblest of all enterprises.
Perhaps when we have finished churning to no avail in patterns of increasing self-absorption we will see that the best way to get back what we have lost is to look outward, to resume a journey that has lasted through all of history and pre-history, and that has come to a halt only in our time.
America has not become balkanized. It has become polarized. And not polarized racially, ethnically, economically, or sexually, but culturally and morally.
It was exactly 150 years ago that Disraeli wrote the memorable passage in Sybil describing England as “two nations”:
Two nations; between whom there is no intercourse and no sympathy; who are as ignorant of each other’s habits, thoughts, and feelings, as if they were dwellers in different zones, or inhabitants of different planets; who are formed by different breeding, are fed by a different food, are ordered by different manners, and are not governed by the same laws.
Disraeli’s two nations were “the rich and the poor.” Our two nations are not so pithily identified but they are quite as distinct. Because we have been beguiled by the race/class/gender trinity (race now used to include ethnicity), and because we are shy about talking about morality, we have not developed the vocabulary that will properly describe our two nations—two nations separated by a profound cultural and moral divide.
Even the term “multiculturalism,” as it is generally used, is a euphemism. It refers not to a variety of cultures, properly speaking, but to varieties of races, classes, and genders which have been given the honorific label of cultures. These categories do not, in fact, represent distinct cultures, for within each one of them there is the same cultural and moral divide.
It is interesting that few people have remarked upon the absence of religion in that race/class/gender trinity, as if religion is of no consequence in determining an individual’s or a group’s identity. And it is no accident that religion does not appear there, for religion would have imported precisely that cultural and moral dimension that is lacking in the trinity.
Jean Jaurès, the French socialist and member of the Chamber of Deputies, is reputed to have said: “There is more in common between two parliamentarians one of whom is a socialist, than between two socialists one of whom is a parliamentarian.” So, one might now say, there is more in common between two middle-class families one of which is black, than between two black families one of which is middle-class; or between two shopkeepers one of whom is an immigrant, than between two immigrants one of whom is a shopkeeper.
It is because their cultural and moral values, as much as their race, define their identity, and because these values often transcend their color or class, that many inner-city blacks send their children to Catholic schools—not because they are Catholic but because they want their children to have a more rigorous education in a more disciplined environment than what is available in the neighborhood public schools. Similarly, some secular Jews send their children to Jewish day schools, not to be inculcated in religious beliefs and practices they themselves do not observe, but to escape the kind of permissive education and atmosphere that prevails in most secular schools, private as well as public.
So too, religious Jews and religious Christians (including blacks and Hispanics) find themselves allied against those secularists (Jews and Christians) who are so fearful of religion that they would not only separate church and state but also denude public life of any spiritual content. And so too, people of all races and classes and of both sexes find common cause in the traditional moral and family values that have been subverted by what used to be called the counterculture but that is now the dominant culture.
Multiculturalism and immigration are frequently discussed under the title of America’s “identity crisis.” But it is less an identity crisis that America is experiencing than a moral and cultural crisis. And that crisis affects all groups alike, including immigrants. Many Mexican immigrants, Peter Skerry has shown, resist assimilation because they disapprove of American culture. If they keep their teenage daughters out of school, it is because they do not want to expose them to the moral and sexual laxity of the school environment. And if they encourage their sons to leave school and go to work, it is because they esteem work as a sign of manhood and familial responsibility. These values, to be sure, often do not survive into the next generation; Mexicans, like all Americans, are caught up in the dynamics of American society, which is itself experiencing a crisis of values.
It was long ago observed that the “Puritan ethic”—the ethic of work, responsibility, self-discipline—was as much a Jewish ethic as a Protestant one. We have now discovered that it is also a Catholic ethic, an Asian ethic, an African-American ethic, a Latin-American ethic, a working-class ethic.
Conversely, the “anti-Puritan ethic,” as we may call it, cuts across race, class, and sexual lines (even, unhappily, religious lines). Myron Magnet has pointed out the symbiotic relationship that exists between the “haves” and the “have-nots”—an upper class liberated from “bourgeois values” and promoting policies designed to liberate others from them, and an underclass that never had those values and is not likely to have them so long as those policies are in place.
Is there any hope for a resolution of this crisis? I am a congenital pessimist, but even I now see some cause for optimism, if only because more people have come to acknowledge that there is such a cultural crisis, that the problem is not “the economy, stupid.” I interpret the conservative resurgence as a moral as well as a political phenomenon. And although political reforms cannot solve all moral problems, they can ameliorate some of them; at the very least, they can have the salutary effect of repealing those social policies that exacerbate the moral problems.
I am not optimistic enough to believe that our two nations will ever be completely integrated; even Disraeli’s were not. But the relative sizes of his two nations have vastly changed and the gap between them greatly decreased. It is some much smaller changes of that kind that I think we may be beginning to see today. It will be no mean achievement if we can reinforce the moral fortitude of one of our nations and arrest the moral decline of the other.
Whatever the state of the national project, one of the nation’s great projects is the Interstate Highway System. Indeed, it is the greatest feat of construction in the history of the world, dwarfing any of its imperial predecessors. Rather low-tech in and of itself, it is—to use the lingo—a platform for more sophisticated conveyances.
Consider two empty-nesters as they pass the mileposts. We took along very little real money, knowing that cash to supplement the credit cards would be readily available at teller machines. Nor did we need plans or reservations, for accommodations are always everywhere. With the cellular telephone, we spoke with our friends and tended to busywork. Had we brought along the laptop, we could even have read the on-line press. Now and then, we would pass a larger vehicle with a satellite dish mounted on its roof. Television? Datalink? Asset management in real time?
Too bad we allowed a book on tape to invade this carefree idyll. A cultivated English voice intoned the beginning of Volume I of Edward Gibbon’s The Decline and Fall of the Roman Empire. Why were we listening? Well, everyone we know is convinced that the world is going to hell. I myself had taken to sending to friends paperback copies of Tacitus’s Annals, with its acerbic account of Roman decadence in the age of Nero, and to some others Procopius’s Secret History, a 6th-century account of the Byzantine court of Justinian and Theodora. This lesser-known classic, I assure recipients, is the best book that has ever been written about the Clinton administration, indeed, the best book that will ever be written about the Clinton administration. So, as well-prepared as we were, we were still taken aback by Gibbon’s narrative (need I mention that Volume I first appeared in 1776?) and its obvious echoes in our own time. Altogether too grim to contemplate and, after far fewer than the full complement of cassettes, simple radio would have to do.
Gibbon, we were taught, was an 18th-century rationalist who blamed the rise of Christianity for the collapse of Rome. We must hope that he was wrong about that, for Christianity was everywhere present on the broadcast band as we scanned the dial. Real Bible study, too. It did not raise our morale to have Gibbon replaced by a preacher knowledgeable in Scripture who, by reference to the Book of Daniel and to various Judges and Prophets, instructed us in the fate of great empires—Babylonia, Assyria, others—all of them done in by moral rot. Indeed, ruin is a universal expectation, the end of the great tales in all the great cultures.
As in other things, America is the grand exception, for we Americans are not trained into pessimism, nor do we yet need to be. Among the great projects and empires and causes and crusades of the 19th and 20th centuries, ours alone has come through. And there is nothing American which threatens us now, save the detritus of a century dominated by an encounter with weird ideologies and outlooks of European origin. Still, we were damaged by that struggle, even as we were winning a major part of it—damaged enough to wonder whether liberalism will, in the end, do to America what Communism did to Russia.
It was the recent conventional wisdom that the Right would lose out after the collapse of the Soviet Union because it would no longer have anti-Communism to feed on. But, as it turns out, it was statist liberalism which had prospered because of the Communist threat—much as it pooh-poohed it in recent decades—and which is now crumbling without it. We were persuaded we needed a welfare state to demonstrate to the poor of the world that it was not only Communists who cared about them, or federally enforced race-rooted statutes to assure the emerging nations of Africa that we were not bigots, or federal aid to local schools to show that our teenagers could also do physics. (Indeed, the foreign threat seems to be statist liberalism’s final redoubt, for it now argues that we need all these measures and more in order to defend ourselves against Japanese, Koreans, Mexicans, and Malaysians.) At the same time, we were urged to open ourselves to the intellectual and artistic fads of the day, the better to convince cultural elites around the world that we were a sophisticated people, not just rich provincials. How else did America itself become, for a while, anti-Americanism’s last best hope on earth?
Whatever our own current difficulties in shaking off the effects of our century, we must take heart from others far more seriously damaged than we have been. Survivors of Communism in particular frequently express an aspiration to become “normal” people in a “normal” country. For all the manifestations of the non-normal in the United States, it remains to my mind a normal country, filled with millions and millions of normal people who will in their own good time and in their own good way, but with their customary fits and starts, end the arrogant sway of their self-proclaimed betters. Naturally, it will be preferable for this to happen sooner rather than later, for much more rubble will pile up in the interim. But the main issue, once in doubt, has now been resolved; and, for our pains, we will reward ourselves by leading the world into its next great age.
The list of America’s maladies offered as the subject of this symposium could with little change have been made by a conservative at any time in this century, by a Henry Adams at the close of the last, by a Massachusetts Whig in the age of Jackson, or by an Essex Federalist in the age of Jefferson. The conservative Cassandras have been right every time, moreover, for at every stage in the development of American liberal capitalism something important has indeed been lost. And each new Great Awakening, each broad, popular effort to reach back for the best of the past has only slowed momentarily the evolution or, if one prefers, the decline of American culture and morality. The Great Awakening of 1994 will probably be no exception.
Still, few conservatives today would insist that the best of America died in 1800, 1828, or even 1893. One could plausibly point to the Progressive era and World War I as the time when all the seeds of today’s conservative list of horrors were sown, when faith in science, social engineering, and the healing power of the state took root, when the historical relativism championed by Charles Beard and Carl Becker seized hold of the American academy, when thinking about America in terms of distinct classes and interests rather than individuals became pervasive. But few conservatives today suggest that America’s cultural and moral decline began in, say, 1912.
A conservative in the 1930’s, witnessing the popularity of Huey Long and the Townsend Plan, the rapid growth of unions, the increasing prominence of socialist and Communist intellectuals, could well have decided that the end was nigh. Yet somehow his descendants today can look back to the 1940’s, as the editors suggest, and still find an American nation “confident in its democratic purposes and serene in the possession of a common culture.”
And since most conservatives would agree that this confidence and serenity lasted until the early 1960’s, it is really not so long ago that the national project was fairly well intact. Perhaps we ought to wait a bit longer before deciding that it has unraveled.
This is not an argument against waging the culture wars. Vital issues are at stake, and efforts to arrest and, if possible, reverse the trends toward individual irresponsibility, toward the evisceration of standards, and toward the dissolution of a common culture must be made. But in fighting these worthy battles, it is important not to be so consumed by them that we are blinded to other issues of at least equal importance.
There has been a disturbing tendency among conservatives in recent years to connect the sorry state of America’s culture with its role, or potential role, in world affairs. Some of our finest thinkers believe it is of strategic significance that America’s teenagers watch inane television programs; others argue that even though this nation has carried out five major military actions in the last five years, a shrinking birth rate has unfitted it to fight to preserve its interests. America’s moral and spiritual decline, many conservatives seem to believe today, virtually requires a concomitant geopolitical decline.
At a time when the United States enjoys a military, economic, political, and cultural preeminence in the world unparalleled in human history, this pessimism would be merely odd were it not so dangerous. But for many, the focus on the culture wars has become a conservative version of George McGovern’s old slogan, “Come Home, America”; in a rather benign world, as Francis Fukuyama put it not so long ago in these pages, the United States can afford to “concentrate on domestic problems” (“Against the New Pessimism,” February 1994). There is a widespread conviction that our nation has, as it were, only one pair of hands. We need a time-out in the eternal global competition, we are told, while we fix our soul.
Unfortunately, the world will not hold our place for us while we tend to our problems. We will be amazed at how quickly today’s benign international order can turn malign if we cease to play the leading role that our great power demands. There is, indeed, a direct connection between the state of American culture and our capacity to conduct a vigorous and effective foreign policy. But we cannot afford to neglect the latter in the interest of the former.
Happily, however, a nation can attend to more than one problem at a time. An active foreign policy need not be an obstacle to the domestic well-being of the country. Indeed, if history is any guide, the opposite is more likely to be true. When was America ever more internationalist than in the 1940’s and 50’s? When in this century was it ever less active than in the 1930’s?
The cultural problems identified by conservatives today are serious. But an inordinate pessimism about the state of the nation can be as debilitating as it is ahistorical. Conservatives need to do their best to address America’s ills, but fin-de-siècle thinking will be inappropriate when the new siècle arrives.
I should begin by saying that it seems to me premature, not to say hyperbolic, to speak of our national project unraveling. Cultural criticism is not prophecy, for one thing, and we who have lived through the past 50 years, or some portion of them, are still too much in medias res to reckon with much confidence the ultimate outcome of the battles now raging in our society. Besides, who knows what unexpected resiliencies we Americans may yet discover in our midst as the future unfolds? It will not do, certainly, to acquiesce in pessimism.
Alas, that caveat exhausts the upbeat portion of my reflections. For I also believe that no one who cares about the spiritual health of this country can regard our present situation with anything but dismay. The “melancholy, long, withdrawing roar” that Matthew Arnold discerned as the sea of faith ebbed into darkness around him has become a deafening thunder. The editors mention several important evils; many others could be adduced, including the decline in educational standards and liberal learning, the attrition of manners and civility, the ubiquity of a shockingly degraded and corrupting popular culture: everyone will have his own inventory of horrors.
What opened this Pandora’s Box? How have we come to this troubling pass? In my view, the essential problem is not pragmatic but moral. Among other things, this means that changes in public policy alone will not fix things: a change of heart is also needed. The question is, where do we find the incentive for the necessary change of heart? There is no single or simple answer to that question. We are living with a crisis of values that amounts in the end to a crisis of faith.
There are many sides to this crisis, and a long history. More than a century ago, Gustave Flaubert wrote in a letter that he felt
a wave of relentless barbarism, rising from below the ground. . . . Never have affairs of the mind counted for less. Never have hatred for anything that is great, contempt for all that is beautiful, abhorrence for literature been so manifest.
What would Flaubert have to say if he were to visit a class in cultural studies at one of our premier universities today? What would he think of MTV? Of Calvin Klein?
The problem is not just around us: it is potentially within us as well. As Evelyn Waugh noted,
barbarism is never finally defeated; given propitious circumstances, men and women who seem quite orderly will commit every conceivable atrocity. The danger does not come merely from habitual hooligans; we are all potential recruits for anarchy. Unremitting effort is needed to keep men living together at peace; there is only a margin of energy left over for experiment, however beneficent. Once the prisons of the mind have been opened, the orgy is on.
In one sense, the barbarism that Flaubert and Waugh descried is a perennial threat: it is part of the human condition. What is new is the celebration of barbarism as a form of welcome liberation. We live at a moment when philosophers routinely espouse the nihilistic absurdities of deconstruction and eagerly proclaim the “end of man,” when all manner of obscenity is aired on television and championed by those charged with preserving our cultural and intellectual heritage. In the Ethics, Aristotle observed that nobody but a blockhead believes that our conduct does not form our character. We are as we act, and we have been acting very badly indeed.
What has the vaunted conservative resurgence done to address these problems? So far, at least, the answer is almost nothing. Indeed, much as the participants in this symposium might applaud recent Republican electoral victories, the truth is that conservative political victories have hardly made a dent in the onslaught of barbarism. Part of the reason is that, to a large extent, conservatives have ceded authority in cultural and intellectual matters to liberals, who in turn have capitulated on every issue to the most radical elements. If the conservative movement in this country is to make any fundamental and long-lasting improvements in society—if it is to help precipitate that change of heart I spoke about—then conservatives must seek not only to win elections but also to win on the battleground of culture. Among other things, this means overcoming the contempt for culture that has long been an ingredient in many versions of American conservatism.
But culture is not, I think, the whole answer. In one of his essays on humanism, T. S. Eliot observed that when we “boil down Horace, the Elgin Marbles, St. Francis, and Goethe,” the result will be “pretty thin soup.” “Culture,” he concluded, “is not enough, even though nothing is enough without culture.” What else is there? Religion, or at least some acknowledgment that the ultimate source of our moral vocation transcends our mundane interventions. Eliot put it neatly:
Either everything in man can be traced as a development from below, or something must come from above. There is no avoiding that dilemma: you must be either a naturalist or a supernaturalist.
It says a lot that Eliot’s articulation of this core belief of traditional conservatism should be deeply controverted today, even by many conservatives. The depth of that controversy is perhaps an index of our confusion. Dostoevsky once claimed that if God does not exist then everything is permitted. Considerable ingenuity has gone into proving Dostoevsky wrong. To date, though, the record would seem to support him.
Jeane J. Kirkpatrick
It is true that some observers look at American streets after dark, watch our television shows, read our test scores, listen to gangsta rap, read our high-school history textbooks, and conclude that we are not a viable society. It is true, too, that eventually we will perish from the earth as every other civilization has perished. But not yet—not until we have explored further the capacities of free people in this age of receding limits and expanding possibility.
America has demonstrated unusual vitality, ingenuity, strength, and success in this century. For 50 years the United States served as leader of the party of freedom—defending free societies and democratic governments—at great cost. Our efforts were not always appreciated and our strengths were not always understood.
To many sophisticated persons it was American power that seemed to be declining in the years before the Soviet Union self-destructed. Among America’s allies many were more impressed by growing U.S. economic weaknesses and moral confusion than by deepening Soviet crisis.
A 1987 multination poll confirmed a continuing drift away from the United States among its allies: the British political scientist Ivor Crewe reported that the British had come to see the United States as a greater threat to world peace than the Soviet Union by a margin of 37 to 33 percent. The number of British who believed their government should work less closely with the United States increased from 17 percent in 1974 to 43 percent in 1987. In West Germany only 31 percent preferred closer relations with the United States as compared to 58 percent who advocated a policy of “equidistance” between the U.S. and the USSR.
By 1987 it had become commonplace in Western capitals to hear American deficiencies and decline contrasted with Mikhail Gorbachev’s “bold leadership.” Our allies thought perhaps American economic power and authority had eroded so far that we were no longer able to provide leadership in international affairs. The great popularity of Paul Kennedy’s The Rise and Fall of the Great Powers (1987) both reflected and strengthened the idea that America had lost its momentum and entered on an irrevocable decline—even though available reports showed that the U.S. share of world GNP was about the same as in 1970 and the U.S. deficit was about the same as Japan’s; our share of world GNP and our labor productivity had in fact risen sharply in the five years before Kennedy’s pronouncement of decline. Nonetheless, he thought he saw some intangible, immeasurable, nearly indescribable evidence that the United States was in the early phases of a mortal illness.
About the time Kennedy and other declinists seemed to many to be winning the argument, the Soviet Union entered the final, critical stage of its self-destruction, leaving us to consider why allies who had profited greatly from steadfast American support should have had so poor an opinion of us, and to wonder why a significant portion of the American elite turned on its own society an unrelenting barrage of disapproval and disdain.
Regimes may disappear because they suffer a crushing defeat in war and prolonged military occupation, as did the Nazi and fascist regimes after World War II. This is not likely to happen to us. As we demonstrated again in Desert Storm, we are the greatest military power in the world, and are safe as long as we do not dissipate our strength in idle operations.
Regimes can also disappear because their ruling groups lose their commitment to the values and institutions that sustain them. The Soviet elite lost confidence in Marxist-Leninist predictions and infallibility, then confidence in the rectitude of their own monopoly of power, and then they abandoned power.
Plato describes this model of regime change in the Republic to explain how even an ideal state eventually turns into a very different polity—through marginal, unintended changes in the beliefs and goals of its ruling elite. Regimes change when the ruling elite changes, when the political class no longer has the loyalty, affection, and legitimacy required to sustain the system.
Not even a truly closed society can forever resist this unintended transformation of successor generations.
That could happen to us. Many think it is already under way. “We” survive only as we preserve our identity—the distinctive identifications, values, practices, attitudes, character—that define us. Those depend on our shared history, experience, attitudes—on our shared sense of being part of the same valued country and civilization.
We could lose that.
It is true that the boundaries that differentiate us are being eroded from within through multicultural doctrines and practices; from without by waves of illegal immigrants whose languages and cultures are protected by multicultural practices. It is true that social disintegration, cultural disintegration, crime, and economic decline have increased.
It is true that trust in government and other major institutions, and respect for authority have declined. But we are aware of these problems, and our national habit of self-criticism will keep us working on them.
I believe we will survive this perilous danger. America is the embodiment of our era. Americans are the modern, free, pluralistic, pragmatic, successful people. Born out of change, we thrive on change. We are a distinct people who have created ourselves from new biological and cultural mixes of great diversity. When challenged, we fight back in diverse ways. As voters we will go on throwing the rascals out until we find elected officials who give us laws and leadership more consistent with our traditions, values, and hopes for a better future.
America will not end with the century. Americans are not ready to abandon the field.
Fifty years after the end of World War II, it becomes harder and harder to believe that we shall ever again be what we once were. The odds are certainly against it. We now live in a culture that is deeply corrupted—a liberal culture that in the name of unrestricted freedom has brought us to a condition of moral insensibility. What were hitherto the malign beliefs and sinister fantasies of a radical fringe of incendiaries and immoralists are now the stuff of advertising campaigns, classroom instruction, courtroom strategies, and talk-show shamelessness. The concept of respectability, which for so long was the butt of enlightened ridicule and contempt, has at last been eradicated. Men can now be elected to high public office without meeting its standard, and women can become “role models” while flouting even the appearance of its demands. The children are brought up accordingly, and suffer the consequences.
Given this condition of moral insensibility, is it any wonder that our institutions should prove to be increasingly unstable? The institutions of our society do not derive their authority—their moral authority, as we say—solely from the laws that support their existence. Their authority derives from convention and consensus—from tradition, if you will—which it takes much time and a profound collective effort to create and which requires an unflagging moral commitment to sustain. Once that commitment has been shattered, as it has been in our time, all institutions pass into a kind of free fall—which is where, alas, most of ours now find themselves.
The schools, the courts, the press, the citadels of business and government, the culture itself—all now fail to command respect even as more and more demands are made upon them to assist in transforming the conditions of life. Meanwhile, into every reach of the moral void left by these failed institutions a debased popular culture—now more powerful than ever before by virtue of corporate mega-mergers and accelerating technological innovation—insinuates itself as the principal arbiter of what is permissible and desirable. Before the juggernaut of this popular culture, which has now assumed and corrupted so many of the prerogatives of education and moral instruction, the only escape is to opt out of what our mainstream society has become.
This is what a small minority of people now do privately to whatever extent their means, circumstances, and personal pieties allow. The home-schooling movement is one example of this tendency, but it is only available, of course, to parents capable of assuming its daunting economic, intellectual, and logistical burdens. It is a sign of the times that the home-schooling movement, which was once largely the preserve of “progressive” parents determined to safeguard their children against the corrupting influence of the bourgeois values believed to be propagated by the public schools, has become an option for conservative parents determined to protect their children from the abysmal standards and lethal social agenda of these same schools.
An even larger and more significant minority is to be found in the Christian evangelical movement, which not only opts out of much that mainstream bourgeois society has become but actively opposes the advancement of its agenda. This active and avowed opposition is the reason why the evangelicals inspire such extreme paranoia on the cultural Left, for the latter—itself the product of an earlier radical movement to opt out of and then reshape bourgeois society—knows very well how much can be accomplished by such organized groups of highly motivated true believers when the conditions are propitious. Which is to say, when mainstream society has lost faith in itself, and opens itself to radical alternatives.
There is no doubt in my mind that the conservative resurgence that began with the election of Ronald Reagan in 1980 and was so spectacularly advanced by Republican victories in November 1994 was a direct consequence of the social malaise caused by this debasement of cultural life and the moral disarray that has followed in its wake. The 1994 election was never really about taxes, however onerous our taxes may be, or about the deficit, however problematic that undoubtedly is. The election was about values—about a perceived connection between the profligacy of government and the decline in the character of society and the quality of life. In this respect, even the deficit may be seen to have symbolized for many voters deficiencies closer to home.
Republican conservatives won, I believe, because they succeeded in articulating that perception more forcefully than had ever been done before, and they were significantly aided in this endeavor by the appalling character of the Clinton administration during its first two years in office. Yet the degree to which the 1994 Republican victory represented a vote of despair has not been sufficiently appreciated. And that despair has not been relieved or reversed by anything that has happened in the aftermath of the Republican victory. The 1994 election may have rejuvenated the ranks of the Republican party, but it has not altered the mood of the country, which remains fractious and leaderless. And the failure—so far, anyway—of the Republicans to produce a resolute conservative standard-bearer around whom all factions of the party can join in a common effort is not a happy augury.
Meanwhile, this fragile conservative resurgence remains hostage to a cultural establishment—in the media, the entertainment industry, the universities, and the arts—that is ever-more hostile to it. On this issue it must be said that conservatives have not, for the most part, proved themselves very smart or even well-informed. With certain notable exceptions, they have largely ignored the entire realm of high culture and what has befallen it at the hands of the cultural Left. As for the high culture of modernism, they seem to have written it off as an elitist aberration. The unexamined assumption seems to be that the achievements of modernity are OK for social, scientific, and business purposes but that the modernist art and literature that has the most to tell us about the fate of the human spirit in the age of modernity can be dispensed with.
Let’s face it: there is in the conservative movement an element of smug, unreconstructed philistinism that renders it incapable of engaging the deepest issues of cultural life. It is all well and good for Senator Dole and others to attack the immoralism of our pop culture; such attacks are needed, and the more the better. But in the long term the solution to the problems of cultural life does not lie in substituting conservative schlock for the Left-liberal schlock we now have in such abundance. For it is the prevalence and prestige of schlock itself that now retards the moral and intellectual development of our society, condemning it to a permanent state of adolescence and immaturity.
It remains to be seen what difference, if any, the conservative resurgence will make in dealing with these and the other problems that now beset us. The only certainty is that the culture wars, as they have come to be called, will be with us long after the next election no matter which party proves to be victorious.
I do not for a moment believe that the United States is headed toward balkanization or breakdown, despite all the twaddle about multiculturalism and diversity—and despite, too, all the government money that now actively sponsors such ideas. The key group is the Hispanics, whose numbers are now just about equal to those of the blacks. They are assimilating into the American mainstream, though more slowly, for all sorts of reasons, than immigrant groups in the past. Most Latin American immigrants have had little connection with Latin American culture, about which they know nothing—in this respect, they resemble the Italian immigrants of yesteryear. In any case, Latin American culture, in both its literary and religious traditions, is part and parcel of Western civilization. It is interesting to note that “Hispanic studies” in our universities almost never require that the students read any books in Spanish. Those courses are political, not cultural. The self-appointed “Chicano” leaders are blowing in the wind, and their financial support is almost entirely governmental. Such support is a major obstacle to the process of assimilation.
About the blacks, I have to admit, I have no certain convictions—except that the notion that American blacks constitute some kind of multicultural entity is obviously absurd. Black writers and black musicians are, for better or worse, as American as apple pie. On the other hand, what can only be called black racism does seem to have a powerful grip on the black popular imagination. One can hope that the emerging black middle class will gradually mollify this poisonous passion. This racism, together with multicultural fantasies, offers no future for American blacks. Sooner or later, common sense should prevail, though not, I fear, before considerable damage has been done.
As for the fastest growing minority in the United States, the Asians, they are succeeding economically while simultaneously disappearing as a racial-ethnic group—and at a rate unprecedented in American history. American-born Asians intermarry with those of European stock at a 30-percent rate, and we no longer think of such marriages as in any sense “mixed.” Most then become—many already are—Christians. Our best universities are concerned that Asians may, without restrictive (if informal) quotas, become a majority of the student population. Nor is there anything multicultural about this group. Very few Chinese, Japanese, Korean, Vietnamese, or Thai youngsters can read—or wish to read—a newspaper in their parents’ or grandparents’ native language. The melting pot works here with a quite stunning rapidity. We are on the verge of seeing these Asians, and their offspring, as just another “European” ethnic group.
What I do find most perplexing and bothersome about the American condition is neither racial nor multicultural but generational. The observable trends among our young are so complex, so much at odds with one another, that they obscure any vision of the American future.
As best I can determine, there are three main cultural trends among the young: countercultural, libertarian—in radical, liberal, and conservative versions, depending on attitudes toward the free market—and what might be called traditional conservative. But these currents overlap, merge sometimes, become distinct again, and originate minor rivulets in the most bewildering way. In the Jewish community, for example, we see this process visibly at work. What are we to make of a gay synagogue that is, in many respects, more observant and traditional than most Reform synagogues? I do not know what to make of it. I am always pleased to see younger Jews become more observant; on the other hand, I wish they were not gay. Similarly, I am delighted at the presence of Jewish feminists who wish to participate, on a more equal basis, in rituals and observances, and who study Hebrew and the Talmud. These feminists are breathing new life and vigor into what otherwise threatens to become a moribund religious community. But I simply do not know how to cope with a learned (by my standards, anyway), observant (again by my standards), lesbian rabbi. It is all very confusing.
In the same way, I do not know how to feel about young men and women who go off to study in Israeli yeshivas. I am simultaneously both approving and worried. I do not wish to see Jews cut themselves off from our Christian-secular Western civilization—my civilization—and become little more than a parochial sect. But I am pleased that they are involving themselves in a rediscovery of traditional Judaism. My own religious leanings are toward some version of modern Orthodoxy. But can such a movement resist all the centrifugal forces that are pulling the Jewish community apart?
And not only the Jewish community, but all the Christian denominations as well. I have Catholic friends who simply do not know what to think about their “charismatic” brethren, growing in numbers. And the secularists, as well, are feeling the stresses and the strains. Our culture, both “high” and “low,” is still overwhelmingly secular, to the point of frequently verging on a hedonistic paganism. But while the young are thrilled by this culture—young people are hedonists and pagans by nature—their parents, themselves raised in that culture, are becoming more dubious about it. As for the powerful and still emerging movements of religious evangelicals, one does wonder about their children. Do they never go to rock concerts? Are their sexual habits so wildly at variance with those of their secular contemporaries? Do they turn their eyes away from the near omnipresence of soft porn on all those youthful television programs? In the absence of any credible studies, I cannot even begin to guess.
I am persuaded that a serious religious revival is under way in this country. But just how this revival will make out when it confronts the hedonism of our popular culture and the libertarianism of so many of even our politically conservative young people remains to be seen. What we call the culture war is still only in the skirmishing stage. Anything like a Kulturkampf is not yet visible. For my part, I would welcome it, if uneasily. In any war, one’s allies can be as troublesome as one’s enemies.
Five years ago, when I moved to the Midwest, about an hour’s drive from the place immortalized as Middletown by the sociologists Robert and Helen Lynd, one of the first community experiences I had was visiting the schools my children attended. In the classrooms were things I had not seen in previous schools in New York City and Washington, D.C.: American flags. Moreover, with the build-up for the Gulf war under way, my older son’s homeroom was busily collecting various objects to send to GI’s in the desert. I joked to my wife that, were we still in the nation’s capital and were such a project undertaken at all, the children, in order to be fair, would undoubtedly have had to send presents to the Iraqi troops as well.
As I have since come to learn, the national prospect really does look different from the Midwest. Once known for its homogeneity (even a degree of intolerance), Indianapolis now manages to house, peaceably enough, five synagogues, a mosque, enough Hispanic businesses to form their own Chamber of Commerce, an annual business-oriented convention called Black Expo, and even a Japanese grocery store for a growing Asian population. Its mayor wins national accolades as a civic innovator, able to make urban government work again.
On Sunday, you still cannot buy a bottle of liquor or a car in Indianapolis. Each year there is a film festival that aims to honor “heartland” values in the movies. On V-J Day, thanks to the American Legion, whose national headquarters are here, we actually had a big parade, complete with crew members from the Enola Gay. Had he decided to run for Governor, Dan Quayle would have been handily elected.
Yet, even though this is what the talk-show hosts like to call “flyover country,” the cultural tornadoes of recent decades also touched down here. Last year’s big to-do at the state university was not about the antics of the basketball coach but rather about plans to use taxpayer money for a gay student center. The Indianapolis public-school system, which otherwise has little to brag about, nonetheless boasts one of the country’s foremost “Afrocentric” educators. Murders are up, as are children born out of wedlock. The city’s elites do not have the self-confidence (and influence) they used to, and the local newspaper runs lengthy articles and editorials about racial divisions in the community.
Thus, the view from the outskirts of Middletown is somewhat complicated. There is much to be optimistic about, as traditional values still infuse much of civic life. But they coexist with other outlooks and beliefs of more recent vintage. Here in Indianapolis, one sees a conservative resurgence (or rather, persistence, since the city never really lost its moorings), but not much receding of the floodtides of doubt and self-criticism that reached the Midwest as well during the decades of the 60’s and 70’s.
One reason for this is that places like Indianapolis cannot isolate themselves from the major streams of American culture. The small-town newspaper of old now carries reports—and the attitudes that come with them—from the New York Times and similar publications. Hollywood’s latest creations open here usually at the same time they begin running on Broadway. More high-school graduates now go off to college, returning with the current academic fads, and more college-trained professionals come here to work, carrying similar baggage. Thanks to the genius of mall builders and merchandisers, the nation’s finest shopping is now minutes away, and so, too, ads, slogans, and fashions to shape a city’s style.
That embracing, albeit sometimes suffocating, ethos that once characterized the towns and cities in the nation’s heartland has become so porous that practically any ideas and values can get in and stay in. Most have done so.
This is partly because of a second feature of life in the Midwest: the conservatism of conservative people. Apart from an occasional battle over a textbook or religious display, the traditionalists who live in Indianapolis are more inclined to be tolerant than troublesome. They have accepted the idea that it is wrong to try to “impose” their values on others. In any case, spending time on political or social controversies seems wasteful.
Moreover, a deeply held sense of loyalty makes them more forgiving of the faults of American institutions than they ought to be. Not long ago, for example, a group leading an outburst against the excesses of “outcomes-based education” stopped short of supporting ideas, such as vouchers, that might really make a difference in the curriculum. Instead, they professed their unwavering commitment to the public schools—even though, thanks to ill-conceived public policies, institutions like the schools are often major obstacles to a healthier civic culture.
Thus, the public schools in Indianapolis are still under a busing order, established over twenty years ago, that made racial balance, not educational achievement, the focus of community concern. Old Supreme Court decisions are routinely (and often successfully) invoked by those opposing religious displays, such as a menorah, on public property. Despite a huge demand for low-skilled workers, the city’s welfare agencies operate in such a way that relatively few of the poor are encouraged to become self-supporting. In short, although Indiana voters are electing moderate Democrats and conservative Republicans these days, they continue to be governed by the ghosts of politicians long past.
In their studies of Middletown and similar places, sociologists observed that the wholesome civic culture which revealed itself publicly often disguised less virtuous (if not deviant) behavior taking place privately. Today, the opposite may be more nearly the case. Along with continued fidelity to the historic symbols of American life, much now occurs in public that would have struck Midwesterners of 50 years ago as out of place, if not bizarre. Yet in their private lives Midwesterners are probably at least as traditional as they have always been.
For those who believe that the American prospect will ultimately be determined by how families raise their children, treat their neighbors, and go to worship, this may not be bad news. But as long as tension persists within many elements of the public creed, even Hoosiers will be uneasy about owning up to how they actually live.
The questions posed by the editors raise issues sufficiently broad to bring to mind, for this respondent at least, the famous opening eight lines of Yeats’s “The Second Coming,” in which the poet describes a crisis so profound as to encompass culture, politics, and the totality of the human spirit. Those lines have achieved a sort of comprehensive resonance. Each phrase stands as such an acute rendering of the condition as to have become a cliché: “Things fall apart; the center cannot hold;/ . . . The best lack all conviction, while the worst are full of passionate intensity,” etc.
The remaining fourteen lines of the poem consist of an imagining of the result of this condition. They begin with the speaker’s sentiment that “Surely some revelation is at hand.” The pursuit of this line of thought then proceeds almost breathlessly—“Surely the Second Coming is at hand./ The Second Coming!”—and our speaker is off to conjure in his mind that “rough beast” slouching toward Bethlehem.
But, of course, Yeats gives us no actual Second Coming at the end of “The Second Coming.” And that absence, it has long struck me, is the source of the poem’s true greatness. What we have instead is a perfect portrait of the mind that harbors the despairing thoughts of the poem itself. Things cannot go on like this; something must happen. Conditions are intolerable. There must be some resolution—for good or (more likely) for ill. This yearning for a resolution becomes a demand (“Surely . . . / Surely . . . !”) that must be fulfilled—so much so that the imagination will do the job if no actual beast is available. And none is.
You could probably say that another aspect of the genius of the Yeats poem is that its first eight lines, though vivid, are generic, allowing us to plug in the details of our own particular social, political, and cultural crises as we see fit. Thus the relevance here. Given the ills listed in COMMENTARY’s statement—from “multiculturalism and/or racial polarization” to “the dissolution of shared moral and religious values”—well, here we go again, turning and turning in the widening gyre.
I do not mean to belittle our problems. On the contrary: they are grave. What I mean to belittle is the notion that some great revelation is at hand.
For the wildest optimists among us, the imagined revelation is total, inevitable political victory for the forces of good. Perhaps we shall cybersurf a third wave to peace and prosperity with Newt Gingrich and the Tofflers. Perhaps we shall see the emergence of a third party on the ashes of the other two—a party that will either (a) recapture the broad middle of American politics from the extremes of Left and Right or (b) reject the globaloney of the new world order, put America first, and take back the country from those who have hijacked it from its rightful owners. Perhaps we shall make a revolution against the overclass and all that it represents by way of unjustified privilege, thereby restoring a fundamental sense of social justice as we create a new American nationalism. Perhaps we shall be transported by a religious reawakening that will sweep away the detritus of a failed and decadent culture and replace it with the traditional values that made this country great—or perhaps replace it not with values but with moral truth.
For the pessimists, the clouds have long been gathering and growing darker; soon they will burst. Perhaps we shall see our political system break down and splinter in response to the irreconcilably contradictory demands of an angry electorate. Perhaps the rich shall flee to their gilded suburban prisons as the ghettos bubble over with the wrath of the have-nots. Perhaps we shall see a metastasizing state bent on intruding more and more into the lives of more and more people, ordering those lives and regulating them according to the whims of those in power. Perhaps we shall finally throw off the delicate balance between man and nature, with cataclysmic global consequences. Perhaps we shall all inhale an airborne strain of a Level IV pathogen.
But I doubt it—all of it.
We are indeed in the midst of a conservative resurgence, and it is indeed, at least in part, a response to the undeniable social problems resulting from perhaps 60 years of liberal ideological domination. Properly nurtured, this resurgence ought to continue a while. (Helping nurture it is my day job.) But a conservative resurgence is something that will take place about an inch at a time, and only then with the full-time efforts of those whose business it is. This is the reality of our social, cultural, and political spheres.
But how can we wait? With illegitimacy rates and welfare dependency rising, with test scores declining and the education system failing, with a generation of unsocialized males growing old enough to wreak havoc on a society throwing up its hands in despair, with both high and popular culture engorged on depravity, with American and Western achievement under assault from within, with families buckling under the pressure merely of getting by—with all of that, must not something be at hand?
It may be. There are, after all, those who like to read Yeats’s poem as a prefiguring of the horrors of the 20th century—horror on a mass-age, industrial-age, atomic-age scale. I think that this historicism trivializes the poem. The keening for some transfigurative moment, for some real-world Götterdämmerung about which we can say, “Well, that settles that,” is as much a part of the human soul now as it has always been.
And let us not overlook one of its most common and dangerous consequences—that we will be keening and yearning instead of working for such good as we can, when we can.
Seymour Martin Lipset
Is the nation unraveling? The evidence is otherwise. The melting pot is melting as never before. The Jewish community is worried about an intermarriage rate of 50 percent or higher, and this is typical, not unique. Most Catholics now marry non-Catholics. Close to four-fifths of Italians and Irish marry outside their ethnic heritage. The large majority of Japanese-Americans do the same, as do a substantial minority of American-born Latinos (40 percent). The only distinguishable ethno-racial group whose intermarriage rate is in the single digits is the African-American; but black-white marriages have increased fourfold in the past twenty years. There are now 1.2 million mixed-race couples and there will be many more. A summer 1995 Washington Post survey reports that fewer than half, 47 percent, of white men would object to a black mate, a position voiced by 60 percent of white women. The comparable racial-rejection rate among black males is 26 percent, and among females 46 percent. Not surprisingly, the willingness of those of European origin to wed Hispanics is greater.
These indicators of intergroup acceptance are the highest they have ever been. Thus it is clear that although America’s leftist intelligentsia and the leaders of some ethnic organizations may advocate multiculturalism and consequent balkanization, the large majority of the ethnic minority populations does not.
The census and polls find it difficult to get people to specify their ethnic or national ancestry because so many have multiple origins. The number describing themselves as European-Americans increases steadily. And according to the National Opinion Research Center, only 5 percent of Americans, 4.8 to be exact, belong to nationality-based or ethnic organizations, up from 3.6 in 1974. These include African-American and Latino organizations.
The political parties, however, are more separated ideologically and culturally than at any time in this century. Analyses of legislative voting records reveal a steady shift to the Left among Democrats and to the Right among Republicans. These generalizations apply to both social/cultural and economic/welfare issues. The ideological cohesiveness of the parties in the present Congress is not a new phenomenon, simply more of the same but with the GOP in charge. The number of moderate (liberal) Republicans and moderate (conservative) Democrats in the House and Senate is inexorably declining. The Center is being squeezed out.
And as politics becomes more divisive, more ideological, a growing number of Americans become more disillusioned with it. And the Center is rebelling. There has been a steady growth in the proportion agreeing that the emergence of a significant third party would be a good development. More appear to feel this way than at any time since World War I.
Pro-third-party sentiment in the polls now stands at 60 percent, up from 53 in 1994. But what kind of third party do the disenchanted want? Not the party of Eugene V. Debs or Norman Thomas, or even Robert LaFollette, and certainly not Jesse Jackson; not one led by the likes of Father Coughlin, or even of Pat Buchanan. Rather, they are looking for a party of the Center, one that might be led by Colin Powell, Ross Perot, or Bill Bradley. Powell is clearly the preferred leader, backed at the moment by over 30 percent.
Americans are angry with the political process, with their elected officeholders of both parties. They see their “leaders” as uninterested in them, as corrupt and immoral. The historic cultural disdain of Americans for government, which is greater than in any other country, has been refurbished. It had decreased during the Great Depression, which in Richard Hofstadter’s words introduced a “social-democratic tinge” into American life. But postwar prosperity and continued near-full employment, together with the expansion of educational opportunity and consequent enhanced social mobility, have reinforced traditional American anti-statist feelings. In 1994, the electorate gave control of Congress to the only major anti-statist libertarian party in the world. And, in response, the more leftist government-oriented Democrats have moved to the Center to pick up votes.
As the one significant Protestant sectarian country, the United States has been more moralistic than church countries, i.e., Catholic, Anglican, Lutheran, or Orthodox. And that moralism combined with antagonism to the state has made us more disposed to believe the worst of those in politics and government. The movements which have arisen on the Left and Right, from feminism and civil rights to the Christian Coalition and anti-abortion groups, contribute to the trend. Corruption and immorality are present in all countries. Where America differs is in its religiously-bred insistence on virtue. That concern has led our media and government to pay more attention to scandal.
Playing just as strong a role in undermining trust has been the enormous growth in what Norman Ornstein of the American Enterprise Institute calls “prosecutional zeal,” flowing in part from the “creation of a Public Integrity Section in the Justice Department, which defines its success by the volume of its prosecution of public officials.” As a result, Ornstein reports, “between 1975 and 1989, the number of federal officials indicted on charges of public corruption increased by a staggering 1,211 percent.”
Our sectarian-inspired moralism prompts public and private efforts to expose evil and to enforce “correct” behavior and thought. And these feelings and efforts make each generation believe it is living in the worst of times. This leads to recurrent religious-linked moralistic movements which seek to purify, not change, our providentially inspired institutions, and to find better leaders.
Some would argue, finally, that the declining confidence in American political institutions is a reaction to the ways in which a presumed economic decline is affecting morale. But the data challenge this. Cross-national surveys taken in 1990 for the World Values Study and in 1992 for CNN find that the overwhelming majority of Americans feel positive about their personal future, a higher proportion than in any other industrialized country.
Two polls taken in 1994, one for the Hudson Institute and the other for Times Mirror, reported similar findings for the United States alone.
The optimistic camp is where I am on the question of the basic stability of American institutions, if stable institutions are what we want. I certainly agree that some of the problems about which the editors inquire—in particular, the multicultural ideology of the Left and the kind of racial polarization it fosters—pose a danger to our social fabric. But I also see some of the evidence cited, such as unchecked immigration, as a basis for concluding that our national project is as capable as ever of capturing the world’s imagination. This may be the result of years as a foreign correspondent. Somehow the glory of the American Republic shines particularly bright when looked at from other parts of the globe.
Though my optimism is intact, I have found my thinking changing in recent years. I have been a joyful participant in the social, political, and cultural resurgence of which COMMENTARY itself has been in the vanguard. I do believe that headway has been made and will be made in the coming years on important reforms, ranging from the budget and monetary policy to social and constitutional questions. I believe that political correctness is in retreat and that there will be a reflowering of reason within academia. I was struck, though, by a remark that William Kristol reportedly made upon the launching of the Weekly Standard, to the effect that the only significant debate in America today is within the conservative movement. If true, that is not a good situation; and I find myself moving with a livelier step as I explore the social-democratic traditions with which the editorship of the Forward has put me in touch.
The big danger I see is that a victorious conservative movement will become narrow and exclusionary in a way that, say, Ronald Reagan’s leadership was not. Of the Reagan presidency I sometimes think that we did not realize what a canny coalition it was until it was over. From Irving Brown of the AFL-CIO, one of the cold-war heroes to whom Reagan gave the Medal of Freedom, I learned to appreciate that the triangulation that cracked Soviet rule in the Eastern bloc was a partnership not only of Reagan and John Paul II but also of the democratic labor movement. The conversations I have had since coming to the Forward make me feel more strongly than ever that it would be a tragic blunder simply to turn our back upon the concerns that animate the social-democratic side of that victorious alliance.
There are ironies that fascinate me. To find, for example, something encapsulating the positive views on immigration held today by, say, the Wall Street Journal editorial page, one would have to go back to the speech on immigration delivered in Congress in 1917 by Meyer London, a socialist from the Lower East Side. During the years when I was in Europe, I covered a series of economic conferences hosted by Senator Bill Bradley and Congressman Jack Kemp. I can still recall an economist of the AFL-CIO arguing that the way to solve the American trade problem (and unchecked immigration, too, I would add) is to boost wages, and thus buying power, in the third world. That ought to offer fertile common ground for pro-growth economics in a post-NAFTA world.
On the question of shared moral and religious values, I am struck by a powerful sense of déjà vu. I started my journalistic career in high school by launching a mimeographed newspaper with a friend named Theodore French. He was a fundamentalist Christian conservative, I a liberal, secular Jew. We disagreed on much, but had a wonderful time. Now, after a generation in newspapering, much of it spent as a member of the editorial board of the Wall Street Journal, I find myself covering many of the same questions we wrestled with then. I have long since come to understand that there are many marvelous individuals on the Christian Right and many shared values. In contrast to a number of individuals I admire, though, I do not feel the urge to mount a dais with evangelical Christians.
In my political reading, I have been returning with ever-more frequency to the writings of the Founding Fathers and the framers. I am repeatedly struck not only by their brilliance but by the way the founding period seethed with the kinds of issues we are talking about today. Maybe, apropos the editors’ statement, it is not stability but struggle that makes America so great. The more I think about the possibility of a constitutional convention in the years ahead, the more I tend to welcome the idea. Even more so these days, I also find myself thinking about the future of the Jews and our democratic institutions. I do feel that the fate of the Jews and of Israel is, as Norman Podhoretz has done so much to teach my generation, bound up with that of America. But I find myself less worried about all this than energized.
Edward N. Luttwak
Anyone inclined to dismiss the editors’ statement as a bit of foolish “declinism” just when the indices are jumping and the Dow is high, had better explain why some 4.9 million Americans were under correctional supervision on December 31, 1994, counting 2.8 million on probation, 671,000 on parole, 958,704 in state prisons, 95,034 in federal care, and some 446,000 in local jails, for a total of one incarcerated American out of 189 men, women, and children, as compared with the (already very high) 1980 ratio of one to 480.
It hardly matters which way an explanation leans, outer extremes included, for whether it is hysterical-racist reaction persecuting the insufficiently meek among the oppressed, or a necessary defense against hordes of morally depraved malefactors, or any milder variant of either, that cages so many Americans, the apparent opposites coincide in implying a society disintegrating at its margins.
True, criminality is usually too atypical to define the condition of societies, though a population of 4.9 million, plus the unknown number of their uncaught colleagues, is an already non-trivial proportion of all Americans. If, moreover, 4.9-plus million (is it 5 million, 6 million, or 10 million?) have strayed far enough to fall within the purview of the criminal laws even if uncaught, a still larger number of parents, dependents, and victims must also be deprived of that modicum of tranquility that civilized life both requires and is meant to assure.
Still, it is altogether more consequential that around the deep darkness of outright criminality there is a much larger penumbra of disordered, often acutely unhappy, lives manifest in the prevalence of alcoholism, medicalized drug dependence, and the mass consumption of illegal narcotics.
And beyond the variously addicted, only a still larger circle of chronic unhappiness can account for the prevalence of that peculiar free-floating anger which foreign visitors often notice, even when unvented in outbursts of sheer fury over trivialities. Anger has become as American as sushi, forming the very basis of Ross Perot’s political party.
Why are so many Americans so unhappy, in spite of life opportunities still so ample that they excite the envy of much of the world?
All manner of plausible causes present themselves, but the master cause may be the most elemental: homo sapiens, a familial animal like the wolf or hyena rather than solitary like the bear, is genetically unequipped to live without the emotional support of uncles, aunts, first cousins, and second cousins, in addition to siblings, parents, and children. Just as wolves and hyenas need frequent lickings and cuddlings to hunt successfully and survive, homo sapiens cannot maintain a tolerable level of serenity amid the vicissitudes of life without the resilient, many-sided support that only an entire panoply of relatives can provide. Parents and children suffice to hug each other, but any family celebration—or that necessary loan or gift—requires uncles and aunts, and while it might take a properly devoted second cousin to secure a job, first cousins complement siblings in good times and bad. Periodic gatherings for the ceremonies of birth, marriage, and death, the much more frequent celebration of birthdays, name-days, anniversaries, and seasonal feasts, as well as webs of less formal socializing, all serve to keep the machinery of human families in working order.
That is how people live all over the world in village, town, or city—but not in contemporary America, where the dislocations of a turbo-charged economy undergoing accelerated structural change, or merely the efflorescence of a most extreme individualism, prevent the upkeep of families. Aside from the geographic scattering that itself reflects the priority of career over family, attendance at any surviving gatherings is easily forgone not only for the sake of a long-hoped-for holiday but for a mere weekend at the seaside, or even to watch a sporting event.
Most Americans lost their second cousins long ago (some never reached these shores), but so poor has their family maintenance become of late that few have first cousins in working order, and even siblings may feel no automatic sense of obligation to one another. At the same time, the transformation of divorce from desperate last resort to ever-present alternative has in turn transformed marriage from a solid extension of the birth-family into a temporary shelter, apt to be blown down by any passing emotional wind. Finally, the demise of lifetime employment has removed the security provided by oh-so-synthetic but stable corporate “families.” No wonder emotionally bereft Americans give so many presents to themselves that their savings are the scantest fraction of any developed country’s national income (therapeutic shopping is simply the most widespread of addictions).
Evolution will of course catch up with the changes that have left so many Americans in or near conditions of bear-like emotional solitude. But that bit of genetic restructuring might take a little while, say 10,000 or 20,000 years. In the meantime, remedies are not impossible: e.g., re-regulation and other suitable measures to stabilize the economy, thus favoring Gemeinschaft over efficient Gesellschaft. But the Left still indulges the new, nonfamily “lifestyles,” unmoved by the spectacular failure of those experiments; while for the Right, the Unabomber’s manifesto put it crisply:
The conservatives are fools: they whine about the decay of traditional values, yet they enthusiastically support technological progress and economic growth. Apparently it never occurs to them that you can’t make rapid, drastic changes in the technology and the economy of a society without causing rapid changes in all other aspects of the society as well, and that such rapid changes inevitably break down traditional values. . . .
Wilfred M. McClay
I share many of the concerns of those who fear national breakdown. But I also fear the power of self-fulfilling prophecy. The problems we face are ultimately problems of consciousness and spirit, will and imagination, morale and purpose; and it is all-important that we not lose heart, or yield to seductive doctrines of historical inevitability. Nor should we exaggerate our difficulties by romanticizing the past and perversely downplaying our real accomplishments. It is dismaying, for example, to see the genuine racial progress we have made in this country now being all but dismissed, as public discourse becomes engulfed by the poisons of institutionalized grievance, accusation, suspicion, and guilt—an appalling and destructive spectacle.
So far as these issues are concerned, and others such as immigration reform, the atmosphere has become so thick with psychopathology and menace that our problems are becoming almost impossible to discuss, and therefore impossible to resolve rationally and democratically. I am tempted to say that our troubles may actually be less formidable than they are made to seem. But it is an illusion to think that one can distinguish between the “real” and the “psychological,” the rational and the affective, the objective and the subjective, in the life of a nation. Sickness of soul, loss of core values—these conditions can be as fatal as any material disease. And it is much easier to transform a nation’s institutions and government than it is to restore the soul and character of its people.
What makes these observations all the more troubling is the fact that most of the problems mentioned in the editors’ statement represent the extension—the hyperextension, really—of principles Americans regard with pride and affection. The cultivation of ethnic and racial consciousness is encouraged by our commitment to tolerant pluralism. Our openness to immigration is evidence of America’s generosity, and its commitment to individual opportunity and upward mobility. The existence of stratification, too, reflects a culture that prizes and rewards individual enterprise and achievement. Distrust of authority is as American as the Declaration of Independence and the Constitution. Diversity of moral and religious perspectives is as American as the First Amendment.
But these liberties represent only part of the American heritage. The Founders and their successors presumed that the exercise of such liberties would be restrained by countervailing forces of moral obligation and social order—especially the force of religion. Today, however, in the age of the sovereign self, all such restraints are viewed as illegitimate and repressive. Hence many of the best elements of our tradition have become distorted into pernicious caricatures of themselves. Hence the querulous insistence upon endless rights and entitlements, without corresponding obligations. Hence the preoccupation with identity and self-expression, which turns respect for the dignity of the individual into a blank check for preening narcissism and nonnegotiable “voices.”
Hence, too, the now-automatic recourse to oppression and victimization as categories of moral justification, categories that discharge moral responsibility by displacing it onto others. By misunderstanding our own history and traditions, and denying the restraints that human nature requires, we allow good things to be transformed into evils.
Yet those traditions and that history are still there to be freshly appropriated, and this gives us grounds for hope. The story of the prodigal son forms part of the bedrock of our civilization. That story not only warns us against the dangers of prodigality, but also offers us the hope that it is not yet too late for the repentant to recover what they have squandered. Although we measure the development of nations with metaphors of the life cycle—infancy, youth, maturity, middle age, and so on—such terms do not really do justice to the mystery of how and why cultural renewal occurs. Sometimes, as for the prodigal son, the discipline of adversity plays a key role, particularly for those who have tried their limits, made some foolish choices, learned the price of things the hard way—and discovered that real freedom comes, not from living a life of unencumbered boundlessness but from conscientiously and imaginatively playing the hand they have been dealt. We would do well to take that lesson to heart.
To do so is not at all incompatible with reaffirming a sense of America as a land of possibility. But it may be time, as Irving Kristol has recently suggested, to replace our wispy invocations of the American Dream with something more substantial. Perhaps, indeed, it was the easy resort to just such sentimentalism, as a substitute for concrete and realistic thinking about what America can and cannot be, that made the common culture of the postwar years so vulnerable to assault by ever-more romantic, expansive, and self-indulgent interpretations of the “Dream.” The current crisis has encouraged clear and hard-headed thinking about the many ways in which our society’s opportunities presuppose a certain social and moral order; and that is all to the good.
I feel encouraged, then, by many of the developments that make up the current resurgence. This encouragement has less to do with the things I see in the major media than with things I see around me. I am impressed and moved by the depth of commitment I see in many individuals I know, some of them emerging out of unpromising situations and broken lives, to restore and protect their families, marriages, and neighborhoods, and recover a dimension of self-transcendence in their lives. Yet, given the forces they struggle against, particularly the coarsening influences of our grotesque and corrupt popular culture, it is hardly surprising that they do not see themselves as restorers of the common culture. On the contrary, they see themselves as refugees from it, as strangers in their own land.
That is why it is important to distinguish between the near and long term—because there is so very far to go. There can be no easy restoration of the common culture, particularly when so many fundamentally decent Americans see it—at least, in its present, debased form—as their enemy. By the same token, although much of the nation is warmly responsive to the current resurgence, another part is disdainful and fearful of it, and it will take a long time for that hostility and distrust to relent, particularly among the cultural elites in the media and academia. In the short term, we can expect polarization to increase, conflict to grow, and the middle ground to shrink. It is a prospect that will require great patience, firmness, and clarity of purpose if it is to be surmounted.
The editors ask whether the United States is headed for balkanization or even breakdown; the answer, if one examines institutions which might represent in embryonic form the diverse multicultural America of the future, seems virtually self-evident.
Last year a segment of the public-television program Frontline focused on a high school in Berkeley, California. The school’s present racial mix roughly approximated what America itself is supposed to look like in the year 2050: 40-percent white, a combined black and Chicano/Latino majority, a small number of Asians. Apart from some Asian-white friendships and a few exceptional blacks, students from the major groups spoke to one another only in the stilted language of political argument.
Berkeley High was taking the first steps to eliminate academic tracking, which meant that in some instances students who could not read above the comic-book level shared classes with kids able to grapple with Joyce or Dostoevsky. Needless to say, teacher attention was focused on the lower achievers. Parents complained, but few of the brighter pupils minded having few academic demands made of them. What did bother them was being mugged in the hallways, a regular complaint. Some whites took independent study (thereby avoiding having to come regularly to school) simply because they had been beaten up too often.
The most telling aspect of the Frontline segment was the high degree of anti-white politicization of Berkeley High’s Chicano students, who emphasized their Mayan and Aztec roots while defiantly disclaiming an American identity. While they did poorly in their courses, the Chicanos were adept at holding rallies and starting clubs whose purpose was to vilify the oppressive white man. In a revealing segment, the mother of one Chicano facing suspension for not attending classes confronted the principal: while her English was plainly limited, she was sufficiently assimilated into the multicultural idiom to complain that her son’s disciplinary problems were due to the school’s “discrimination” and “racismo.”
One comes away from such a program wondering why anyone who does not positively yearn for the break-up of the United States would think it desirable to allow the entry of roughly a million third-world immigrants—many of them with no education whatsoever—into the United States every year. The whites at Berkeley High survive: they go out for crew, they take SAT prep courses, they admit their frustration about being beaten in the hallways, and they get into college. But it would be surprising if many whites were attending the school fifteen years hence.
That kind of exodus has already occurred in urban school districts all over the United States and is a testament to a deeply felt cultural sentiment, which is typically denigrated as racism. While white Americans are ready to integrate schools and neighborhoods so long as their general tone can be maintained, after a certain tipping point they go elsewhere. Demographers now tell us that white flight is no longer simply movement from black-majority cities to white suburbs, but from regions with burgeoning Latino populations to the relatively Anglo heartland; the flight is more pronounced among working-class than upper-class whites. The latter do not rely on public schools and may be simply more international in their outlook.
It is, of course, the same upper classes, the symbolic analysts of Robert Reich’s famously fulsome description, who constitute the 10 percent of the American population which may actually benefit from the emerging global economy. For the rest, the information revolution means that American businesses may manufacture as easily in Mexico or Malaysia as in Michigan, a fact which leaves tens of millions of Americans with the prospect of declining wages or no useful work at all in the coming decades. Thus far, the drop in American salaries has been gradual, and its consequences blunted by the entry of wives into the workplace. Instead of diminished income, Middle Americans have been making do with smaller families and less parental supervision of the children they do have.
Either trend, by itself—the demographic displacement of the European ethnic groups which settled and built this country, or a widening of class disparities—would ensure that the next 30 years in the United States will be more tumultuous than the last. Their conjuncture would seem to guarantee a future of social strife, whose shape and outcome are hard to predict.
Ten years ago, my main political concern was the struggle against Communism. As recently as the mid-1980’s, it was still possible to believe that two social propositions—in my view, essential to national cohesion—were so nearly axiomatic that one did not even need to think about them. First, that while the United States should make every reasonable effort to help black Americans assimilate into the mainstream and ensure that no American faced racial discrimination, its fundamental identity was that of a Western, European-stock democracy—albeit one seasoned and in some ways plainly enriched by minorities. Second, that an American male of average abilities could find a way to support a family without Stakhanovite endeavor. Both of these assumptions have been eroded with stunning rapidity in the past decade, and my pessimism about the national project has grown accordingly.
While the efforts of contemporary conservatives to reestablish norms for family life and reform the welfare system are salutary, they may be thwarted not only by defenders of a liberal status quo but by the economic trends discussed above. In this realm, the Republican resurgence fails to address key problems, and may well exacerbate them. The new congressional majority surely worries more about lower taxes and free-market principles than about social cohesion. For conservatives comfortable with the perspectives of global business, the weakened position of American workers is not much of a concern.
As for immigration reform designed to halt the transformation of the United States into a third-world majority country, among current or lapsed presidential candidates, only Pete Wilson and Pat Buchanan have even addressed the issue, and they have hardly been praised for their efforts. A different kind of conservatism—one which seeks to confront more directly the issue raised by the editors of COMMENTARY—still seems to be several years away.
Remember that when we turned American culture inside out and upside down in the 1960’s, we did so for reasons that struck many as utterly consistent with America’s national project—even as a completion of that project. We were trying to accomplish an agenda of liberation that had two related, but quite distinct, aims. The one that gave our cultural revolution moral force—that was serious and public-spirited rather than merely personal and self-indulgent—was an effort to solve America’s race problem, at last fulfilling our national promise of freedom and equality for all. And surely it had notable successes.
But those successes came at a very high cost, not because of the goal but because of the means used to reach it. We laid waste to vast areas of our national culture, overturning some of its most fundamental beliefs and institutions, in the belief that we were accomplishing something valuable that we could gain in no other way.
In morality, for example, we undermined the bedrock idea of personal responsibility. Certainly the idea that individual behavior is shaped by the environment, rather than being the product of the responsible individual’s free will, was nothing new in the 60’s. But we moved that idea to the center of our culture when our nation’s elites agreed that blacks were victims of American society, of the “system,” and as such were exempt from any number of moral judgments. In this way, formerly clear distinctions between right and wrong grew fuzzy, first in relation to blacks, then for everyone.
In law, we bent our unique Constitution out of shape in order to achieve racial justice. Beginning in the mid-60’s, a series of federal court decisions, based on flamboyantly specious reasoning, overturned the American ideal that we do not discriminate by race. Instead we vigorously discriminated by race, first in forced busing, then in affirmative action. These vast social-engineering projects called into question such core tenets of American culture as meritocracy and even the basic democratic principle of the equality of every individual in the eyes of the state.
When these projects did not work, and when affirmative action in particular failed, we jettisoned yet another set of American ideals to explain why. One real reason affirmative action failed, as Thomas Sowell has explained, is that it placed many black students in colleges too demanding for them; students who would have succeeded solidly at a good state college were adrift at Cornell or Dartmouth. Instead of accepting this straightforward explanation, professors, administrators, and black students themselves came up with an array of alternative reasons. The standards by which the judgment of failure was made, they claimed, were illegitimate because racist (or sexist, or classist, or inimical to the whole array of other special interests that jumped on the victim bandwagon along with blacks); the subject of study—the canonical writers of the Western tradition who embody and transmit our deepest values and virtues—was itself racist (or phallocentric or homophobic).
My point is that this was a continuous development that began with good intentions and ended in absurdity. But those good intentions formed the basis of many well-meaning Americans’ self-esteem, and therefore kept them from admitting the deepening absurdity. After all, the cultural revolution did not leave America without values and virtues; it created a new code—centering on multicultural tolerance, “compassion,” “sensitivity,” and “growth”—that shouldered aside older and better virtues.
Furthermore, many Americans, beginning with the elites, had a still bigger stake in the 60’s cultural revolution. For alongside its aim of liberating blacks, the great transformation also aimed to liberate them, personally, as individuals. To that strand, which produced the sexual revolution and the counterculture, they owed much that they valued—above all, their sexual freedom. So much of our moral life has to do with sex and the obligations with which we surround it: so the sexual revolution was, quite literally, a revolution in morals as well as manners.
This strand of the cultural revolution gave to many an even more personal stake in the transvaluation of values that resulted from its other, more political, strand. If we were telling ourselves that our own sexual fulfillment was of prime importance, that it was worth breaking up a marriage and a family over, we could be wholehearted in seeing nothing to stigmatize in the whole welfare-dependent way of life; the illegitimacy it fostered was as blameless as the dependency it created. Conversely, if we believed that, in the political realm, the “system” was unjust and oppressive, how much easier to believe that the bonds we wanted to break in our personal lives were illegitimate, that transgressing them was progressive, almost a service to society. The same feedback loop operated for those who glamorized drug use, or dropping out, or general rebelliousness. And now it repeats itself as farce: millionaires peddle songs that celebrate black rape and murder of women, righteously asserting that in this way we may know the pain we cause by the bad conditions we create in the ghetto.
However much professors may prate that there is no truth, truth there is, and it asserts itself with a vengeance. We have done the experiment; we have lived by the new culture for a generation; we have had 30 years of mocking at the great, the wise, the good; and the results are in. The results are the underclass, the homeless. Our effort at liberation created a class of minorities worse off than ever before, because they got all the wrong messages from our revolutionized culture (not from “black culture,” which is but a dialect of the larger American culture). They believed that they were victims, justified in an adversary relation to America, entitled to welfare, not responsible for their actions, and doing right to be promiscuous or to produce children out of wedlock.
And now a second wave of results is evident: the breakup of the mainstream American family, the wholesale harm done to children as a result, and the sense of rootlessness and indefinable loss that smolders in so many young people who grew up in shattered, inadequately nurturing families and in a culture they now find so flimsy, unsustaining, and just plain wrong.
More and more Americans can discern the reality and see that something is amiss, that perhaps there are truths about the right way to live which three millennia of thinkers about mankind’s experience have earnestly tried to grasp and convey. So the culture is radically shifting again, with gathering speed.
Harvey C. Mansfield
Lack of virtue is dimming our national prospect. This is a simpler statement than the one posed for the symposium, which lists possible causes of moral decline rather than calling it by name. We Americans would rather not use that name. We are all of us liberals of one sort or another because we put liberty ahead of virtue, and we do that not because we believe the two are incompatible but, on the contrary, because we want both. We think that virtue comes with liberty when liberty is the main goal, and we doubt that liberty will come along unbidden when government aims at virtue.
Those known as liberals today, however, do not share this classic liberalism, which can be found in Locke, Kant, Mill, and the American Founders as well as in the minds of ordinary Americans. Liberals today have succumbed to moral nihilism, and they say that virtue does not exist except as self-esteem, self-bestowed and confirmed by an indulgent society. Above all, liberals fear to be judgmental (except of those who dare to pass judgment). But virtue depends on praise and blame, on passing judgment. To deny virtue is to diminish it by removing standards of excellence from view, and we end up settling for less, or when that is too boring, heading for mischief. A free society that forgets virtue suffers from mediocrity and criminality, and that is America’s condition now. Or it would be, if our liberals were still dominant.
In fact, liberals are tired, dispirited, defeated, and done for. But the mess they have left remains to be dealt with, and a new liberalism closer to original liberalism but now called conservatism needs to be put in place.
The task of conservatism, most generally, is to make our government work again. Our government is self-government, and we need to make people think that government works well enough so that they are willing to consider it their own. “Alienation” from Washington is not a healthy condition, but it is a healthy reaction when Washington misgoverns. The misgovernment we have experienced is too much government in the attempt to do by endless regulation what a free people can do only on its own. The liberals left nothing to be done by our good will and our virtue; instead, they relied on the coercion of bureaucracies and courts, while undermining the authority of the police.
In America, authority is a democratic creation even when it is unelected, as in police, parents, and teachers. The same is true of what is called “stratification,” which describes the layers of unequal prestige that a democratic people permits itself. What matters is not the fact of inequality, which is essential to authority or prestige, but whether ordinary people think the inequality is reasonable and contributes to the virtues they want to see preserved.
At present no group in our society is performing worse, and less deserves its authority, than the liberals who dominate the teaching profession. From elementary schooling to postgraduate study their program is multiculturalism, the substitution of many non-Western cultures for one Western culture. Multiculturalism, however, is not so much the scattering of a common culture as dumbing it down. The multicultural curriculum is designed for self-esteem—not the real kind you earn but flattery given out of sympathy. It is meant to be undemanding and it succeeds. Soft grading is an essential feature of a feel-good education, even at top institutions. My colleagues are afraid to give C’s, and so they give their students the same grades they got in high school. From top to bottom in American education teachers have lost the integrity required to speak an unpleasant truth; they do not even believe in the idea of such integrity.
It is vital to wrest control of our education from the liberals still secure in the redoubts from which they criticize all stratification but their own. All partial measures such as school prayer and school choice should be understood as having this aim. The media, who seem to be leading our moral decline, in fact follow the liberal trend of our education. Without professors to teach and endorse their irresponsibility, they might fall back on common sense. Behind Murphy Brown’s celebration of the single mother there is, of course, feminism.
Feminism is now the greatest blight on our national prospect and the greatest threat to moral responsibility. In its opposition to the principle of the division of labor, in its desire to construct an undivided society never before seen in human history, feminism is a form of Marxism. But it is hardly recognizable as such because it begins from the right of equal pay for women—and who can object to that? Equal pay, however, includes equal right to a job, thus disregarding the male status of protector and provider. Although feminism speaks of equality, it is in practice more interested in independence. For protection the liberated woman will turn away from the husband who loves her to the government whose very impersonality allows her to think she is free. (Feminism’s love of Big Government is neo-Marxist.) Children may not be so dispensable as a husband—witness Murphy Brown—but they will grow up without a father. Conservative women who do not follow feminism to the end are nevertheless caught up in its inherent radicalism, of which they are often unconscious.
Though I may have shown more fear than hope, I am much cheered by the 1994 election. We are at a critical juncture after the defeat of one system and before the institution of another. Nothing makes a coherent conservatism inevitable, and it will take brains and luck to put the economic and the moral or cultural conservatives together and keep them in an enduring majority. Our salvation has been the “basic stability of American institutions,” or, one could say, the Constitution. Despite repeated criticisms and foolish proposals from progressives and liberals throughout this century, the Constitution has survived. It is time to suit our policies to a limited government, as the Constitution recommends. This is the only feasible self-government for us.
There is, as Adam Smith famously said, a lot of ruin in a nation. That there are troubling things in America today is true, but we ought not to get overwrought. The essence of political wisdom is to see things in proportion. The ills of this moment may strike people as outsized precisely because the very much greater perils that we survived during 40 years of the cold war have evaporated. America endured the Axis and the Soviets; it will endure “unchecked immigration.” It endured the Great Depression; it will endure “increased economic and social stratification.” It endured the counterculture and urban riots of the 1960’s; it will endure today’s “distrust of authority.”
Racial polarization, too, must be seen in perspective. It has been our state for almost 30 years, since the blacks in the civil-rights movement jettisoned the slogan “freedom now” in favor of “black power.” Yet however grievous today’s racial problems, they pale in comparison with what came before. The era of racial polarization was preceded by nearly a century of Jim Crow and, before that, slavery; these were far more fundamental threats to our national project. Still, today’s polarization is demoralizing. Our failure to achieve racial healing after the triumph of the civil-rights revolution of the 1960’s takes the luster off a glorious and redemptive moment in American history.
When the civil-rights movement went off the rails, one of its many oddball offshoots was the crusade of James Forman, who took to interrupting worship services in mostly white churches to demand the payment of “reparations.” In retrospect, perhaps it would have been wise to take up this proposal. If there had been a way to give black Americans some latter-day equivalent of “40 acres and a mule,” perhaps that would have enabled us to put the past behind us and turn to the future in a constructive spirit. Instead, black America seems immersed in a bottomless well of grievance, blame-placing, and even paranoia—so heartbreakingly evident in the widespread belief among blacks that O.J. Simpson is innocent.
Of the woes that the editors’ statement enumerates, the dissolution of moral values is the only one about which I cannot so readily say that we have endured worse before. In some senses, we have. On the frontier or in urban slums of an earlier era, we experienced plenty of moral dissolution. And what was slavery but the gravest traduction of our moral values? And yet, a case can be made that we are suffering (or enjoying) a pervasive erosion of standards of personal conduct to an extent unprecedented in our history.
Spurred by the counterculture of the 1960’s, with its ethos of “if it feels good, do it,” we have become a nation of irresponsibles. The most egregious symptom of this is rampant illegitimacy. Another is the extremely high rate of divorce, driven by the widely accepted notion that an adult’s discomfort at remaining in a union with another who was freely chosen and once loved deserves consideration above a child’s anguish at losing the daily presence of a parent. Between illegitimacy and divorce, how many American children today enjoy the simple security of being born into a home with a mother and a father and growing up with the two of them? And even among those lucky ones whose families are intact, how many—as Mary Eberstadt has asked so tellingly in these pages (“Putting Children Last,” May 1995)—receive priority over the parental quest for self-fulfillment?
Another realm of pervasive irresponsibility is our criminal-justice system. In 1989, Joel Steinberg was found to have tortured his six-year-old daughter to death. (It turned out she was not his daughter at all, but a child whom he had stolen.) Then-Mayor Edward I. Koch gave voice to natural justice when he declared that Steinberg should be boiled in oil. But, in the event, Steinberg received a light jail term on the grounds that he enjoyed using cocaine while torturing his daughter, including on the fatal occasion, and therefore could not be held fully accountable for his actions.
More generally in our courts, a first offense seems rarely to result in punishment; and many second and third offenses do not, either. Routine plea-bargaining reduces sentences, and for those who do manage to end up in prison, every day served without committing some new transgression earns a day’s reduction for “good behavior.” On top of this, the obviously guilty are sometimes set free on the grounds of procedural peccadilloes by the authorities, and judges sometimes set prisoners free because of jail overcrowding, giving priority to a felon’s “right” not to be crowded over the right of the rest of us not to be preyed upon.
The American economy is relatively healthy at the moment. But for a few decades our rate of economic growth has been modest, trailing that of most industrialized countries. Here, too, irresponsibility may be the main cause, in the form of our comparatively low rate of savings. Since we are substantially better off than any of the other industrialized nations, it should be easier for us to save more, but our craving to consume is overwhelming. Even in the realm of foreign policy—where America established an exemplary record of shouldering burdens during the cold war—Democrats and Republicans today seem to compete over who can offload more burdens faster, heedless of the perils that a retreat from leadership may invite.
Will the conservative resurgence save us? I hope so. But the conservative wave is a thing of parts. Insofar as it reforms the criminal-justice system—through three-strike (or, better, two-strike) laws, reinstating the death penalty, abolishing parole, correcting the tilt toward the rights of defendants and convicts—it can make important progress. Insofar as it champions an ethos of responsibility, toward our spouses and children, toward our fellow citizens and fellow humans, it may be our salvation. But the conservative tide also contains a contrary current which champions self-regard above all else. This approach opposes all measures of public morality, resists taxation not only for unnecessary government expenditures but for necessary ones as well, and would turn America’s back on the world, come what may.
We need a conservatism that appeals to our better selves more than to our selfishness. Of course, that may be political folly. But if it is, that is the essence of our problem.
I am on record as a pessimist who sees America moving toward a custodial democracy, an almost caste-like society in which the affluent treat a growing portion of the American population (increasingly white) as wards of the state. The only antidote I can imagine, a radical return to a Tocquevillian America, has seemed hopelessly out of reach. But I have to say that the election of 1994 is making me hedge. The freshman class of the House represents something genuinely new in my lifetime, and one has to ask: what becomes possible if their kind gets a working majority in both houses of Congress? If you’re taking bets you should still ask for odds, but a revitalization of the Founders’ project is now something that realistic people can at least imagine.
When it comes to less grandiose goals, there is still more reason to be optimistic. The greater part of the nation seems headed toward a restoration of some important elements of the pre-60’s American culture. Coincidentally, I have just published (in the Fall 1995 Public Interest) an article specifying the reason for this optimism. To summarize:
The new American upper class—the coalition of the cognitive elite and the rapidly growing ranks of the affluent which Richard J. Herrnstein and I described in The Bell Curve, and which is now increasingly labeled the “overclass,” after Michael Lind’s coinage—appears to be in the process of adopting a full suit of traditional values. In each of the examples that follow, the statistical trendlines have yet to change. But statistical trendlines seldom change until a few years after the direction of the country has changed, and there are signs that that change is under way.
The family? Educated white America has been nearly immune from the scourge of illegitimacy (only about 2 to 4 percent of the children born to non-Latino white college graduates are born out of wedlock). Statistically, divorce has never been as severe a problem among the upper classes as in the rest of society, and there are indirect signs—in, for example, the strongly conservative shift in the received wisdom about the effects of divorce on children—that divorce will drop. Traditional child-rearing among the affluent seems to be making a comeback. The evidence, still anecdotal, suggests that career mothers are increasingly figuring out ways to stay home with their young children, even if it means interrupting their careers.
Education? Every educational trend among the affluent is moving in the direction of more challenging courses, firmer discipline, and greater parental control over the schools. These forces will only gather strength in the years to come if the role of the federal government and the educational establishment weakens, as there is reason to think might happen.
Religion? The baby-boomers are going back to church and synagogue for a variety of reasons, ranging from concern over their children’s upbringing to the natural process of coming to grips with their own looming mortality. In the society at large, something resembling the great religious awakenings of the 18th and 19th centuries seems to be under way.
Sexual mores? A recent large-scale and careful survey of contemporary American sexual behavior reveals far less libertinism in private than one might have expected from the pervasive public display of sexuality. That public display shows signs of changing as well, as Hollywood is clubbed over the head with evidence that old-fashioned romance and stories about courage, honor, and fidelity make for increasingly good box-office returns.
Against all this, there are plain and important threats to our common culture. These rightly worry many of my friends and colleagues. But we have to distinguish between the real deterioration in the common culture that currently exists and the extraordinary shift in momentum that may in time reverse that process of deterioration.
With respect to immigration, I am unpersuaded by those who identify our common culture with Europe. For me, the bedrock of American culture comprises the ideals of individualism, self-reliance, and love of freedom. A properly designed immigration system attracts people who embrace those ideals, and the continuing infusion of such immigrants has been America’s unique way of staying young. True, the government has managed to make a hash of immigration in the last 30 years, but there is nothing wrong that cannot be fixed.
As for the unraveling that has occurred because of multiculturalism, postmodernism, moral relativism, and the habit of counting by groups, current public feeling has turned, often overwhelmingly, in opposition to everything these movements represent. The multiculturalists and postmodernists may be disturbingly well-entrenched in our universities, and they continue to do mischief, but they have no significant allies beyond the campus. Within that environment, moreover, they are increasingly isolated from students who find them middle-aged, silly, and irrelevant. Strong affirmative action may still be in full force, but the intellectual debate and the moral high ground have dramatically shifted away from its proponents in just the last year.
On issue after issue, the liberal establishment at the end of the century is as irrelevant to the evolving received wisdom as conservatives were at mid-century. A colleague brought this home to me recently when he remarked that the New York Times is “in danger of becoming a rag for the Upper East Side.” There is hyperbole in that, but truth as well. As I watch the national revitalization of distinctively American ways of thinking that I had thought were moribund, I am reminded of Adam Smith—perhaps there was, indeed, a great deal of ruin in this particular nation. And I am reminded of Bob Dylan, too: I don’t think it takes a weatherman to know which way the wind is blowing.
Richard John Neuhaus
America is a nation so large and various that there is ample evidence to support almost any generalization one might make about it, including this one. And so it is also with a subject such as the national prospect. But generalize we must if we are to have any kind of bearing on the world and our place in it.
I do recall being asked, back in the early 60’s when I was a young man more or less on the Left, whether American society should be described chiefly in terms of stability or of fragility. The answer seemed obvious: America was a society so stable that it could well bear, and indeed very much needed, our most radical challenges. Of course radical challenge then meant racial desegregation, calling attention to “the other America” of poverty, and questioning U.S. policy in Southeast Asia. It was an innocent liberalism, or so it seemed, years before the full manifestation of the madness of the counterculture. Much has happened since then, including my becoming older, which I would like to think is not unrelated to becoming wiser.
Now we have mountains of data and social wounds beyond number demonstrating the extent of cultural decay and disarray. What do I say now to the question asked long ago? I say that America is an astonishingly resilient society and there is good reason to be hopeful about the next half-century.
Among the truly worrying things that were not with us 30 years ago are an entrenched underclass and an entrenched overclass. The latter, however, is not nearly so entrenched as the former. For the urban and mainly black underclass—radically isolated from the opportunities and responsibilities of American life—the remedy, if there is one, requires a thorough recasting of public policies to strengthen the mediating institutions, especially the family, and a great spiritual and moral awakening in the black churches. Our politicians do not have the wit or the nerve for the first, and the second is the work of God and thus quite beyond our control. Absent such remedy, the prospect is a huge urban parallel to the Indian reservations, or something like the inflexibly stratified society warned against in, for instance, The Bell Curve. It is immeasurably sad, and casts an ominous shadow upon the American future.
The overclass, on the other hand, may now be in retreat, making its last stand in the university. The overclass—concentrated in the semi-intellectual jobs of government, media, liberal religion, and the academy—exhibits a contempt for the American experience that is heartily reciprocated by most Americans. That was the message of November 1994; it was sent 22 years earlier with the rejection of George McGovern, but Watergate and other distractions prevented its effective delivery until now. Ronald Reagan’s election could be, and was, attributed to the maddening magic of a personality, while 1994 was as undeniably the vox populi as representative government can produce. Some liberals will still be elected, and conservative forces may be divided, but the successful politics of the foreseeable future will define itself in opposition to the liberalism of the last three decades.
Politics is, in largest part, a function of culture; at the heart of culture is morality, and at the heart of morality is religion. Morality and religion provide the commanding truths by which people live, or think they should live. What distinguishes the overclass from earlier ruling elites, and the reason it is now in retreat, is that it championed emancipation from the commanding truths of the past without proposing new truths in their place. More precisely, it proposed only two commanding truths: the gospel of the radically autonomous individual, which requires liberation from limiting commands, and the dogma of egalitarianism, which requires imposing limits on the consequences of freedom through taxation, quotas, and expansive government regulation. Moral nihilism combined with governmental despotism is an inherently contradictory program that is self-destructing before our eyes.
The overclass is forced to retreat on several fronts. The drive against the expense, size, and overweening intrusions of government will stall from time to time but is, I suspect, of enduring power. As for the media, the technological explosion in communications will no doubt bring with it an increase in the pornographies of sex, violence, and sundry fanaticisms, but it is already breaking the hold of the prestige press and networks. Can we imagine today anything of comparable public influence to Walter Cronkite’s oracular conclusion of the evening news, “And that’s the way it is, November 20th, 1965”? Does anyone now read the editorials of the New York Times except out of idle curiosity? The power of the major media, almost always overestimated, is rapidly declining.
Then there is the overclass in the churches. Remember Harvey Cox’s The Secular City? Remember religionless Christianity and the death-of-God movement? Remember the National Council of Churches? As recently as fifteen years ago, the last was an institution of the establishment comparable to the American Medical Association. Today it desperately clings to a skeletal existence and enjoys all the public credibility of the most recent incarnation of SANE/Nuclear Freeze. Liberal religion is in an uninterrupted free-fall, and with it the leadership that gave the revolution of the 60’s much of its appearance of moral legitimacy. Evangelical Protestantism and Catholicism, despite internal conflicts, are robust and growing, and it is generally recognized that, at the end of the 20th century, the pontificate of John Paul II is the world’s moral baseline.
Finally, there is the university, the last redoubt of the overclass. Especially the great research universities. Protected by their credentialing power, by munificent endowments, and by tenure, they seem impervious to the ideas and sensibilities that move the outside world. The redoubt may hold for a long time—perhaps until the credentialing power is removed, endowments are imperiled, and the tenure system is shattered. But we must believe that there are many in the university who will tire of life in the intellectual and cultural backwaters, who will be shamed by popular derision, and who will want to join the great debates now conducted by writers, think tanks, and journals outside their sheltering walls.
We are left with the bothersome question: can the opponents of the overclass, in their triumph, produce a ruling class that effectively proposes the commanding truths by which alone a culture can flourish? Can it be done without the participation of the members of the overclass who have these 30 years devoted their lives to cultural treason? Perhaps the question is premature. The retreat is still under way. Later we can discuss the terms of surrender, which should be generous.
Those who argue that the hot atomic core of the present crisis is social division underestimate the strength of the American system. Their fear of balkanization, racial polarization, and immigration is overblown. They have also misconstrued the American idea.
The American idea—building a system worthy of human dignity, a system of natural liberty—is not an idea for Americans only. It aims like an arrow at human nature, not a fragment of the race. That is why the American network of institutions—its system—has immense powers to capture the human heart. In becoming American, one knows that one’s dignity and liberty, given at birth, are being enhanced; one does not have to renounce what one was before; and one feels that one is at last in some strange way and for the first time “home.” In this sense, the power of the system to “Americanize” immigrants is undiminished.
The real national emergency arises from a different nexus. The intellectual/political class in this country has for some decades been falling into moral disorientation. The two concepts fundamental to the American system, truth and liberty, have been corrupted. If one cannot oppose power with truth, only power is left.
From the corrupted beliefs of this class come most of the evils—the cancers—that threaten the body politic: a multiculturalism for which truth is irrelevant; a welfare system for the young and able-bodied that corrupts their moral independence and makes them cynical and prey to self-destruction; a way of speech among “progressives” that reveals fear of argument; and the propagation of the self-destructive faith that there is no right and wrong, only what you desire. Of all our institutions, the universities and the media are the sickest unto death, with the empty progressive church leadership next behind.
The emergency we face is moral. Since the intellectual/political class controls the heights of communication, both the moral relativism that infects them (when they are attacking the morals of others) and the moral intolerance that blinds them (when they have power to enforce their own) are being taught to our young in ways that parents are powerless to block. The schools themselves have become conduits of propaganda; they are no longer midwives of critical thinking—they punish it. A nation whose children do not learn to think critically cannot long remain free.
Meanwhile, most older Americans still hold these truths: that a people incapable of self-government in their private lives, under the standards of reason and law, cannot possibly govern their public lives under those same standards; that religion is still the best guarantee that a people will recognize objective truths in the moral order, and that (as Judaism was the first to teach us) there is an undeceivable Judge before Whom all, no matter how rich or how powerful, are held accountable; and that without a religious and self-governing people the institutions of liberty are not likely to be long sustained.
Indeed, as Tocqueville pointed out, the notion that persons have inalienable rights, and are never to be taken as means but only as ends, depends on a society’s belief in personal immortality and will not long survive the perishing thereof. Some individuals may be able to convince themselves otherwise; but they stand on a trap door over nihilism, even if it is today a warm and sentimental nihilism. In a serious nihilism (by contrast), liberty is no better than slavery, and rationality no better than violence.
To be a civil place, a society depends upon having as many policemen as there are citizens—inner policemen, consciences—commanding that they respect other citizens as their Maker respects them, and as they would be respected themselves. This is the Great Commandment of civilization. Civilized peoples persuade one another through rational argument; barbarians club one another. But to form consciences in the young demands the patient attention of parents over many years. (Even a dead parent can sometimes exercise this role, as when a mother points to a picture of a father killed in the war and says to her son, he would be so proud of you—or so ashamed.)
As for the current conservative political resurgence: it will be hard enough, but still relatively easy, to break the morally stultifying power of the Washington bureaucracy over American life, and to restore the strength of those local associations that constitute civil society, the true center of gravity for any free society. And it will be hard enough, but still relatively easy, to reform the welfare system for the young and able-bodied (the War on Poverty for the elderly actually did work, if perhaps with unrealistic financing). But what will really sear our souls is the struggle to bring about the tens of millions of moral conversions that are going to be necessary if this country is to have a fourth Great Awakening.
Since intellectual and moral reawakenings cannot be mass-produced, this task must be accomplished one by one, through the voluntary choice of each. In America, it must be accomplished pluralistically, among Jews, Catholics, Protestants, and others, all of whom are—happily—united in recognizing the need for responsible self-government in private life, if the experiment of the free society is to endure.
It is entirely possible that the conservative movement—which at least is clear-sighted about the dimensions of the nation’s emergency—will temporarily fail, in an election or two. That apart, one can imagine that it will succeed in its larger purposes if not in all its concrete programs. (It is even possible that the Democrats, coming to their senses, will again lead the way.) Such success may end the worst abuses of imperial power reaching out from Washington, and reverse many currently false moral signals and perverse incentives. But the heart of the matter will remain moral and individual.
Each citizen must seize the governance of his own soul, by examining his current beliefs and behaviors in the light of objective moral truth—at least that objective moral truth which consists in recognizing the current decline both in self-discipline and in respect for others. Not many Americans will need persuasion on this point; it is too visible in the 600-percent increase in violent crime, and the 500-percent increase in births out of wedlock since 1960. The current growth of consciencelessness, if it proceeds, will both unravel civil society and render all of us unsafe. For public consciencelessness to grow, all that is required is for good people to do nothing.
Instead, many good people, of all faiths and none, have decided to emerge from their long silence and their subservience to current elites and to act, beginning with themselves. This is a hopeful sign.
Conservatives have a perverse habit of refusing to take yes for an answer. Seldom in the nation’s history have their prospects been so promising, yet large numbers of them continue to talk as if the sky were in unrelenting free fall. Conservative intellectuals in particular subscribe to the myth of declension, confusing conditions in the universities where most of them reside—and where things really are a mess—with the state of the nation at large. The national prospect is not unclouded (to suppose that it could be would be to succumb to the utopianism from which conservatives ought to be immune), but it is, all in all, brighter than at any time in recent memory.
That is because liberalism, moribund for so long, has finally seen its lifeline go flat. The Left has been intellectually incapacitated for better than two decades, but it had been so dominant for so long—essentially since the Progressive movement early in the century—that it took until November 1994 for that incapacity to be definitively reflected in politics. And that will not change back any time soon. For the foreseeable future, social democracy is dead in America. Liberals themselves understand that. They have transformed themselves from ideologues into “problem solvers.” Their preferred ploy is to insist that our current problems “transcend traditional categories of liberal and conservative.” Right. When you’re losing an argument, the natural inclination is to change the subject.
America has always been liberal in the classic sense, formed by the commonsense morality of the Anglo-Scottish Enlightenment, inspired by the prudent idealism of 1776 and 1787, wedded to a bourgeois democratic-capitalist order. The Left allowed itself to become unmoored from that during the 1960’s, and as it drifted into radicalism it left the Center to be claimed—and to some extent redefined in more conservative terms—by the Right. Today America’s traditional liberalism rests securely in conservatism’s hands. (A number of paleoconservatives would like to reinterpret the American tradition, but theirs is an America made up of marginals and exotics. Those who are uncertain as to whether the right side won the Civil War have no claim on the American future.)
Not that Panglossianism is in order. Liberals have made a shambles of things, and we will be a long time cleaning up. The universities, as noted, are thoroughly disoriented and dispirited, most of them with no clear sense of mission beyond sheer survival. So also with the mainline churches, which are sure they must remain “prophetic,” but somewhat less clear as to what to be prophetic about (especially since no one is listening in any case). The elite media remain mainly leftist in impulse, though segments of them, at least, are beginning to follow the election returns.
The worst of the liberal regime’s legacy to us—ironic because here its good intentions cannot be doubted—is the matter of race relations. No one knows what to do about the (disproportionately black) underclass—though getting rid of the present welfare system seems a necessary start—and middle-class blacks, for all the economic gains they have made, seem if anything more suspicious of whites, more persuaded of implacable racial hostility, than they were prior to the civil-rights revolution. They have firmly internalized decades of liberal claims of pervasive white racism. The available evidence (supported by personal experience) suggests that that perception is quite wrong, that it confuses racial prejudice with a refusal by whites any longer to submit to pointless and unjustified racial guilt-tripping. But the perception remains, impervious—or so it seems—to contrary evidence or professions of good will. For that and other reasons, race remains the great American dilemma.
These matters aside (admittedly no small aside), there is reason to believe that there really is such a thing as a Middle American majority and that it is determined to take back the political culture from its putative betters. Multiculturalism, moral relativism, and dyspeptic feminism may rule on the campuses, but they rule nowhere else. Most Americans do not know what postmodernism is, and to the extent they do, they are not buying it.
It is necessary to enter a caution here. Middle Americans were not simply bemused observers of the elite dabbling in cultural decadence from the 1960’s onward; they were to a not inconsiderable degree participants in it. Of all the promises of liberation from previous restraints put forward by the cultural gurus of the 60’s, the most beguiling by far was that of sexual license. Even the Puritans, after all, were tempted by the maypole. In matters ranging from indulgence in pornography to acceptance of promiscuity to approval of divorce at whim (“I’m not fulfilled”), ordinary Americans were happy to accept the permission slips the cultural liberators issued so insouciantly. They rationalized that corrupted form of toleration that says, “If you won’t judge me, I won’t judge you.” The casual smuttiness that came to dominate popular culture could not have occurred if the public had not, at the least, acquiesced in it.
But one senses that Americans have learned from their experiment in self-indulgence. We will never be Puritans again, but we do seem to have drawn back from libertinism. “Dan Quayle was right”—now widely accepted in popular judgment—serves as a summary of a certain kind of cultural turnaround.
Moral and religious values are not unraveling. They are in fact coming together—as everything from True Love Waits to Promise Keepers testifies. Have we a common culture? Only in the most rudimentary bourgeois sense. But it is enough: we are not a counterculture. As for the stability of our institutions, think of it in comparison with the 60’s and wonder at our ability to come together.
But do not think of it in terms of the immediate postwar era. That offered a unique historical moment which cannot be recaptured and which was not, for that matter, all that idyllic. (Remember Joe McCarthy and Jim Crow?)
We can only seize the day. History offers no guarantees, and human nature being what it is, there will always be evidence aplenty of cultural decay. But the good news is that the bad guys are in retreat, and for conservatives, who need always to remind themselves of the difference between realism and pessimism, that should be good news enough.
As late as the 1970’s, the black-white divide was the main source of national anxiety. By then Americans had gradually shaken down into a division between two races which were themselves loosely united in a common American culture and identity. Today, we again find ourselves, as at the turn of the century, divided among numerous ethnic identities and scattered over several cultures. Immigration and bureaucracy have between them created new “minorities”—some artificial, such as Hispanics, who are neither a racial nor a cultural nor even a linguistic category—with new cultural identities to match. The American identity is being reduced to a sort of philosophical or constitutional umbrella sheltering distinct and permanent cultural nations, including a reborn African-American nation, in all their (artificial) authenticity.
We are gradually replacing a country in which two races were held together, however unequally, by language, history, customs, memories, moral rules, a civic religion, and all the mystic chords of a rich common culture, with one in which a multiplicity of tribes coheres uneasily around a tabloid culture and two stark principles: (1) we should not physically harm one another; and (2) we should treat each other “equally.”
And this last, which originally meant legal equality among individuals, has evolved into the idea of substantive equality among groups (or cultures, or races, or whatever). It has thereby generated crude group competition for government benefits and privileges and an intrusive bureaucracy to administer the spoils.
It is this structure of subsidies and regulations which the Republican party was elected to reform last November. In doing so, however, Republicans will find themselves attacking the operational liberal concept of American identity in a welfare state: namely, government redistribution of economic resources and opportunities in order to create a sense of national solidarity among the various tribes. Now, redistribution will be accepted by the vast majority of Americans as a means of giving the deserving poor a helping hand. But when it is justified as an egalitarian device or a social bond, it fractures society, inspiring dependency in the recipients and resentment in the donors. Still, if liberal redistributionism is not the formula for American solidarity, what is? What can conservatives offer?
Utopian conservatives, a distinctive American breed, are content with the idea of America as a philosophical umbrella sheltering all who subscribe to the Declaration of Independence. But the Declaration’s principles cannot be the basis of an American identity because they do not distinguish Americans from liberal-minded foreigners. Moreover, though adherence to them is an important part of being an American, it is only a part. The principles of the American Founding are the conscious political expression of a wider and richer American culture in which liberty and equality are lived experiences rather than abstract ideas.
That wider culture is—or until recently has been—the real nursery of the American character which shapes even those Americans, from Corliss Lamont to Malcolm X, who self-consciously reject the American political tradition. It is now threatened not only by deliberate attacks such as bilingualism and multiculturalism, but also by a more insidious loss of national memory symbolized by the recent failure to celebrate V-E and V-J days with any real solemnity. To repair it must now be the main task of any serious (i.e., non-utopian) conservative.
Most contributors to this symposium would probably agree on most proposals for repair: the teaching of history which treats America as a great achievement, if inevitably a flawed one; a speech policy which aims to make English the first tongue of all Americans (while encouraging English-speakers to learn other languages); the phasing-out of official policies which divide Americans by race, ethnicity, and gender; and the replacement of multiculturalism by . . . well, by a genuine interest in other cultures for one thing, but more crucially by attending to the improvement of an enriched common culture. But there is also a more controversial requirement: a pause in immigration to allow America to digest the mass immigration of the last fifteen years and turn the newcomers into Americans. We know that this works. The 1925-65 pause transformed the heterogeneous “Second Wave” immigrants into the great American middle class.
That middle class, however, and the inclusive concept of American identity that it incarnated noticeably omitted black Americans. Indeed, just as they were about to enter it in the late 60’s following the civil-rights revolution, they were held back—by low-wage immigrant competition, easy welfare, and the temptations offered by the elite’s “permissive” culture—from full economic participation in U.S. society. That in turn produced a number of perverse social results: a black American elite that is overreliant on political avenues of progress; the growth of a disproportionately black underclass—with poisonous effects on race relations; the spread of irrationalist genocidal fantasies in the ghetto; and the growing attraction of black nationalist philosophies to prosperous as well as poor black people. All of these testify to the disaffection of many black Americans from even the most inclusive “American” identity.
Nor are whites without blame here: some conservatives, in the preference they show for hardworking immigrants, implicitly reject any duty to prefer the interests of the native-born over foreigners. They even refer to such a notion as “nativism,” adopting a concept of Americanism in which immigrants with a strong work ethic are somehow more “American” than unemployed black fellow citizens. (I write as a hard-working immigrant.) This is the dark side of Utopia.
Any concept of America which does not include black Americans as founder-members will be wounded, guilt-ridden, and incomplete. So how can public policy, especially one that eschews redistributionism, help shape a genuinely all-American identity? As long as black America is in parlous social condition, it will remain alienated. Reforms in welfare, labor, immigration, and schools that promote full black integration into the economy are therefore crucial. Achieving them will be difficult enough. But it will also fall to conservatives to influence by rhetoric and policy the evolution of an American identity more appealing to black America. As well as emphasizing the innumerable black contributions to that identity, they must also underscore what is in danger of being forgotten: namely, that the black American identity is built largely on Anglo-American, even Wasp, foundations. It owes more to the King James Bible than to the Qur’an, more to Shakespeare than to Swahili.
It is the most difficult of political tasks, requiring of Newt Gingrich and his colleagues subtlety and imagination of a high order. But then, as the British Labor politician Denis Healey said in another context: if you can’t ride two horses at once, you shouldn’t be in the bloody circus.
Perhaps the only reason for doubting that the national prospect is bleak is the incredible ability of the country to recover from its crises. After all, we did come out of the Great Depression of the 30’s; we did fight a successful war, though we were politically and militarily unprepared; and we did get out of the Vietnam doldrums.
However, the symptoms of the current crisis are different, though familiar: increasing racial polarization; a permanent underclass on welfare; more young, unmarried mothers; rampant crime; lowering of standards; loss of respect for law and authority; the demagogic use of the idea of multiculturalism. On the whole, there has been a growth of a trendy radicalism in the academy, the media, and among professional people who think of themselves as liberals.
As we know, there has been a not surprising backlash on the part of Middle America, which carried the Republicans to power in the recent congressional elections. The fact is that America has been divided, culturally and politically, into what might be called the ordinary population and the radical, liberal sector of the more educated classes. These are not solid blocs, but, generally, average citizens have held on to traditional values of work, family, and religion, though they have been somewhat infected by the fashionable relativism and hedonism. The new pseudo-intelligentsia, on the other hand, has absorbed all the faddish beliefs and causes, including gay liberation, radical feminism, absolute relativism, contempt for tradition, and the general emphasis on rights over responsibilities.
All this is recognized by people who have not been ideologically blinded. But two questions remain. What are the causes? And what is the cure? The cause, I am afraid, is not clear. To be sure, the general population puts a brake, for itself, on wild and destructive notions, but it cannot counter the influence of the new radicalism. At most, the very anti-intellectualism of the country has made it somewhat resistant to hothouse ideas.
However, this new mindless radicalism is more difficult to fathom. I find it baffling that after the demise of worldwide Communism—and socialism—there is, instead of a rethinking of all the old political questions, a new kind of leftist orthodoxy. The ethos today is not Communist, though there is a certain amount of empty neo-Marxism. But every other kind of modish movement is in the air, in self-designated advanced circles. The only explanation that occurs to me is that since the French Revolution, to be on the Left, regardless of any other considerations, has been thought to be morally and politically superior in presumably enlightened sectors. As a result, what we have is not the traditional Left, but a varied movement of radical extremisms.
In some quarters, the Republican landslide is taken as a sign of political renewal. But except for the possible cutting down of affirmative action and a reform of the welfare system—whose ultimate shape is still not clear—the prospect of a salutary political change is not yet evident. So far, most of the conservative responses to the prevailing radical chic have been rhetorical. Besides, the Republicans have their own agenda, which is not necessarily a cure for the current malaise. And the pro-life movement does not enhance family values, nor does it reduce teenage pregnancy or single motherhood. It is unfortunately true that the ethos of American society cannot be changed by verbal appeals to family values, the work ethic, and religion. Pep talks go unheard by those responsible for the social breakdown. And even the liberal pacesetters do not read the opposition press.
Only a complete transformation of the culture can, in my opinion, reverse the downward trend of American society. But, we must ask, how is this to be accomplished? How is a culture to be changed when it is upheld by those responsible for its decline? There is, in fact, a dumbing-down, a lowering of consciousness, in a large part of our intellectual life.
If we are to find a reason for optimism, perhaps, as some people believe, the politically-correct culture will run its course. Or perhaps the magical recuperative powers of the country will provide the cure. But the question is whether this will be possible after such an enormous breakdown.
So far, there has been a reservoir of common sense in the American people that has avoided catastrophic extremes, and there seems to be a natural pull toward the political Center—though, to complicate matters further, the Center itself is not always the most desirable position. In addition, our culture in the past has tended to be pragmatic, and not given, except at the fringes, to far-out theories and ideologies.
Of course, the miracle of recovery may not come. The closest parallel to the present situation is in the Stalinized 30’s. In some respects, the situation then was even graver, for the corrosive counterculture was reinforced by the large political weight of the Soviet Union and its tremendous propaganda machine.
On the other hand, the academy and the media were not as extensively infected as they now are. And many of the more serious and gifted writers and intellectuals made up a counterforce to the influence of the Communists. Today, there is no longer an equivalent large body of independent intellectuals, for the postmodern intelligentsia has been to a great extent homogenized and absorbed into the larger culture. Perhaps the only thing left is for uninfected writers and intellectuals to close ranks and make a bold effort to renew the culture as a whole.
But if the situation is not improved, if the media and academic culture continue to divide the country, there is a danger of a further backlash. As it is, there is considerable disillusionment with government, with the country’s growing racial polarization, with the welfare system. And if neither the Democrats nor the Republicans offer some basic solutions, there is the threat of the public turning to a demagogue waiting to exploit the failure to solve the political and cultural problems in the customary way.
Should such a tragic outcome occur, we would have to blame primarily the careerism and the irresponsibility of the politically-correct new radicalism.
In my view, the United States in recent decades has had the misfortune of acquiring a large and vociferous intelligentsia, not unlike that which brought such calamity to Russia. Its distinguishing qualities are the belief that it knows better than the “people” what is good for them, and that the government, even if democratically elected, does not truly represent the nation and its interests.
In today’s United States, intellectuals of this persuasion exercise a disproportionate influence on the media, which allows them to spread a mood of self-doubt: they project their own discontent on the population at large. In fact, however, they are fairly isolated and speak largely for themselves. Since 1972, when they managed to capture the leadership of the Democratic party, they have progressively marginalized it by making it the captive of special-interest groups and of an outdated ideology. They have thus allowed the Republicans to seize the high ground of a truly national party. The November 1994 congressional elections were, I believe, a watershed: they mark a revulsion against the liberal values promoted by the intelligentsia.
I find that our political and economic institutions function reasonably well. True, there is a great deal wrong with the executive and legislative branches of government, and the cynicism of the voters about them is not unwarranted. We probably have fewer public-minded officials today than 50 years ago. More people enter government service to enrich themselves, which the constant growth of the share of the GDP under government control makes possible. But such matters are always relative. We have a more corrupt government than those of Britain, Germany, or the Scandinavian countries. But ours is far sounder than the governments of Italy or Japan. Our elections do reflect the people’s will whenever the people feel strongly.
Two institutions give cause for worry.
The judicial system seems to favor the rich and powerful to a greater degree than in the past. The fact that Senator Edward Kennedy never had to stand trial after being personally involved in a fatal accident with all kinds of unsavory aspects casts a shadow on our legal system. The same holds true of Clark Clifford, who was implicated in a major financial scandal and yet got away scot-free. The O.J. Simpson trial has been a travesty of justice.
And then there is the educational establishment, which I happen to know best.
Our institutions of higher learning and advanced research have no peer in the world. The universities on the European continent have never quite recovered from the twin disasters of Nazism and Communism. The lead which U.S. universities have secured in science and technology is not threatened. Nevertheless, there are dark clouds overhead.
Our secondary-school system is in shambles. It does not perform its proper function of preparing youth either for citizenship or for higher education. The mission which in the past had been performed by secondary schools is now left to the colleges, which means that the level of higher education is being lowered. In contacts with Harvard freshmen, presumably the cream of the country’s high-school graduates, I have been struck by their cultural rootlessness. Interviews with applicants for my freshman seminar reveal that they are almost totally unfamiliar with the world’s great literature: apart from Dostoevsky’s Crime and Punishment, which they read as a thriller, and Flaubert’s Madame Bovary, apparently the staple of advanced French-language courses, they have read none of the literary classics. Montaigne, Dickens, Tolstoy, George Eliot, Thomas Mann, Chekhov are mere names to them—if that. They are very quick to learn. But their cultural background is a vast void which is fleetingly filled with TV shows and advertising slogans, comics, and movies. I find this very depressing.
The other troubling factor in our educational system is its relentless politicization. This takes two forms. One is the familiar political correctness, a regime of intellectual conformity which requires professors to adhere, actively and passively, to a code of propriety as rigid as were Victorian standards of discourse on matters of sex. It particularly affects everything that touches on gender and race. But it also extends to more recondite subjects. For example, it is taboo to relate the behavior of human beings to that of animals, and to explain it in terms of inborn qualities (instincts). The guardians of correctness require that all or (if they are more reasonable) virtually all of human behavior be depicted as learned, because this means that people can be infinitely molded by indoctrination and social pressure. Edward O. Wilson of Harvard, one of the founders of sociobiology, has suffered not only verbal abuse but physical harassment for his heretical views. Western civilization must not be praised. Literary courses concentrate on such subjects as female oppression in novels. The list can be continued endlessly.
The politicization of universities also expresses itself in pressures exerted on faculties by administrations to hire women and minorities at the expense of white males. To some extent, this is fair because it redresses a gross imbalance: I recall when some 30 years ago I recommended a female student for an opening at an Ivy League university and was bluntly told, “We do not hire women.” No one says today, “We do not hire white men,” because we are not as blunt: but that comes close to the reality. Recently a female professor told me that her husband had been turned down for tenure at his university despite an excellent record and high recommendations. She attributed it to discrimination, which she accepted as a fact of life: a “white heterosexual male” stands little chance, she explained. I have heard this view from others, too. I find it an outrage. I also fear that such blatant discrimination will have a very debilitating effect on our higher education.
When I arrived in the United States from Europe in 1940, I landed in a small college in the Midwest. War was distant but fast approaching. I recall the college president saying in a private conversation that the United States might not be able to meet the challenges it faced because it had grown “soft.” Events disproved his fear. I believe the United States today is sound and can face the future with confidence because it has a unique capacity for introspection and renovation. The intellectuals’ Weltschmerz, fortunately, reflects only their own personal Welt and their own personal Schmerz.
In 1970 I was a junior at the University of California, Santa Barbara. In February of that year, an anthropology instructor by the name of Bill Allen was denied tenure, much to the dismay of hundreds of idealistic Anthro-1 students who not only admired Allen’s fieldwork with South American Indians but were understandably grateful for the A’s he allowed them to grant themselves in his course.
A petition drive to retain Allen was launched, which led to a rally on campus, which led to a torch-lit march on Isla Vista, the student neighborhood next to the campus. Before the night was through, protesters-turned-rioters had burned down the local branch of the Bank of America. One fellow I knew took pictures that night, and he turned one of them into a popular poster that showed the charred remains of the bank next to its still-standing sign, above which was superimposed: “Don’t Bank on America.”
Allen was already forgotten when more riots erupted in Isla Vista two months later, and again two months after that, each suppressed by what campus opinion angrily denounced as police brutality. In keeping with the logic of the times, many students and professors decided to boycott the university, and a number of classes and lectures were moved to apartments and church halls in Isla Vista itself. My Russian history survey convened at St. Mark’s student center, where one classmate reveled in our assigned reading of The Possessed. “It’s happening here, man,” he explained, his clueless eyes bright with excitement.
It is hard to imagine any public institution in the United States today going through a similar patch of pampered, heedless idiocy. I like to think that my old university—like the U.S. generally—proved remarkably resilient 25 years ago. But at what cost? The cultural and political poisons released during that time continue to debilitate every aspect of American life.
Almost all the categories listed by the editors as problems today were in place by the late 1960’s, beginning with the rejection of authority and the movement toward multiculturalism and racial polarization, trends that went hand in hand with the Marxian notion that our country was destined to face an ever-widening gap between rich and poor. (Even the newer debates over immigration are rooted, at least in part, in the affirmative-action policies of the no-longer-color-blind late 60’s.) It goes without saying that these views are pushed by those least likely to be moved by indications that the U.S. has provided more freedom and more opportunity to more people than any polity in history.
Of course, commercial democracy’s inability to defend its achievements is well documented. (Serves it right for coopting so much of the 60’s counterculture in the first place.) By any measure, our country today is infinitely richer materially than it was in 1970, let alone 1945, and it continues to undergo the sort of technological advances that made our victory in the cold war almost anticlimactic. All this has been accompanied by the social and intellectual breakdown set in motion during the 60’s, the never-ending rise of popular culture and cheap entertainments to primacy in our daily life, and the further descent of our intellectual and media elites into celebrity, chronic adolescence, and other forms of self-absorption. Meanwhile, much as segregation scarred the American psyche during the late 50’s and early 60’s, so the destruction of the black family and the murderousness of inner-city life wear on our national psyche today. The civil-rights movement arose to strike down segregation; there has been no comparable response to the blight of urban black life. As the recent welfare debate suggests, government can perpetuate the underclass; it cannot even begin to eradicate it.
Fred Barnes has written that the conservative resurgence is being led, in the House at least, by politicians who never had anything to do with the 60’s counterculture—indeed, they have made their careers openly opposing it. Their rise vindicates the longstanding views of the majority of the American electorate that never embraced that culture even as they could not escape it. Nonetheless, politics cannot redeem our culture, and fortunately it is not being asked to. Budgetary matters already have top priority, if only because of a growing understanding that generational demographics will require increasingly prudent public spending. There is not likely to be any major rethinking of the welfare state’s obligations, but through various forms of tinkering and decontrol, government will inevitably accede to, if not encourage, private alternatives to the services it so inefficiently provides.
Because it has no choice but to promise less, government will help defuse the politicized debates that have rocked the nation for more than a quarter-century. The easy decadence that comes with prosperity will remain—America’s genius has always been that it does not try to change the human condition but adjusts to it—but we will have to turn to something other than politics to solve our deeper problems.
“This is not the country my father fought for,” a one-time colleague who grew up as an Army brat was telling me over lunch five years ago. He sang a threnody of national faults, and I could only hang my head in mute agreement—crime, multiculturalism, educational collapse, everything conservatives have worried over and fought against for twenty years or more.
He grew more and more excited. From multiculturalism he began talking about the threat posed by immigrants, and from that threat to the threat posed by native-born blacks. As he was taken over by his passion and imagined me an ally in it, he began dropping words into his monologue that in his calmer moments he never would have used with me, words like “nigger” and “wetback” I had heard used only in rages and then only maybe twice before outside of a movie or TV show. And then, forgetting himself entirely, he allowed as how Jews were blocking the true story of our national decline.
It is not only inconvenient to hear words you might have spoken coming out of the mouth of a racist, nativist anti-Semite. It is also a reminder that ideas you hold dear may be used as weapons in a war you never intended to fight—a war in which those weapons may be turned against you just as my one-time colleague turned his assault on multiculturalism into an assault on Jews.
This is my warning as we consider the national prospect. Those who believe America is in a period of cultural decline are obviously correct; I am not at all sure how anyone of good will could argue otherwise.
And yet, and yet, and yet. It is one thing to worry over and battle against the dumbing-down of our schools; the assault on taste, standards, and truth posed by multiculturalism; the rise of repellent sexual egalitarianism; even the increasingly worrisome dangers of advanced consumerism.
But it is quite another thing to make the leap from that point to the notion that the nation itself is in parlous and irreversible decline. After all, nations are always in parlous moral health; nations are gatherings of people, and people are sinners. When the United States was putatively healthier, back in the 30’s and 40’s and 50’s, 12 percent of its population was living in de-facto or de-jure immiseration and the Wasp majority protected its position in the elite by means of explicit quotas and exclusions.
The declinists are both wrong and spiritually noxious. After all, the purpose of declaring the nation in decline is to root out the causes of the decline, extirpate them, and put the nation on the road to health. But, for some of them, the search for causes always leads to blacks, immigrants, and Jews. In William Faulkner’s The Sound and the Fury, Harvard’s own Quentin Compson finds himself suicidal over America’s conversion into the “land of the kike home of the wop.”
Blacks and Jews are ever the inevitable, juicy target—so inevitable that they still find a link in the fevered minds of the paleo-Right, even though all that blacks and Jews have in common now is the way the paleo-Right links them.
What blacks, Jews, and immigrants always seem to lack in the eyes of declinists is some version of the American character—that which my one-time colleague believed his father to have fought for. The dark underbelly of the American political experiment is the very idea of an American character itself. It is, fundamentally, an un-American idea. It is the nature of America that there is no one American character. Demography is not destiny in America as it is everywhere else; where you come from is not who you are.
I can find no quarrel with the bill of particulars offered by the declinists. But their central idea gives heart and strength to people whose threnodies can sound like the song of the siren—and must, like the siren’s song, be resisted by all strong men.
Reflecting on your statement, I come up with good news and bad news.
The good news begins with the American political system. If anything, my confidence in the stability of that system has been strengthened in recent years. Our political institutions weathered the ferocious assault on their legitimacy launched by the Left in the 60’s and they now show every sign of adjusting nicely to the revolution (or what I prefer to call the counterrevolution) being led by the Right.
In this particular sphere the counterrevolution represents an effort to correct the imbalances created by the Left over the past 50 years in the distribution of power between the states and the central government, and between the legislatures and the courts. My guess is that this effort will succeed to a significant extent, though I suspect that it will make less difference than some on the Right imagine.
There are pundits who detect instability in the disaffection of many voters—the “angry white males” who have become the contemptuous liberal stereotype du jour—and in all the speculation about independent candidacies and third parties. I do not agree. In my view, these phenomena are themselves part of the revolt against big government and against the Democrats as the party of big government. As such they are a symptom not of instability but of flexibility.
To continue with the good news, my confidence in the American economic system has also grown stronger in recent years. Once upon a time, most of my fellow intellectuals everywhere in the world took it more or less for granted that American capitalism was either unviable or unjust or both, and that it was therefore doomed to lurch from crisis to crisis until it finally collapsed. I myself never went that far, but neither was I a great booster of free enterprise and free markets and free trade. Then, about 25 years ago, my eyes were finally opened to the wonders of American capitalism, and when I look around me today I still see what I began seeing then in spite of the dust that keeps being thrown in the air to blind us all to the simple truth. Indeed, for all the apocalyptic rhetoric of American economic decline, the United States is still, and shows every sign of remaining, rich beyond the dreams of avarice. And for all the talk about “increased economic and social stratification,” prosperity is still more widely shared here than anywhere else.
Nor is there any indication that Americans in general, as opposed to intellectuals on the Left, are seriously stricken with class envy and resentment. In the past, such passions (as observers from Tocqueville on down consistently noted) were much weaker here than they were in Europe, and they still are. As for the persistence of poverty, by now everyone knows, though few are yet willing to say so in public, that this is a problem largely confined to the largely black underclass. And more and more people are coming to understand that the “root causes” of the underclass condition are in any case not economic.
Which brings me to the bad news—the moral and cultural sphere. By contrast to what I feel about the political and economic realms, my confidence here has been very severely shaken in recent years, and not just with respect to the underclass.
In one of his novels, Benjamin Disraeli famously described the rich and the poor in Victorian England as “Two nations, between whom there is no intercourse and no sympathy; . . . who are formed by a different breeding; are fed by a different food, are ordered by different manners, and are not governed by the same laws. . . .” So it is in America today where morality and culture are concerned. We too are two nations. One of them is mostly made up of middle-class people still bound by the traditional norms that only yesterday were accepted by virtually all Americans and enforced by law and by custom alike. The other consists of an odd combination of groups who live at both ends of the social and economic scale and who—in very different ways, for very different reasons, and with very different consequences—have been liberated from those traditional norms.
The two American nations are often said to be in a state of war, and it is easy to see why. Passions on both sides run high, with enough hatred and enough fear in each camp to fuel not just a metaphoric civil war but a real one. I am not predicting the outbreak of armed hostilities, but on the other hand, I find it difficult to envisage the terms of a treaty that would usher in an era of harmonious coexistence.
To anyone like myself who regards the realm of morality and culture as the lifeblood of both the polity and the economy, this split of the American people into two nations is far more threatening to the national prospect than any strictly political or economic problems, however serious, that we may have.
Will the counterrevolution, which is for all practical purposes the political arm of the traditionalist nation, settle the war decisively in its favor? I doubt it. In spite of the lurid paranoid fantasies of the Left, the counterrevolution does not possess the power—or, in my opinion, the ruthlessness—to sweep away everything the liberationist nation so aggressively imposed upon the traditionalist nation when it was in the saddle: affirmative action, feminism, gay rights, and multiculturalism, along with (to borrow a phrase from a not-unrelated context) their emanations and penumbras.
Still, speaking as a refugee from the liberationist nation who has become a patriotic citizen of the traditionalist nation, I hope that the counterrevolution will at least manage to hold the line against liberationist expansionism, containing it within the enclaves it has already conquered. With any luck this will be enough to keep the polity stable and the economy vibrant; and, with a little more luck, it could even, like the foreign-policy strategy on which it is modeled, just conceivably lead to victory in the end.
This is the second time in recent memory in which serious questions have been raised about the future of the American system. In 1968, of course, the evidence of crisis was, superficially at least, much more compelling than is the case today. University campuses were in flames, the cities were swept by race riots, the war effort was effectively subverted, drugs were in wide use and their mind-expanding effects were made the object of outright celebration. Capitalism itself came under attack; the American economic model was condemned as an engine of injustice and was therefore scorned as a pattern for the rest of the world. All this, and much, much more, took place during a period of unprecedented prosperity, robust growth, near-full employment, an explosion in career opportunities for the young, and a revolution in the legal and, to a somewhat lesser extent, economic status of blacks.
The anti-American case failed, rejected by the American people and repudiated by events. Yet today we find ourselves engaged in many of the same debates and fighting many of the same battles. There is, however, a crucial difference: those who previously pressed the anti-American case from outside the institutions of power now occupy positions of influence within government, the universities, and other institutions which they once denounced as reactionary and unresponsive.
The consequences have been most visible, and catastrophic, in the related areas of race relations and what might be called multicultural policies. The premise that American society is permeated by an intractable institutional racism, a controversial and generally rejected notion when first posited 30 years ago, has been adopted as state doctrine and has led to a degree of government coercion never experienced by Americans except in wartime. The state routinely selects winners and losers on the basis of skin color, and sometimes gender and nationality as well. Our laws now provide economic opportunities to immigrants from places like Korea and Peru which are denied to displaced white blue-collar workers from places like Akron and Detroit. Urban schools spend millions on bilingual education, a politically motivated enterprise with dubious educational merit, while laying off math and history teachers and abandoning practically all programs in the arts.
Unlike subsidies to wheat farmers or aeronautics corporations, race policy is highly intrusive: every American living in a multiracial environment experiences its effects—in the schools, the workplace, the delivery of municipal services, the housing market.
Americans are, furthermore, well aware that many policies have been retained despite their obvious failure. One can hardly think of a state action which has contributed more, in a brutally direct way, to the decline of a cherished institution than the impact of busing on urban public education. Yet today busing schemes are being proposed all over America, despite overwhelming evidence that the policy has actually impeded integration in many communities.
Americans are bewildered when policies that are unpopular and unsuccessful are perpetuated, even for the most high-minded of goals. Nor are they oblivious to the suspension of critical judgment on standards, crime, personal responsibility, and similar concerns which our contemporary race regulations have engendered.
Finally, the federal government’s ill-conceived venture into social engineering has had the effect of validating that most un-American of principles: group rights. Americans despise the idea of dividing resources along racial, ethnic, and gender lines. But if government changes the rules, Americans inevitably will make their accommodations. They will seek out their own victim’s niche. Or they will simply opt out of institutions they regard as tainted. Common sense alone would suggest that a government which apportions benefits by quota will inevitably alienate those who do not enjoy protected status.
Nor will the beneficiaries of preferences be satisfied. Blacks already resent the inclusion of Hispanics and Asians in affirmative-action programs; Hispanics and Asians are likewise convinced that blacks pull political strings to get more than their fair share.
Obviously, race policy is not exclusively, or even primarily, responsible for our troubled condition. Yet it is difficult to think of another issue which has exercised so decisive an influence in delegitimizing the hallowed American concept of government as servant of all the people.
It goes without saying that the source of much of our current discontent is economic uncertainty. Capitalism’s antipathy to traditional values has been much commented on; that we are in the midst of a transition from one economic era to another certainly accentuates the normal market disruption of established patterns of life.
Ultimately, however, the market may prove more a unifying than a divisive force. Just as the market punishes sloth and corruption in the economy, so it can be expected to weed out some of the wrong-headed ideas which have come to dominate our political culture. The remorseless logic of international competition has already generated much of the momentum toward a revival of educational standards. The realities of today’s economic environment have also contributed to a new emphasis on learning and the work ethic, and a pronounced shift within black America away from expectations of government help.
This brings us to the conservative ascendancy. Unlike the Left, conservatives clearly want America to prevail in today’s world of global economic rivalry, just as conservatives wanted America to win the cold war. Conservatives are actually willing to try new solutions to problems which others consider intractable—crime, welfare, family breakdown. In this respect, last year’s election results are yet another sign of the enduring American capacity for change and self-correction.
But if conservatives have profited from their intellectual vitality, they could easily be thwarted by a tendency to evade the constellation of economic problems which are contributing to a sense of American unease and decline. Conservatives do themselves a disservice by acting as if problems do not exist or by trying to prove, through doubtful statistics, that middle-class fears are unfounded. Liberalism was punished for a policy of denial on issues like race, crime, and the poor; something similar could well happen to conservatives who fail to speak with candor and understanding to those Americans who are the casualties of economic change.
There is, furthermore, a strain of conservatism which fuels a sense of alienation and aggrievement among whites as surely as Jesse Jackson solicits a sense of victimization among minorities. Should conservatism and, more generally, the Republican party become identified as the party of Waco, Ruby Ridge, and the right to bear semi-automatic weapons, they will suffer the consequences as surely as did liberals obsessed by the CIA, Iran-contra, and the national-security state.
Those of us who were adults in 1945 would agree that it is a radically different America today. Some of the changes are good, some are more illusory than real, and almost none is centrally related to the “conservative resurgence.”
Some of the differences between now and then reflect the fact that the changes in the quarter-century after World War II were so wildly dramatic. The American economy promised to spiral upward toward the day when everyone would become an increasingly endowed member of the middle class. Family income increased by about 80 percent in those 25 years. That pace has slowed considerably.
But while many young, middle-class Americans may be put out that their fortunes are not increasing as rapidly as did their parents’, most do not wake up in the morning concerned about where their next personal computer is coming from. Any McDonaldization of wages has not prevented disposable income from edging upward. And, of course, those who are completely left out or displaced by the economic restructuring would tend to welcome the kind of government intervention which is not on the conservative agenda.
At the beginning of this half-century, the values of freedom and human rights were also uniquely and dramatically triumphant. We had just beaten the Nazis and were conducting a successful cold war against the other evil empire. And the causes of democracy and human rights were resurgent at home as well. We were the civil-rights generation, reversing in law and in practice the centuries-old oppression of blacks—as well as reversing the repressive immigration law of the 1920’s and 1930’s, and instituting massive social-welfare programs, largely for the rehabilitation of those previously disadvantaged.
There are many second thoughts about some of those programs, but most Americans are not, as is so often suggested, in a backlash mood, ready to reinstitute racism or to chop down the safety net. Although the fringe racists in America are more expressive and violent than they were in the postwar period, they are not more numerous by any count, and are rejected by almost all Americans. Most people were always offended by the quotalike programs now under attack, but very few are interested in rolling back bedrock civil rights—or the substantial black middle class which has developed. That is not what the popular conservative mood is about.
Furthermore, the majority of Americans say they are not as interested in spending less money on social welfare programs as they are in getting rid of programs which have not worked. In 1992, more than nine out of ten surveyed Americans said that making people self-sufficient is more important than cutting costs. According to a 1994 survey summarized by the National Opinion Research Center, “What the public wants is not less spending but spending that works.”
In other words, it’s not the money, it’s the principle of the thing. Sure, everyone would be pleased if the government spent no more than it took in, but most Americans are not primarily worried about the national deficit. When the membership of the Christian Coalition was asked in September of this year to identify the most important issue facing the nation, fewer than one out of ten named either the budget deficit or taxes. Most of them said that the essential issue facing the country is its “moral decline,” and that is what people in general say. When I asked my sixteen-year-old grandson what was missing in his generation that he imagined had been present in mine, he said, “boundaries.”
The conservative impulse in America today is primarily driven neither by a mean-spirited desire to suppress or hoard money on the backs of the less fortunate nor by some tightly wound conservative ideology about the economy or government. Philip Converse once defined such an ideology or belief system as “a configuration of ideas and attitudes in which the elements are bound together by . . . functional interdependence”; and he documented the fact that such comprehensive, integrated, political-belief systems are restricted to a tenth of the population, at most.
The new rank-and-file conservatives express a diversity of gripes, but their central, common-denominator reaction is to the sharp change in prevailing standards and expectations. Large numbers of mainstream Americans have been disturbed—not for the first time in our history—by an apparent loss of their “way of life,” by a sense that society is losing its cultural and moral moorings. They see this deterioration in terms of crime, incivility, drug use, family breakdown, illegitimacy, and other forms of social irresponsibility. They see the pathologies of the welfare problem as expressive of such deterioration. And they see the government and political leadership as having been complicit in these problems.
It would be a mistake to think that this more conservative rank and file has been hooked on the comprehensive ideology of any formal conservative movement or of the Republican party. Even after what they wrought at the polls in November 1994, conservatives have not expressed real confidence that the Republican party can deliver what they want. And, insofar as changing America’s cultural direction is concerned, they are probably right.
There are too many Americans who have drifted too far from their roots and traditional sources of value. No government or political party can by itself reverse that drift. The culturally conservative mood was not created by a conservative political movement; more likely, it was the other way around.
But that growing mood, in a still benign and productive America, is a hopeful sign of natural regeneration. If one believes that there is a basic human need for standards, for the familial and communal structures which give meaning to life, then one can believe that the mood will grow. A conservative political movement can help around the edges, as long as it does not confuse that mood with its own ideological system.
What kind of future is in store for the United States? I alternate between pessimism and optimism on different days of the week. There are days when I think that the nation is going downhill at full speed, and there are days when I think that the promise of America is undiminished. This uncertainty may be due to schizophrenia, but I think it more likely that the prospects are indeed mixed and that it yet remains for us American citizens to determine which future lies ahead.
It is easy to feel down, given the explosion of crime, violence, incivility, intolerance, and other indicators of societal dysfunction over the past half-century. Having come of age in the 1950’s, I am still angered when I see men and teenagers refuse to yield their seats on a crowded subway to a pregnant or elderly woman; I am still distressed by the varieties of pornography that are freely available today; I still treasure the idea that people should be judged as individuals, not as members of racial and ethnic groups; I am still appalled by professors who claim that they have a duty or right to indoctrinate students with their own views.
Yet there are days when I remind myself how lucky I am right now to live in the United States. A trip abroad usually serves as a useful reminder of the openness, energy, and fluidity of American society, as well as of a distinctive American common culture that is easier to discern from a distance. Despite the many harmful trends, there remain plenty of reasons to believe that American society has the resilience and dynamism to absorb newcomers and to sustain a vigorous democratic culture. One sees it in the emerging patchwork quilt of urban multiculturalism that appeals across ethnic groups, creating a new kind of American universalism. One sees it in the successful racial integration of most sectors of American society, to an extent that was unimaginable in the 1950’s. One sees it in the immigrant youngsters who are taking the academic world by storm, refusing to be indoctrinated with admiration for the dictatorships their parents fled, and filing lawsuits against racial quotas that exclude them from educational opportunities. I do not know which of these visions will prevail, but it seems to me that the public policies of the past 30 years have mainly promoted fraying rather than cohering.
Which brings us to the question of whether the recent conservative resurgence is likely to arrest or reverse the negative trends.
My own sense is that it will not, for several reasons. First, the conservative resurgence actually began in 1968 with the election of Richard Nixon. (Both Jimmy Carter and Bill Clinton were, when chosen, considered the most conservative Democrat in the race.) This means that for nearly 30 years, conservative Republican Presidents were unable to alter the culture, to reverse the tide of entitlement programs, or even to leave a decisive stamp on the judiciary (seven of today’s Supreme Court Justices were selected by Republican Presidents, but only four are consistently conservative in their judicial philosophy). And supposedly conservative Democratic Presidents were quickly captured by the statist-bureaucratic wing of the party in Congress.
Second, even if conservative Republicans are able to rewrite federal legislation in fundamental ways, there may be little correspondence between federal programs and some of the egregious behaviors that are tearing the social fabric (e.g., violence, out-of-wedlock births, drugs, family dissolution). There may be no necessary connection between electoral results and such phenomena as the divorce rate, the illegitimacy rate, and the culture of violence (especially in view of Republicans’ antagonism to gun control). The answers, alas, may not lie in Washington.
Last, the current conservative resurgence is burdened by the self-righteousness of some conservatives. It is sometimes hard to tell the difference between the liberals’ bureaucratic nanny state and the conservatives’ censorious nanny state; both aim to use the federal government to control what people may do in their private lives. On issues like abortion and homosexuality, the best Republican statement came from Governor William Weld of Massachusetts, who said at the 1992 convention, “I want the government out of my pocketbook and out of my bedroom.” But Weld is exactly the kind of Republican who is out of favor with the ascendant wing of the party. Furthermore, the conservatives’ efforts to reform the federal role—a long-needed, politically difficult, and intellectually demanding change—will be hampered or even waylaid by the appearance of indifference to suffering. As Republicans discard ineffective programs, they must give careful thought to what takes their place to avoid hurting those who are most vulnerable.
But despite my doubts about the ability of politicians to save us from ourselves, I remain hopeful about American society, not least because the election of 1994 has forced everyone to review old assumptions and to think anew in defining problems and solutions. That is a sign of a healthy, dynamic society.
One of the greatest ironies of contemporary American culture is that the forces splintering it to the point of near dissolution are the very forces that defined it in the first place. If there ever was a sense of a common culture and a clear democratic purpose—and I think there was, even among groups who may have felt marginal—it was because a set of fundamental principles helped define this country: that, as a nation of immigrants, opportunity should be free of considerations of caste and class, and that no individual should be unduly constrained by inherited custom or obligation. These principles have been powerful enough to undo even the fault line created by American slavery, in an extended and painful reparation that is as unique in world culture as the sin of slavery is common.
Yet what has happened? Somehow, each of these ideas has been turned into a cartoon, distended and twisted so that the very structure of American society is now at risk. The goal of a common culture created out of many cultures is rejected; there is no longer any notion of a transcendent community which binds together diverse ethnicities. Instead we have a multicultural battleground of factions, each group claiming separate and unequal territory, rejecting the very goal of shared ground or overarching perspectives. Immigration, which was the traditional way in which America transformed itself, with new versions of the world becoming grafted onto the old, is now often seen as a threat, because there seems to be no stable center in the host culture. Meanwhile, opportunity under the law has been changed into litigious use of the law to serve group or private ends.
America’s great democratic traditions have now met their nemeses. The distrust of inherited distinction has led to a distrust of all distinction. The promise of possibility has been replaced by the expectation of delivery. The continuing reinvention of the present has left us with little sense of a past. And the issue of race, which once threatened to undermine the idealism latent in the American political vision, is now again looming large over attempts to reconstitute that vision.
In addition, the tradition of arts and letters has become so frayed that contemporary cultural life has been effectively cut off from its deepest and richest roots. American culture is now widely treated as something no more than twenty years old, designed for consumers barely that age. American culture, always known for its youth and freshness, even in the ways in which it adopted and transformed European models, has now turned youth and freshness into a fetish. There is even a desire to discard all distinctions between popular entertainment and art works with more profound ambitions.
Diversity, equality, democracy—all noble and important ideas—are thus misapplied and exaggerated. It is as if the coherence and weight of American life had dissolved, allowing our greatest ideas to fly madly about in the air, colliding with one another. The steadying forces in the American experiment in national construction seem to have been removed.
In recent years, the argument has been made again and again that capitalism must bear the responsibility for this contemporary disarray. The free market, in this view, has created an unquenchable desire for wealth, caused a disintegration of the nuclear family and traditional communities, replaced the values of religion with the values of acquisition. But as we know from Max Weber, the ideas behind capitalism—of labor, investment, and deferred pleasure—can also have the opposite implications. There is no necessary connection between capitalism and cultural dissolution; capitalism (an increasingly vague and inaccurate term, in any case) can even become a force for social and cultural cohesion.
In fact, the problem in America is not capitalism, but the culture in which it flourishes and which gives shape to its goals. This is a culture dependent on immediacy; often thin, lacking resonance; and, as we have too often seen, easily shattered.
Many of this century’s political arguments have been attempts, often misguided, to provide some ground for American society, some sense of stability and depth. One dominant approach, during the past generation, has been to try to create a marketplace that will provide some social weight and substance.
Here, in part, lie the origins of the efforts to impose upon corporations a set of regulations, obligations, and notions of community. But meanwhile private and cultural life was set loose in the opposite direction, treated in libertarian fashion. The result has been a hastening of the disintegration of the American character, as fewer and fewer restrictions are placed upon its restless desires. Despite those artificial attempts to raise the “social consciousness” of institutions, fewer and fewer have, in the event, served as counterballast to the centrifugal force of dissolution.
The conservative political resurgence in recent years has, in part, been an attempt precisely to reverse this dynamic, to reestablish a centripetal tug, insisting upon ideas of tradition, obligation, and community in home and civic life, while allowing the marketplace to be the arena for libertarian activity. I am heartened by some aspects of this revisionism, but also skeptical about its restorative power.
It has taken a long time for the American character to lose its foundation and it will take even longer to restore it. How can ideas like tradition and community be established and encouraged without artifice? How can the free market flourish while being grounded in a sober, more contained culture? How can the arts and sciences once again be regarded as sources of knowledge? And how can elitism be turned into a virtue in a democratic society? Those are the challenges as we near the century’s end.
“Political stability and indestructible intrinsic unity” are two of the Soviet Union’s “major distinguishing features,” Leonid Brezhnev declared in 1977. Not long after he uttered these words, all sorts of trouble began to appear: the masters of the Kremlin started an unwinnable war in Afghanistan; an engineer pushed the wrong button on the console of a nuclear-power plant in Chernobyl; dissidents sprouted everywhere like mushrooms after spring rain. In short order, the twelve-time-zone empire over which Brezhnev had presided—serene and self-confident in his undemocratic purposes—rapidly disintegrated and disappeared.
Meanwhile, at about the same time, in another superpower across the sea, a frightening picture was being sketched:
Our country moves agonizingly, aimlessly, almost helplessly into one of the most dangerous and disorderly periods in history . . . our economy careens, whiplashed from one extreme to another. Earlier this year, inflation skyrocketed to its highest levels in more than a century; weeks later, the economy plummeted, suffering its steepest slide on record. . . . Manufacturing plants lie idle across the country. The hopes and aspirations of our people are being smothered.
These were not the words of some neo-Marxist economics professor tenured at a Princeton or a Yale. Rather, they came from an impeccably respectable source, one near to the bosom of conservatism itself. But after they appeared—in the preamble of the Republican-party platform of 1980—the United States under Jimmy Carter did not slide, as the preamble’s authors feared, “irretrievably into an abyss.” Instead, along came the Reagan administration and the seven fat years.
One might, from these two cases, formulate an iron law: the prospect that a civilization will decline and fall is inversely proportional to the quotient of pessimism that prevails. Optimism—so often a synonym for hubris—signals a clear and present danger; pessimism, with the searching self-criticism that accompanies it, vitality and strength. Keeping this law and its corollaries in mind, one asks: is the United States today, fifteen years after the terrible crisis of the Carter administration, heading inexorably toward balkanization and collapse?
There is no blinking, of course, the many grave problems that afflict the United States, ranging from the multiculturalists loose in our schools to the murderers loose on our streets. There is also an increasingly widespread recognition that remedies for these afflictions will not be easily found, that the afflictions themselves grow from a root that cannot be extirpated by government, and that even the most intelligently crafted programs and policies of a Republican-dominated Congress cannot begin to address our ills.
Yet, ironically, our very difficulties in devising solutions are themselves a noteworthy indication that our country is getting well. One need only contrast the extravagant hopes invested in the War on Poverty and the Great Society with the urban rubble such programs left in their wake, to understand that a sense of limits is a disposition that tends toward social health.
If the American prospect is to be happily fulfilled, a most crucial task is fostering a temperament of restraint, dampening enthusiasm and encouraging realism about what politics can and cannot achieve. We should not, of course, look to Washington for aid in this endeavor. Though we can expect the Republicans to shear off some of the more hideous excrescences on the face of the compassion-state, they cannot be counted on for much beyond that. In the grip of their electoral successes, and with the vast sums of the federal treasury and the Archimedean levers of the federal government in their hands, the Republicans will easily be seduced by ambitions that extend far beyond undoing the harm which liberal optimism has wrought. “Why not aspire to build a real Jurassic Park?” asks Newt Gingrich in his book, To Renew America. “It may not be at all impossible, you know.”
President Clinton, for his part, having felt the sting of the voters’ November 1994 electoral rebuke, began virtually overnight to speak a new language, calling for limited government, and endorsing such measures as a balanced budget and prayer in the public schools. None of this will convince anyone, least of all Clinton’s own supporters within the pitiable remnants of the Democratic party, that he has undergone any sort of political conversion or change of heart. All he has done is to take one more step in his animating lifelong quest “to maintain my political viability within the system,” as he once put it in his now-famous letter explaining why, after he had received the high and lucky selective-service lottery number 311, he would not be signing up for the University of Arkansas ROTC after all.
The President’s peregrinations, cynical though they may be, have an illuminating and cheering aspect nonetheless. Despite lamentation from the pundits about gridlock and demosclerosis, the haste with which Clinton has attempted to conceal his tracks and shift to the center forcefully reminds us of our political system’s unimpaired resilience and how very far we are from the ossification which led the Soviet empire to break apart.
Though our salvation surely does not lie in Washington, neither does our ruin. With the great crises and urgent dangers of the cold war behind us, politics is receding in importance; what a Clinton or a Gingrich says or does matters considerably less than it did when a Roosevelt or a Rayburn was in charge. At a moment, moreover, when Serbs, Croats, and Muslims are slaughtering one another like wolves, balkanization is hardly a word that applies to our noxious but rather more pasteurized variety of ethnic strife. And if one considers all the dark days our country has faced in this century, and how many great challenges, both foreign and domestic, we have successfully overcome, fears of “unraveling” and impending “breakdown” seem terribly misplaced.
There is reason to hope, therefore, that the iron law of pessimism will continue to hold. The very fact that thoughtful people are entertaining such dire prospects for the United States is already a positive sign. If we continue to engage in spirited soul-searching about our many genuinely worrisome and intractable woes, the corrective mechanism of public opinion can be set in motion. Indeed, an intolerance of our ailments and of those answerable for them—a welcome and desirable intolerance—is visibly gathering steam. One day soon we shall discover that our current anxieties, like those we suffered during the Carter years, have served us remarkably well, and the modestly better future of which we dream has quietly and imperceptibly arrived.
Irwin M. Stelzer
Oh, woe. All is lost. America’s decline, long predicted by its intellectual class, is not only clear to all, but is accelerating. So we are told.
America’s privileged classes (the story goes) are increasingly withdrawing from society, eschewing its public schools, parks, and police in favor of private ones. The middle class is being ground between the upper stone of cheap foreign labor and the nether stone of corporate downsizing. The poor, meanwhile, are increasingly with us, breeding future generations forever dependent on welfare handouts and the ill-gotten proceeds of muggings and drug dealing.
Meanwhile, we are also warned, America’s culture becomes increasingly violent, vulgar, and venal. To top it all off, the collapse of the Soviet Union is making it possible (in the words of Lester Thurow) for “capitalists . . . to employ more ruthless approaches to getting maximum profits without worrying about political pressure.”
What is troublesome about this tale of disarray and decline is that it represents a broad consensus: from both leftish Robert Kuttner and neoconservative icon Irving Kristol we hear about the death of the American Dream. From both Robert Reich and Charles Murray we hear about the dangers of our homegrown variant of Latin America’s highly stratified society, with the very rich and the very poor separated, not by a healthy middle class but by sturdy fences woven of money and intellect. From both nervous trade-union leaders and satisfied business executives we hear about the threat, or, depending on the point of view, the promise of low-cost Indian computer programmers and educated Vietnamese willing to work for a dollar a day.
Fortunately, this consensual whining is like a bad pointillist painting, concocted by a committee: each point is brilliantly conceived and placed on the canvas, but the entire picture cannot survive critical scrutiny.
Most Americans do know, although they may be loath to admit it, that talk of the death of the American Dream is self-flagellating nonsense. They may tell pollsters that they are worried about the future, theirs and their children’s, but they are buying homes and cars in record numbers, enrolling their children in institutions of (so-called) higher learning that they complain they cannot afford, and borrowing money in amounts that only people confident of their ability to repay debt out of rising incomes would contemplate.
Anyone who leaves Cambridge, Washington, and New York long enough to see the recreational vehicles parked in the driveways of quite modest homes in Colorado, or the new so-called “edge cities” springing up around Phoenix, or the smiling faces of those who have fled high-tax, big-government California for the more congenial environs of Las Vegas, knows that the declinists are projecting from limited data bases, indeed. They are also ignoring developments that do not support their gloomy theses. Consider these:
- When income figures are adjusted for inflation and changes in family size, they show that average family income has risen since the early 1970’s; and real spendable income per capita rose 45 percent between 1970 and 1993. Indeed, as Karl Zinsmeister shows in a recent issue of the American Enterprise, “. . . today’s baby-boomers are, on average, about two-thirds better off than their parents were at the same age.”
- The American economy is once again the world’s most competitive, earning that ranking because of its basic strength, its management expertise, and the depth and efficiency of its capital markets. America is increasingly dominant in new technologies like computers and telecommunications, and many of the world’s major corporations, like Mercedes-Benz and Toyota, are choosing it as the preferred location for their new plants. Most significant, if American jobs were vanishing, the unemployment rate could not be at its current, irreducible level.
- The American economy continues to grow, sometimes rapidly, sometimes slowly. But grow it does, and without the inflation that has in the past accompanied rapid growth. The so-called “soft landing” is, in reality, continued gradual ascent to new heights of affluence.
- America remains the destination of choice for the world’s immigrants. This, of course, upsets nativists who confuse the corrupting effects of nonassimilationist multiculturalism and premature eligibility for welfare benefits with the enriching effects of an inflow of people seeking a better life, and willing to work for it.
- Americans are living longer, healthier lives than did their parents, and their children are likely to live longer still. This may frighten those who worry about the ability of the Social Security and medical systems to bear the cost of this wondrous development, but a nation worried that too many of its people are living too long, because medical science is advancing at a staggering pace, is a nation in search of a worry.
- Americans are safer in their homes and streets than they have been for a long while. Crime rates are declining so rapidly that even New York City’s neighborhood-watch cadres are having difficulty finding recruits.
- The nation’s culture may be in sad disrepair in the view of its intellectual elite, but it also offers more people more access to ballets, operas, classical films, history lessons, and other allegedly elevating products than ever before.
- The increasing differential between the incomes of skilled and unskilled workers is producing just the effect one would expect: an upgrading of skills. Whereas only 40 percent of high-school graduates enrolled in college in 1960, more than 66 percent do now.
But these facts are less important than the most crucial development of recent years: the shattering of liberalism’s control of the levers of government, and the retreat from the failed policies that have created the problems that both Left and Right bemoan. In its place we have a new consensus, one that reflects broad agreement as to policies that can correct the flaws that do exist in American society. We now know that we must remove incentives to undesirable behavior, such as out-of-wedlock births. We now know that government has become bloated and intrusive, and must be whittled down to some more manageable size. We now know that our fiscal house must be put in order. We now know that better training and education is the answer to the increasing skill differential that is driving the income-distribution trends that some find worrying. We now know that criminality, no matter how seemingly petty, must be punished certainly and severely.
The remaining debate is about details. Important issues, but nowhere near as important as the fact that we now have broad agreement as to the direction in which the nation must move. That consensus should give heart at least to those declinists whose views are based on something other than America-hatred, and allow them to sleep peacefully, dreaming again the exceptional American Dream.
In 1945 the American national project was a liberal project, just as America’s political culture was liberal. That it has now unraveled seems incontestable, and the reason obvious: liberalism has run its course. The confidence with which postwar liberals advanced their vision of a tolerant, pluralistic but nonetheless broadly common culture is a thing of the past. When in the 60’s the liberal establishment proved unwilling to defend its own understanding of America against the forces of antinomianism, all else became inevitable, even predictable. The ruling class of the 60’s has been succeeded by a new class shaped by Vietnam and racked by self-doubt. Multiculturalism in its fullest, most Orwellian sense is the natural consequence of this self-doubt. No elite can long afford to lose its nerve.
Of more immediate concern, however, is the unexpected failure of the conservative movement to fill the vacuum of cultural leadership. Perhaps one should not be surprised that no Republican presidential candidate to date seems to have the slightest idea of what is at stake (except, ironically, for Pat Buchanan, whose approach to the crisis is a burlesque but who at least sees that there is a crisis and has some notion of its wellsprings). But I am struck by the comparable slowness with which many conservative intellectuals have come to grips with the fact that there is a Kulturkampf afoot, and that it is the defining political reality of the post-Soviet era.
The problem is that American conservatism has to a large extent ceased to be an idea-generating enterprise and turned into a purely political movement, feeding off its own intellectual capital. The philosophical “big tent” under which the Right has huddled for the last quarter-century consists primarily of classical liberalism, a body of ideas peculiarly ill-suited to providing direction in a time of cultural disintegration. Nor will this disintegration be reversed by the kind of lowest-common-denominator libertarianism that is gaining sway among the members of Generation X: it will, in fact, be accelerated. New thinking is necessary, and time is short.
The most important task facing conservative intellectuals today is to develop a responsible consensus position on culture around which the largest possible number of conservatives, neoconservatives, and libertarians can unite. This will be far more difficult than was the formulation in the 60’s of the “fusion” conservatism that made possible the election of Ronald Reagan; the underlying differences are greater. Yet it is no less critical.
What is missing from the present-day American political scene is a Ronald Reagan of culture: a “great communicator” who can dramatize the perils of balkanization, present an affirmative vision of America’s common culture, and thereby lead the way back from the brink. But such politicians do not forge their own philosophies—they borrow them from the work of intellectuals. To listen to the current crop of presidential candidates is to realize how completely we, for our part, have fallen down on the job.
R. Emmett Tyrrell, Jr.
No, I do not share the pessimists’ view that America is staggering toward balkanization, and I am not even all that pessimistic about the condition of American institutions.
I do think that in the spirit of good will toward men it would be a nice touch for some of us to fly over to Moscow and head out to those retirement homes where the cold war’s KGB agents now roost and cheer them up. No doubt a bottle of good Tennessee bourbon would do the job. They have reason to be proud of their achievement. They were masterful at assisting and encouraging the world’s left-wing intellectuals and intellectualoids in picking all the scabs of American society until we developed some very deep scars. No doubt we are more divided along racial lines than a few generations back when there was far less rancor. And the same can be said for other ethnic polarizations. I doubt there is the class animosity of a few generations back, but now we have gender rivalry and antisocial pathology raised to the false estate of enlightened thought.
Politics could do something about racial discrimination and poverty, but it can do nothing about men and women who do not like their human nature or biology. A sex change never goes far enough. As for the universities’ practice of conferring the mantle of enlightened thought upon the shrieks and grumbles of malcontents, there simply are not enough psychiatrists to treat the faculties of all the nation’s universities. Universities have never been essential to American life, according to the Heritage Foundation’s Adam Meyerson, and I shall take his word for it.
Ironically, the Left, which set out some three decades ago to bring all of America into a new sense of community, has only managed to divide us and to dream of new ways to divide us. America’s bountifulness and sense of fair play assuaged the class struggle. Now we have these other struggles, feminism, black militancy, the gay movement. All, of course, divide us and destroy community, as do diversity and multiculturalism—capriciously defined as they are. There is not much we can do about these pests. Yet I am optimistic. The leaders of all the above-mentioned centrifugal forces spinning away from our Republic’s original shared values are au fond frauds. Show me one leader of feminism, militant homosexuality, civil rights, or any of these so-called movements who deeply believes in the justice of the cause. Show me one who is not prospering handsomely in all his bellyaching. They are no longer serious about the “revolution.” They are less destructive.
These are not Gandhis burning with indignation and injurious ideas. These are not even Communist agitators working the mob with an eye to power. These are Jesse Jacksons diverted by thoughts about that swell estate for sale on 100 acres in temperate climes, and Gloria Steinems, now sadder, wiser, and closer to dotage, summering on Martha’s Vineyard. To be sure, these humbugs have nearly ruined many useful institutions. The universities are, at best, yesteryear’s high schools. The judicial system is fit for daytime television—in fact, one case has been there for almost a year. Government is an anarchist’s dream. All this puts me in mind of a favorite rule of mine: whatever institution the liberal reformer comes to dominate eventually loses all sense of its fundamental purpose and most of its ability to function. The university? City government? The judiciary?
Yet there is hope. The very fraudulence of the Left—if that is the term for these humbugs—leaves them otiose and conflicted. They have done about as much damage as they could. Moreover, in their mountebankery they have heaped upon their sacred causes policies that are loathsome to ordinary Americans and preposterous. They have no substantial following. At home, abroad, and even in lands once the hope of world socialism, the meretricious allure of the Left has vanished. It is worrisome that our institutions are in disrepair, but they can be salvaged. Values govern, and the sound values of our heritage are even now reviving us.
I know history never repeats itself, but it does approximate itself; and by the next century the old Republic will have passed through a period of reform that returns the cities to democracy and to Madison’s conception of republican virtue or an approximation of it. The rest of the country will follow. In fact the rest of the country is already almost there. Prosperity will be widespread. Americans will be doing what the Greeks advised centuries ago, cultivating the virtues—fitfully.
The premise is strange. Just who are the many observers who observe these many things unraveling? They sound like the observers (mostly conservatives) of the 1970’s who announced that America had lost its nerve, and those of the 1980’s who said America was in decline. Wrong thrice.
I do not remember America in the 1950’s as so serene or so terrific. It was less democratic, much poorer; more socially stratified, more racially polarized, and more anti-Semitic than now.
Immigration is not “unchecked,” as “many observers” say. Each year we take in about 800,000 legal immigrants and between 200,000 and 300,000 illegals, while about 150,000 people emigrate. That is a net gain of about a million immigrants per year, which is like one couple entering a ballroom with 500 people in it. The rate is about one-quarter what it was in the early part of this century. The “unravelers” must also reexamine their charge of balkanization when blacks, whites, Latinos, Asians, Jews, and Gentiles are intermarrying at rates never seen before. How is it that no one talks about the melting pot when it is melting at record speed? Where is Israel Zangwill now that we need him?
We are also doing rather well economically. Rhetoric to the contrary notwithstanding, the middle class has moved ahead over the last 25, 15, and 5 years. And the U.S. remains the number-one geopolitical, economic, cultural, military, scientific, educational, and diplomatic power in the world. We have more global influence than any nation in history, and we promote good ideas: democracy, markets, individualism, pluralism, often successfully.
We do have some very serious problems, located largely in the realm of social values. If America founders, it will be on values, not economics.
The emergence of the new liberal mindset placed in motion the “something-for-nothing” state that allowed crime without much punishment, welfare without much work, educational advancement without much study, and preference without much merit. In 1992, candidate Bill Clinton was quite right to pledge, endlessly, “no more something for nothing.” Accordingly, we have to restore punishment to crime, restore disincentives to welfare, restore serious standards to schools, restore sufficient merit to preference. (A great deal of the “inequality” apparent in American income distribution is due to the corrosion of values, i.e., the stunning rise in the number and rate of female-headed households.)
The restoration is proceeding apace, driven by Republicans, with some help from Clinton (finally) and even from parts of the great mess that was once the Democratic party. The end of welfare as we know it may actually happen.
Much, not all, of the social damage has come from government policies. What government caused, government can cure. What politics caused, politics can cure. What liberals caused, conservatives can cure. What liberals have caused, liberals can possibly cure, if they change. It is all doable. The image of squeezing the toothpaste back in the tube is inappropriate. Think rather of taking a wrong fork in the road, returning to the intersection, and starting down a better road to a better place.
We have grown accustomed to pouring scorn on our politicians. But there are times when politics, and probably only politics, can cure what ails us. We forget that there are times when politics can become magical. “Many observers” will probably miss it, but just wait a while, and watch.
For as long as I’ve been thinking about these things, I’ve thought it appropriate that the national anthem (or, to be precise, the first verse of it, which is what everyone sings, or tries to) ends with a question: does the “star-spangled banner still wave” over a home of freedom, made possible by a courageous people?
What Francis Scott Key had in mind, of course, was the survival of the United States as an independent nation: which, from the vantage point of a prisoner aboard a British man-of-war bombarding Baltimore on the night of September 13-14, 1814, was no sure thing. Nor has it been a sure thing since. The physical survival of the United States as an independent nation was at issue in the “Sunken Road” at Antietam and when the 20th Maine and the 15th Alabama fought for Gettysburg’s Little Round Top; it was at stake in the bloody contest with Hitler and Tojo; it was quite literally put at risk in the nuclear duel with the Soviet Union.
Democracies, however, face issues of moral, as well as physical, survival. And if we shift the focus of Key’s question in the political-cultural direction, we see that on at least two occasions within living memory—during the Great Depression, and during the classic period of the civil-rights struggle (1956-64)—it was not at all clear, until the crisis was resolved, that the “land of the free and the home of the brave” would not implode. History tests America’s character, not simply its military defenses.
And that history illustrates two points: first, that the United States is not a settled business, but rather an ongoing experiment whose outcome is never finally secure; and second, that the immediate post-World War II period was something of an aberration—a kind of pleasant interlude amid continuing crisis.
Many of our fellow citizens hoped, indeed expected, that the post-cold-war period would be a similar time-out, a return to normalcy. But the end of the cold war helped bring a long-simmering cultural crisis to the surface of American public life, and the politics of that crisis have been exceptionally fierce. The reason why is not terribly hard to grasp.
The cultural crisis of American public life today engages diametrically opposed understandings of the human person, human community, and human destiny. Each camp considers itself the bearer of an orthodoxy, a “true teaching”; both are infallibilists with regard to their core doctrines; neither is much given to compromise. The two camps can be described in various ways, but a simple, accurate description of them would run something like this: in thinking about the national prospect and the prospects of individual American citizens, one camp instinctively reaches for the language of rights and laws, while the other instinctively speaks of rights and wrongs.
From my point of view, it is the camp of rights and wrongs—intellectually muddled and stylistically off-putting as it can be—that allows us, if not optimism, then at least a prudent hope for the future of the United States.
For this camp understands that democracy is not a machine that can run of itself. This camp understands that you cannot have a democracy without a certain critical mass of democrats. This camp understands that “democrats” are people who have internalized a set of habits—virtues, they used to be called—that make self-governance possible. In sum: this camp understands that you cannot have a self-governing republic without self-governing citizens: citizens who are governed from within by a moral law which teaches them self-command and duty to others.
The other camp—the camp of rights and laws—denies much (and, in some instances, all) of this. It has detached rights from obligations. It thinks, not of the person (i.e., of human beings created with intelligence and free will, and thus with capacities for wisdom and virtue), but of the individual: the Self, auto-constructed without reference to any significant communities or moral relationships. It is incapable of conceiving the importance of civil society to democracy because it cannot imagine entering into the kind of self-denying moral obligations to others that make the mediating institutions of civil society possible.
The gravest threat to the national prospect in the aftermath of the Communist crack-up is this ideology of the Imperial Autonomous Self. Its most lethal social impact to date has come in the urban underclass, where the behavioral margin of error is much narrower than among the children of affluence. But we cannot expect that the chaos now on display in parts of our society where the writ of the law has simply ceased to run will be forever confined to those asphalt combat zones. The chaos may take different forms in suburbia, or in small-town America. But should the philosophy of those Nike ads—“Just do it”—prevail, then we will know that we have lost the United States.
Yet something else is stirring in America these days, something that looks like a revolt against the autonomy project, a rebellion against the Nike ethos. It makes the front pages of the papers because of the activism of religious conservatives. But, intriguingly enough, I often find it on campus and among young professionals (especially—imagine!—lawyers). Some call it the beginnings of another Great Awakening. If that is what it is, it is going to be a different one this time around, because it will be thoroughly ecumenical (involving Roman Catholics as well as evangelical and fundamentalist Protestants) and interreligious (involving Jews as well as Christians).
I frankly do not know whether my grandchildren and great-grandchildren will live in a United States that is in moral and cultural continuity with its origins. If they do—if the contributors to COMMENTARY’s 100th-anniversary symposium on the national prospect can answer Francis Scott Key’s question in the affirmative—it will be because such a Great Awakening has renewed the religious and moral foundations of the American experiment in ordered liberty.
James Q. Wilson
“It is the best of times, it is the worst of times; it is the age of wisdom, it is the age of foolishness; it is the epoch of belief, it is the epoch of incredulity; it is the season of light, it is the season of darkness; it is the spring of hope, it is the winter of despair; we have everything before us, we have nothing before us. . . .”
Americans have the ability to hold contrary beliefs to an extent that would have astonished even Charles Dickens. We despair of our politicians and revere our country; we are optimistic about our own destiny but pessimistic about that of the nation; we yearn for bold leaders and are cynical about those who step forward; we are rationalists who are drawn to revealed religion and pragmatists who entertain all manner of conspiracy theories; we love our children but think the younger generation rotten; we believe our own child is doing well in school but that schools are failing everybody else’s children; we want a smaller, less expensive government without any reduction in the benefits government now supplies; we jealously protect our rights and lament the decline of responsibility.
After the French Revolution of which Dickens wrote, Tocqueville explained that it had come about, not because everyone lacked hope, but because hope had been kindled and its promise too slowly realized. Today, most of us have not merely the hope but enjoy the reality of a degree of comfort, freedom, and peace unparalleled in human history. And we can’t stop complaining about it. The revolution of rising expectations has taken us all prisoner.
We want it all but get much less. Our economy is more robust than that of almost any other nation—but real wage rates are stagnant and savings are low. Our property is safer here than it would be in many European cities—but our lives are more at risk. Our politicians are infinitely solicitous of the slightest tendency in public opinion—and so they appear to be self-serving careerists. Most people like the jobs at which they work and the communities in which they live—but an underclass of persistent size and angry cynicism fills parts of our large cities. We have struck down almost every vestige of legal segregation and left no occupation closed to African-Americans—and the two races eye each other suspiciously. We have spent more money per person on health and education than almost any other nation on earth—and our doctors feel harassed, our patients neglected, our educators cheated, and our students confined. Our universities are the envy of the world and lodestones for students from every nation—and the graduates are unable to satisfy their employers that they know how to write. People line up to see the beauty of Aladdin and the heroism of Apollo 13—and the mindless violence and prurient vulgarity of exploitative television and cinema. We live longer and better lives, with less risk of disease, and in a vastly cleaner and more salubrious environment—and we twitch in panic at every report of some imaginary or exaggerated threat: Alar, radon, asbestos, breast implants, global warming.
We wanted to create a free, prosperous society, and we did, only to discover that some people misuse their freedom and expect prosperity without effort. We were determined to care for the elderly, only to discover that we selected methods that will in time bankrupt us. With somewhat less determination we set about fashioning a safety net for the disadvantaged, only to learn that it is much harder than we had imagined to help people without changing them, often in ways we do not like.
It is the best of times because we here have gone further than the people of almost any other land in giving to mankind what it wants—freedom, prosperity, and opportunity. It is the worst of times because we have learned that freedom has a cost in licentiousness and predation; that prosperity no matter how great never satisfies our wants or eliminates our envy; and that the promise of equal opportunity will be heard by some to mean the assurance of equality of result.
How could it have been otherwise? There is no plan or program that could have produced a much different result because there is no plan or program that can resolve the contradictions of human nature. We have made the mistake of vastly exaggerating our capacity to solve every problem by converting good intentions into satisfactory results.
In time, perhaps, we shall learn to live with our mixed blessings and recall, with Immanuel Kant, that “out of timber so crooked as that from which man is made nothing entirely straight can be built.”
Ruth R. Wisse
The Yiddish expression zindik nisht—literally, “do not sin”—warns complainers not to wallow in their disappointment lest they forfeit the good that is still there and theirs to enjoy. In that spirit, I offer several examples of our relative political well-being:
Admittedly, Americans could not experience the same surge of confidence upon the fall of the Soviet Union as they felt at the end of World War II. The protracted struggle against Communism did not result in any instantaneous triumph; the political ruin of Russia confounded our hopes for its foreseeable improvement; and because this country did not totally mobilize against Communism as it did against the Nazis, Americans could not feel a corresponding unity in this second victory. In fact, as one horror sequence after another emerges from the Soviet archives, we shudder to recall that until recently whole factions of our elites preached moral equivalence between our two competing views of social justice, some even favoring the great socialist experiment over ours.
But democratic capitalism’s political victory over Communism remains the story of the century, and those who did engage in that critical struggle should never stop explaining its importance. Unlike dictatorship, democratic culture takes a very long time to mature, and can only be cultivated from the ground up, through centuries of self-disciplining education. Each new generation is required to conserve the country’s institutions and laws, and to train the next in its civic duties. That the process is precarious we realize anew with each immigrant wave and baby boom, but that precariousness is the necessary condition of a free society. Our strength and weakness are one and the same: our political life at any moment can only be as robust as the generation at its helm. But how much easier it should now be to appreciate our less-than-perfect system with its incremental improvements when we compare it with the catastrophic legacy of revolutionary socialism!
America remains impressive. The Gingrich Revolution is badly named. Not a revolution at all, but the kind of healthful corrective that this political system was designed to produce, the election of a conservative-tending Congress is helping to restore confidence (my confidence, anyway) in the ability of people to govern themselves. I heard Gingrich on MTV explaining that student loans have to be paid for, either by the individual who takes the loan, or else, less equitably, by taxpayers who do not reap its benefits. With pedagogic authority and tact, he persuaded his student audience to assume responsibility for itself sooner rather than later, and to stop pretending that any benefit can be “free.” This is the kind of patient argument that has to be made. Many younger conservatives now realize it is not enough to govern; they will have to demonstrate that what conservatism conserves in America is liberal democracy, and that the Republic will only be able to secure its freedoms in perpetuity if it can cultivate a conserving impulse in each of its citizens.
The status of anti-Semitism in contemporary America seems to me yet another sign of the health of our polity. Although it might appear to be only a special interest of Jews, anti-Semitism can be an objective test of political sobriety. Opposition to Jews has been such a convenient tool of “the politics of rage” in this century that one fully expected it to be used here by demagogic politicians on the Left and Right. True to form, some disaffected black leaders and white populists have been trying to explain to their respective constituencies that the Jews are responsible for their sufferings, and that usurpation of Jewish positions is the quickest way to more money and power. When we consider how much anti-Jewish money and malice Arab governments and their surrogates have pumped into the American system since the early 1970’s, and how much success they have enjoyed in the universities and in the media, we should find it the more remarkable that no national politician of recent years has been able to make anti-Semitism work for him at the polls. And without such political success, there can be no political anti-Semitism of the European or Middle Eastern variety.
While anti-Semitism is elsewhere on the rise again as an ideological force—most debilitating to the political confidence of Israelis—it has failed to inflame the American spirit. American politicians who want to move from the fringes into the mainstream have had to mute their anti-Jewish rhetoric, whether, like Jesse Jackson, they appeal to disaffected minorities of the Democratic party’s left wing or, like Pat Robertson and Patrick Buchanan, they appeal to the Christianity and nativism of the Republican Right. This is not to minimize the Jew-baiting that persists, but to make the important distinction between prejudice and the politics of destruction. Given that anti-Semitism generally fails as a tool of American politics, students of politics would do well to analyze the success that this “failure” signifies.
By emphasizing the positive, I do not overlook all that is wrong. America is at present reaping the consequence of the movement beginning 30 years ago that turned adolescence into an ideology. Many of our current epidemics—drugs, family breakdown, racial distrust, civic irresponsibility—were promoted as features of liberation, and injected into the system from the elites downward. Worst was the disrespect for America’s achievement, the inability to appreciate the fruits of freedom even as they were being exploited.
The editors’ question about the national prospect hinges, I believe, on this matter of appreciation. No human society can be perfect, but the more a society believes in perfectibility, the more it will be plagued by its imperfections. What, then, are to be the sources of our daily satisfaction and national pride? How does the “system” teach our children to say thanks for their good fortune, to consolidate their achievements, to take real pleasure in the permanent tension of democratic politics?
Those who first imagined America as a Christian country trusted in God and gave their thanks to Him. Religion provided the context and the language for spiritual satisfaction that politics alone is powerless to grant. Before we agonize further over our decline, we have to find a way of blessing what is already in our hands.
Of 1945 I have no first-hand experience. So I will begin with 1984, when I first arrived in Washington, D.C. to take up work at the Heritage Foundation’s Policy Review. As I recall, three books had recently appeared on the scene: The Liberal Crack-Up, Losing Ground, and Window of Opportunity. The first, by R. Emmett Tyrrell, Jr. of the American Spectator, was about the political ideology animating post-1960’s “liberalism”: intolerance, coercion, and racialism. In a word, illiberalism. The second, by the social scientist Charles Murray, described how the jewel of programmatic liberalism, the Great Society’s various initiatives to end poverty, had put in place perverse incentives, making the condition of the poor worse, not better. The third was by a relatively undistinguished third-term Congressman from Georgia, Newt Gingrich. If memory serves, he laid out a plan for a Republican takeover of Congress, along with an ideology known as the “opportunity society.”
I did not remain in Washington long enough to see what would happen, returning only recently to work at the Public Interest. But the authors were prescient: post-1960’s liberalism as a political ideology, and as a set of policy initiatives, is indeed in the final stages of completely unraveling, leaving a great deal in disrepair. And Newt Gingrich, now the most powerful man in town, has a plan to put things back into good working order: the Contract With America. Whether he will succeed or not depends partly upon how bad things are and, next, upon the character of the Republican counterrevolution.
One would have to be a political naïf to argue that the worst is behind us. The work of some of the best social scientists indicates that crime will get worse, and more violent; that 30 years of well-intentioned welfare policies have created a generation of Americans who cannot do without government support; that the social capital necessary to support a thriving economy and a healthy liberal democracy is dwindling fast. And there are other signs of serious social decay (my source here is William J. Bennett’s The Index of Leading Cultural Indicators): since 1960, illegitimate birth rates rose 400 percent, the percentage of families headed by a single parent tripled, the divorce rate doubled, SAT scores dropped 73 points while television viewing increased, etc.
It is too early to say how the conservative political resurgence will affect these worrisome trends. Some are susceptible of political solutions, others not. We can reduce the incidence of violent crime by cracking down on repeat offenders. Similarly, if we have not discovered how to end poverty (who has?), we have learned that ill-conceived welfare programs do more harm than good. Additionally, budgets can be balanced, federalism recalibrated, tax laws made more family-friendly, abortion sharply curtailed. And then there are problems that seem to be beyond the reach of politics, at least in the short term. When considering such things as an increasingly vulgar culture, one’s hopes for the future are chastened by Rousseau’s warning that “censorship can be useful for preserving mores, but never for reestablishing them.” Still, if good mores in contemporary America are in retreat, they have not yet vanished, and special praise is thus due to religious conservatives who have risked the calumny of the media and the professoriate—and even of their libertarian allies—for insisting that politics not be an amoral or a “neutral” enterprise.
But none of this means that friends of conservatism need be mindless cheerleaders of conservatism. The French philosopher Alain Finkielkraut has pointed out that in the 18th and 19th centuries, opponents of the Enlightenment, although “relentless defenders of the past,” were “in spite of themselves . . . inventors of something new.” In their attempt to beat a path back to the old-time religion, they instead broke new ground, discovering culture, history, and the subconscious. American conservatism’s counterrevolution could also inadvertently push us farther forward.
This has not happened yet; but American conservatism has adopted some of the Left’s bad habits. For example, groups on the Left have for decades bitterly attacked law-enforcement agencies as threats to American liberties, and charged that other government agencies have been involved in various conspiracies, from spreading the AIDS virus among African-Americans to assassinating John F. Kennedy. What a surprise to discover that some conservatives are not immune to this sort of mass paranoia and poisonous anti-government sentiment. Waco and Ruby Ridge (where mistakes were made) are becoming conservative shorthand, as My Lai was for liberals, for a corrupt, evil American government. Jeane J. Kirkpatrick gave a name to this reflex: blaming America first.
Similarly, for decades liberal jurists have been rewriting the U.S. Constitution in light of various “progressive” notions of justice. What a surprise to discover that conservatives too would like to rewrite the Constitution (though by more proper means), from offering a balanced-budget amendment to instituting term limits for Congressmen to drafting a religious-freedom amendment. Some of these measures are justified by their supporters as corrections of past liberal excesses, and they have a point. Yet one wonders whether such constitutional reforms will not further attenuate popular respect for the Constitution, ultimately preparing the ground for something quite new.
Finally, the Left has of late been enamored of the notion that human beings are defined by their racial and ethnic heritages. As the jargon has it, humans are “situated selves” for whom transcendence of tribe, race, or culture is impossible. What a surprise that, in their jeremiads against immigration, an increasing number of conservatives are making the same argument. True, our immigration policies need redrafting, but in doing so the Right need not join the Left in rejecting the nation’s historic commitment to the ideals of assimilation and the melting pot.
Perhaps it was inevitable that conservatives would be influenced by some of the intellectual currents that have swept through the dominant liberal culture. Indeed, many of America’s most intractable problems—weakened families, a lack of consensus on values, an enervated civil society—are in some degree inevitable so long as we remain “moderns.” The cause for alarm, then, is not the existence of the problems but how quickly they developed and worsened from the 1960’s onward. The rapidity of this deterioration was surely not inevitable and is surely to some extent reversible. Beyond that, the American conservative temperament is, I trust, more willing than its liberal counterpart to live with a few contradictions and imperfections, and less tempted to implement utopian solutions.
It can be said that the Book of Samuel launched the American Revolution. Though antagonistic to traditional faith, Thomas Paine understood that it was not Montesquieu or Locke who was inscribed on the hearts of his fellow Americans. Paine’s pamphlet Common Sense is a biblical argument against British monarchy, drawing largely on the text of Samuel.
Today, of course, universal biblical literacy no longer exists in America, and sophisticated arguments from Scripture are all too rare. It is therefore all the more distressing when public intellectuals, academics, or religious leaders engage in clumsy acts of exegesis and political argumentation by comparing characters in the Book of Samuel to modern political leaders. The most common victim of this tendency has been the central character in the Book of Samuel: King David.
Most recently, this tendency was made manifest in the writings of Dennis Prager. In a recent defense of his own praise of President Trump, Prager wrote that “as a religious Jew, I learned from the Bible that God himself chose morally compromised individuals to accomplish some greater good. Think of King David, who had a man killed in order to cover up the adultery he committed with the man’s wife.” Prager similarly argued that those who refuse to vote for a politician whose positions are correct but whose personal life is immoral “must think God was pretty flawed in voting for King David.”
Prager’s invocation of King David was presaged on the left two decades ago. The records of the Clinton Presidential Library reveal that at the height of the Lewinsky scandal, an email from Dartmouth professor Susannah Heschel made its way into the inbox of an administration policy adviser with a similar comparison: “From the perspective of Jewish history, we have to ask how Jews can condemn President Clinton’s behavior as immoral, when we exalt King David? King David had Batsheva’s husband, Uriah, murdered. While David was condemned and punished, he was never thrown off the throne of Israel. On the contrary, he is exalted in our Jewish memory as the unifier of Israel.”
One can make the case for supporting politicians who have significant moral flaws. Indeed, America’s political system is founded on an awareness of the profound tendency to sinfulness not only of its citizens but also of its statesmen. “If men were angels, no government would be necessary,” James Madison informs us in the Federalist. At the same time, anyone who compares King David to the flawed leaders of our own age reveals a profound misunderstanding of the essential nature of David’s greatness. David was not chosen by God despite his moral failings; rather, David’s failings are the lens that reveal his true greatness. It is in the wake of his sins that David emerges as the paradigmatic penitent, whose quest for atonement is utterly unlike that of any other character in the Bible, and perhaps in the history of the world.
While the precise nature of David’s sins is debated in the Talmud, there is no question that they are profound. Yet it is in comparing David to other faltering figures—in the Bible or today—that the comparison falls flat. This point is stressed by the very Jewish tradition in whose name Prager claimed to speak.
It is the rabbis who note that David’s predecessor, Saul, lost the kingship when he failed to fulfill God’s command to destroy the egregiously evil nation of Amalek, whereas David commits more severe sins and yet remains king. The answer, the rabbis suggest, lies not in the sin itself but in the response. Saul, when confronted by the prophet Samuel, offers obfuscations and defensiveness. David, meanwhile, is similarly confronted by the prophet Nathan: “Thou hast killed Uriah the Hittite with the sword, and hast taken his wife to be thy wife, and hast slain him with the sword of the children of Ammon.” David’s immediate response is clear and complete contrition: “I have sinned against the Lord.” David’s penitence, Jewish tradition suggests, sets him apart from Saul. Soon after, David gave voice to what was in his heart at the moment, and gave the world one of the most stirring of the Psalms:
Have mercy upon me, O God, according to thy lovingkindness: according unto the multitude of thy tender mercies blot out my transgressions.
Wash me thoroughly from mine iniquity, and cleanse me from my sin. For I acknowledge my transgressions: and my sin is ever before me.
. . . Deliver me from bloodguiltiness, O God, thou God of my salvation: and my tongue shall sing aloud of thy righteousness.
O Lord, open thou my lips; and my mouth shall shew forth thy praise.
For thou desirest not sacrifice; else would I give it: thou delightest not in burnt offering.
The sacrifices of God are a broken spirit: a broken and a contrite heart, O God, thou wilt not despise.
The tendency to link David to our current age lies in the fact that we know more about David than any other biblical figure. The author Thomas Cahill has noted that in a certain literary sense, David is the only biblical figure that is like us at all. Prior to the humanist autobiographies of the Renaissance, he notes, “we can count only a few isolated instances of this use of ‘I’ to mean the interior self. But David’s psalms are full of I’s.” In David’s Psalms, Cahill writes, we “find a unique early roadmap to the inner spirit—previously mute—of ancient humanity.”
At the same time, a study of the Book of Samuel and of the Psalms reveals how utterly incomparable David is to anyone alive today. Haym Soloveitchik has noted that even the most observant of Jews today fail to feel a constant intimacy with God that the simplest Jew of the premodern age might have felt, that “while there are always those whose spirituality is one apart from that of their time, nevertheless I think it safe to say that the perception of God as a daily, natural force is no longer present to a significant degree in any sector of modern Jewry, even the most religious.” Yet for David, such intimacy with the divine was central to his existence, and the Book of Samuel and the Psalms are an eternal testament to this fact. This is why simple comparisons between David and ourselves, as tempting as they are, must be resisted. David Wolpe, in his book about David, attempts to make the case as to why King David’s life speaks to us today: “So versatile and enduring is David in our culture that rare is the week that passes without some public allusion to his life… We need to understand David better because we use his life to comprehend our own.”
The truth may be the opposite. We need to understand David better because we can use his life to comprehend what we are missing, and how utterly unlike our lives are to his own. For even the most religious among us have lost the profound faith and intimacy with God that David had. It is therefore incorrect to assume that because of David’s flaws it would have been, as Amos Oz has written, “fitting for him to reign in Tel Aviv.” The modern State of Israel was blessed with brilliant leaders, but to which of its modern warriors or statesmen should David be compared? To Ben Gurion, who stripped any explicit invocation of the Divine from Israel’s Declaration of Independence? To Moshe Dayan, who oversaw the reconquest of Jerusalem, and then immediately handed back the Temple Mount, the locus of King David’s dreams and desires, to the administration of the enemies of Israel? David’s complex humanity inspires comparison to modern figures, but his faith, contrition, and repentance—which lie at the heart of his story and success—defy any such engagement.
And so, to those who seek comparisons to modern leaders from the Bible, the best rule may be: Leave King David out of it.
Three attacks in Britain highlight the West’s inability to see the threat clearly
This lack of seriousness manifests itself in several ways. It’s perhaps most obvious in the failure to reform Britain’s chaotic immigration and dysfunctional asylum systems. But it’s also abundantly clear from the grotesque underfunding and under-resourcing of domestic intelligence. In MI5, Britain has an internal security service that is simply too small to do its job effectively, even if it were not handicapped by an institutional culture that can seem willfully blind to the ideological roots of the current terrorism problem.
In 2009, Jonathan Evans, then head of MI5, confessed at a parliamentary hearing about the London bus and subway attacks of 2005 that his organization only had sufficient resources to “hit the crocodiles close to the boat.” It was an extraordinary metaphor to use, not least because of the impression of relative impotence that it conveys. MI5 had by then doubled in size since 2001, but it still boasted a staff of only 3,500. Today it’s said to employ between 4,000 and 5,000, an astonishingly, even laughably, small number given a UK population of 65 million and the scale of the security challenges Britain now faces. (To be fair, the major British police forces all have intelligence units devoted to terrorism, and the UK government’s overall counterterrorism strategy involves a great many people, including social workers and schoolteachers.)
You can also see that unseriousness at work in the abject failure to coerce Britain’s often remarkably sedentary police officers out of their cars and stations and back onto the streets. Most of Britain’s big-city police forces have adopted a reactive model of policing (consciously rejecting both the New York Compstat model and British “bobby on the beat” traditions) that cripples intelligence-gathering and frustrates good community relations.
If that weren’t bad enough, Britain’s judiciary is led by jurists who came of age in the 1960s, and who have been inclined since 2001 to treat terrorism as an ordinary criminal problem being exploited by malign officials and politicians to make assaults on individual rights and to take part in “illegal” foreign wars. It has long been almost impossible to extradite ISIS or al-Qaeda–linked Islamists from the UK. This is partly because today’s English judges believe that few if any foreign countries—apart from perhaps Sweden and Norway—are likely to give terrorist suspects a fair trial, or able to guarantee that such suspects will be spared torture and abuse.
We have a progressive metropolitan media elite whose primary, reflexive response to every terrorist attack, even before the blood on the pavement is dry, is to express worry about an imminent violent anti-Muslim “backlash” on the part of a presumptively bigoted and ignorant indigenous working class. Never mind that no such “backlash” has yet occurred, not even when the young off-duty soldier Lee Rigby was hacked to death in broad daylight on a South London street in 2013.
Another sign of this lack of seriousness is the choice by successive British governments to deal with the problem of internal terrorism with marketing and “branding.” You can see this in the catchy consultant-created acronyms and pseudo-strategies that are deployed in place of considered thought and action. After every atrocity, the prime minister calls a meeting of the COBRA unit—an acronym that merely stands for Cabinet Office Briefing Room A but sounds like a secret organization of government superheroes. The government’s counterterrorism strategy is called CONTEST, which has four “work streams”: “Prevent,” “Pursue,” “Protect,” and “Prepare.”
Perhaps the ultimate sign of unseriousness is the fact that police, politicians, and government officials have all displayed more fear of being seen as “Islamophobic” than of any carnage that actual terror attacks might cause. Few are aware that this short-term, cowardly, and trivial tendency may ultimately foment genuine, dangerous popular Islamophobia, especially if attacks continue.
Recently, three murderous Islamist terror attacks in the UK took place in less than a month. The first and third were relatively primitive improvised attacks using vehicles and/or knives. The second was a suicide bombing that probably required relatively sophisticated planning, technological know-how, and the assistance of a terrorist infrastructure. As they were the first such attacks in the UK, the vehicle and knife killings came as a particular shock to the British press, public, and political class, despite the fact that non-explosive and non-firearm terror attacks have become common in Europe and are almost routine in Israel.
The success of all three plots indicates troubling problems in British law-enforcement practice and culture, quite apart from any other failings on the parts of the state in charge of intelligence, border control, and the prevention of radicalization. At the time of writing, the British media have been full of encomia to police courage and skill, not least because it took “only” eight minutes for an armed Metropolitan Police team to respond to and confront the bloody mayhem being wrought by the three Islamist terrorists (who had ploughed their rented van into people on London Bridge before jumping out to attack passersby with knives). But the difficult truth is that all three attacks would be much harder to pull off in Manhattan, not just because all NYPD cops are armed, but also because there are always police officers visibly on patrol at the New York equivalents of London’s Borough Market on a Saturday night. By contrast, London’s Metropolitan police is a largely vehicle-borne, reactive force; rather than use a physical presence to deter crime and terrorism, it chooses to monitor closed-circuit street cameras and social-media postings.
Since the attacks in London and Manchester, we have learned that several of the perpetrators were “known” to the police and security agencies that are tasked with monitoring potential terror threats. That these individuals were nevertheless able to carry out their atrocities is evidence that the monitoring regime is insufficient.
It also seems clear that there were failures on the part of those institutions that come under the leadership of the Home Office and are supposed to be in charge of the UK’s border, migration, and asylum systems. Journalists and think tanks like Policy Exchange and Migration Watch have for years pointed out that these systems are “unfit for purpose,” but successive governments have done little to take responsible control of Britain’s borders. When she was home secretary, Prime Minister Theresa May did little more than jazz up the name, logo, and uniforms of what is now called the “Border Force,” and she notably failed to put in place long-promised passport checks for people flying out of the country. This dereliction means that it is impossible for the British authorities to know who has overstayed a visa or whether individuals who have been denied asylum have actually left the country.
It seems astonishing that Youssef Zaghba, one of the three London Bridge attackers, was allowed back into the country. The Moroccan-born Italian citizen (his mother is Italian) had been arrested by Italian police in Bologna, apparently on his way to Syria via Istanbul to join ISIS. When questioned by the Italians about the ISIS decapitation videos on his mobile phone, he declared that he was “going to be a terrorist.” The Italians lacked sufficient evidence to charge him with a crime but put him under 24-hour surveillance, and when he traveled to London, they passed on information about him to MI5. Nevertheless, he was not stopped or questioned on arrival and had not become one of the 3,000 official terrorism “subjects of interest” for MI5 or the police when he carried out his attack. One reason Zaghba was not questioned on arrival may have been that he used one of the new self-service passport machines installed in UK airports in place of human staff after May’s cuts to the border force. Apparently, the machines are not yet linked to any government watch lists, thanks to the general chaos and ineptitude of the Home Office’s efforts to use information technology.
The presence in the country of Zaghba’s accomplice Rachid Redouane is also an indictment of the incompetence and disorganization of the UK’s border and migration authorities. He had been refused asylum in 2009, but as is so often the case, Britain’s Home Office never got around to removing him. Three years later, he married a British woman and was therefore able to stay in the UK.
But it is the failure of the authorities to monitor ringleader Khuram Butt that is the most baffling. He was a known and open associate of Anjem Choudary, Britain’s most notorious terrorist supporter, ideologue, and recruiter (he was finally imprisoned in 2016 after 15 years of campaigning on behalf of al-Qaeda and ISIS). Butt even appeared in a 2016 TV documentary about ISIS supporters called The Jihadist Next Door. In the same year, he assaulted a moderate imam at a public festival, after calling him a “murtad” or apostate. The imam reported the incident to the police—who took six months to track him down and then let him off with a caution. It is not clear if Butt was one of the 3,000 “subjects of interest” or the additional 20,000 former subjects of interest who continue to be the subject of limited monitoring. If he was not, it raises the question of what a person has to do to get British security services to take him seriously as a terrorist threat; if he was in fact on the list of “subjects of interest,” one has to wonder if being so designated is any barrier at all to carrying out terrorist atrocities. It’s worth remembering, as few do here in the UK, that terrorists who carried out previous attacks were also known to the police and security services and nevertheless enjoyed sufficient liberty to go at it again.
But the most important reason for the British state’s ineffectiveness in monitoring terror threats, which May addressed immediately after the London Bridge attack, is a deeply rooted institutional refusal to deal with or accept the key role played by Islamist ideology. For more than 15 years, the security services and police have chosen to take note only of people and bodies that explicitly espouse terrorist violence or have contacts with known terrorist groups. The fact that a person, school, imam, or mosque endorses the establishment of a caliphate, the stoning of adulterers, or the murder of apostates has not been considered a reason to monitor them.
This seems to be why Salman Abedi, the Manchester Arena suicide bomber, was not being watched by the authorities as a terror risk, even though he had punched a girl in the face for wearing a short skirt while at university, had attended the Muslim Brotherhood-controlled Didsbury Mosque, was the son of a Libyan man whose militia is banned in the UK, had himself fought against the Qaddafi regime in Libya, had adopted the Islamist clothing style (trousers worn above the ankle, beard but no moustache), was part of a druggy gang subculture that often feeds individuals into Islamist terrorism, and had been banned from a mosque after confronting an imam who had criticized ISIS.
It was telling that the day after the Manchester Arena suicide-bomb attack, you could hear a security official informing the audience of the BBC’s flagship morning-radio news show that it’s almost impossible to predict and stop such attacks because the perpetrators “don’t care who they kill.” They just want to kill as many people as possible, he said.
Surely, anyone with even a basic familiarity with Islamist terror attacks over the last 15 or so years and a nodding acquaintance with Islamist ideology could see that the terrorist hadn’t just chosen the Ariana Grande concert in Manchester Arena because a lot of random people would be crowded into a conveniently small area. Since the Bali bombings of 2002, nightclubs, discotheques, and pop concerts attended by shameless unveiled women and girls have been routinely targeted by fundamentalist terrorists, including in Britain. Among the worrying things about the opinion offered on the radio show was that it suggests that even in the wake of the horrific Bataclan attack in Paris during a November 2015 concert, British authorities may not have been keeping an appropriately protective eye on music venues and other places where our young people hang out in their decadent Western way. Such dereliction would make perfect sense given the resistance on the part of the British security establishment to examining, confronting, or extrapolating from Islamist ideology.
The same phenomenon may explain why authorities did not follow up on community complaints about Abedi. All too often when people living in Britain’s many and diverse Muslim communities want to report suspicious behavior, they have to do so through offices and organizations set up and paid for by the authorities as part of the overall “Prevent” strategy. Although criticized by the left as “Islamophobic” and inherently stigmatizing, Prevent has often brought the government into cooperative relationships with organizations even further to the Islamic right than the Muslim Brotherhood. This means that if you are a relatively secular Libyan émigré who wants to report an Abedi and you go to your local police station, you are likely to find yourself speaking to a bearded Islamist.
From its outset in 2003, the Prevent strategy was flawed. Its practitioners, in their zeal to find and fund key allies in “the Muslim community” (as if there were just one), routinely made alliances with self-appointed community leaders who represented the most extreme and intolerant tendencies in British Islam. Both the Home Office and MI5 seemed to believe that only radical Muslims were “authentic” and would therefore be able to influence young potential terrorists. Moderate, modern, liberal Muslims who are arguably more representative of British Islam as a whole (not to mention sundry Shiites, Sufis, Ahmadis, and Ismailis) have too often found it hard to get a hearing.
Sunni organizations that openly supported suicide-bomb attacks in Israel and India and that justified attacks on British troops in Iraq and Afghanistan nevertheless received government subsidies as part of Prevent. The hope was that in return, they would alert the authorities if they knew of individuals planning attacks in the UK itself.
It was a gamble reminiscent of British colonial practice in India’s northwest frontier and elsewhere. Not only were there financial inducements in return for grudging cooperation; the British state offered other, symbolically powerful concessions. These included turning a blind eye to certain crimes and antisocial practices such as female genital mutilation (there have been no successful prosecutions relating to the practice, though thousands of cases are reported every year), forced marriage, child marriage, polygamy, the mass removal of girls from school soon after they reach puberty, and the epidemic of racially and religiously motivated “grooming” rapes in cities like Rotherham. (At the same time, foreign jihadists—including men wanted for crimes in Algeria and France—were allowed to remain in the UK as long as their plots did not include British targets.)
This approach, simultaneously cynical and naive, was never as successful as its proponents hoped. Again and again, Muslim chaplains who were approved to work in prisons and other institutions have turned out to be Islamist extremists whose words have inspired inmates to join terrorist organizations.
Much to his credit, former Prime Minister David Cameron fought hard to change this approach, even though it meant difficult confrontations with his home secretary (Theresa May), as well as police and the intelligence agencies. However, Cameron’s efforts had little effect on the permanent personnel carrying out the Prevent strategy, and cooperation with Islamist but currently nonviolent organizations remains the default setting within the institutions on which the United Kingdom depends for security.
The failure to understand the role of ideology is one of imagination as well as education. Very few of those who make government policy or write about home-grown terrorism seem able to escape the limitations of what used to be called “bourgeois” experience. They assume that anyone willing to become an Islamist terrorist must perforce be materially deprived, or traumatized by the experience of prejudice, or provoked to murderous fury by oppression abroad. They have no sense of the emotional and psychic benefits of joining a secret terror outfit: the excitement and glamor of becoming a kind of Islamic James Bond, bravely defying the forces of an entire modern state. They don’t get how satisfying or empowering the vengeful misogyny of ISIS-style fundamentalism might seem for geeky, frustrated young men. Nor can they appreciate the appeal to the adolescent mind of apocalyptic fantasies of power and sacrifice (mainstream British society does not have much room for warrior dreams, given that its tone is set by liberal pacifists). Finally, they have no sense of why the discipline and self-discipline of fundamentalist Islam might appeal so strongly to incarcerated lumpen youth who have never experienced boundaries or real belonging. Their understanding is an understanding only of themselves, not of the people who want to kill them.
Review of ‘White Working Class’ by Joan C. Williams
Williams is a prominent feminist legal scholar with degrees from Yale, MIT, and Harvard. Unbending Gender, her best-known book, is the sort of tract you’d expect to find at an intersectionality conference or a Portlandia bookstore. This is why her insightful, empathic book comes as such a surprise.
Books and essays on the topic have accumulated into a highly visible genre since Donald Trump came on the American political scene; J.D. Vance’s Hillbilly Elegy planted itself at the top of bestseller lists almost a year ago and still isn’t budging. As with Vance, Williams’s interest in the topic is personal. She fell “madly in love with” and eventually married a Harvard Law School graduate who had grown up in an Italian neighborhood in pre-gentrification Brooklyn. Williams, on the other hand, is a “silver-spoon girl.” Her father’s family was moneyed, and her maternal grandfather was a prominent Reform rabbi.
The author’s affection for her “class-migrant” spouse and respect for his family’s hardships—“My father-in-law grew up on blood soup,” she announces in her opening sentence—add considerable warmth to what is at bottom a political pamphlet. Williams believes that elite condescension and “cluelessness” played a big role in Trump’s unexpected and dreaded victory. Enlightening her fellow elites is essential to the task of returning Trump voters to the progressive fold where, she is sure, they rightfully belong.
Liberals were not always so dense about the working class, Williams observes. WPA murals and movies like On the Waterfront showed genuine fellow feeling for the proletariat. In the 1970s, however, the liberal mood changed. Educated boomers shifted their attention to “issues of peace, equal rights, and environmentalism.” Instead of feeling the pain of Arthur Miller and John Steinbeck characters, they began sneering at the less enlightened. These days, she notes, elite sympathies are limited to the poor, people of color (POC), and the LGBTQ population. Despite clear evidence of suffering—stagnant wages, disappearing manufacturing jobs, declining health and well-being—the working class gets only fly-over snobbery at best and, more often, outright loathing.
Williams divides her chapters into a series of explainers to questions she has heard from her clueless friends and colleagues: “Why Does the Working Class Resent the Poor?” “Why Does the Working Class Resent Professionals but Admire the Rich?” “Why Doesn’t the Working Class Just Move to Where the Jobs Are?” “Is the Working Class Just Racist?” She weaves her answers into a compelling picture of a way of life and worldview foreign to her targeted readers. Working-class Americans have had to struggle for whatever stability and comfort they have, she explains. Clocking in for midnight shifts year after year, enduring capricious bosses, plant closures, and layoffs, they’re reliant on tag-team parenting and stressed-out relatives for child care. The campus go-to word “privileged” seems exactly wrong.
Proud of their own self-sufficiency and success, however modest, they don’t begrudge the self-made rich. It’s snooty professionals and the dysfunctional poor who get their goat. From their vantage point, subsidizing the day care for a welfare mother when they themselves struggle to manage care on their own dime mocks both their hard work and their beliefs. And since, unlike most professors, they shop in the same stores as the dependent poor, they’ve seen that some of them game the system. Of course that stings.
White Working Class is especially good at evoking the alternate economic and mental universe experienced by Professional and Managerial Elites, or “PMEs.” PMEs see their non-judgment of the poor, especially those who are “POC,” as a mark of their mature understanding that we live in an unjust, racist system whose victims require compassion regardless of whether they have committed any crime. At any rate, their passions lie elsewhere. They define themselves through their jobs and professional achievements, hence their obsession with glass ceilings.
Williams tells the story of her husband’s faux pas at a high-school reunion. Forgetting his roots for a moment, the Ivy League–educated lawyer asked one of his Brooklyn classmates a question that is the go-to opener in elite social settings: “What do you do?” Angered by what must have seemed like deliberate humiliation by this prodigal son, the man hissed: “I sell toilets.”
Instead of stability and backyard barbecues with family and long-time neighbors and maybe the occasional Olive Garden celebration, PMEs are enamored of novelty: new foods, new restaurants, new friends, new experiences. The working class chooses to spend its leisure in comfortable familiarity; for the elite, social life is a lot like networking. Members of the professional class may view themselves as sophisticated or cosmopolitan, but, Williams shows, to the blue-collar worker their glad-handing is closer to phony social climbing and their abstract, knowledge-economy jobs more like self-important pencil-pushing.
White Working Class has a number of proposals for creating the progressive future Williams would like to see. She wants to get rid of college-for-all dogma and improve training for middle-skill jobs. She envisions a working-class coalition of all races and ethnicities bolstered by civics education with a “distinctly celebratory view of American institutions.” In a saner political environment, some of this would make sense; indeed, she echoes some of Marco Rubio’s 2016 campaign themes. It’s little wonder White Working Class has already gotten the stink eye from liberal reviewers for its purported sympathies for racists.
Alas, impressive as Williams’s insights are, they do not always allow her to transcend her own class loyalties. Unsurprisingly, her own PME biases mostly come to light in her chapters on race and gender. She reduces immigration concerns to “fear of brown people,” even as she notes elsewhere that a quarter of Latinos also favor a wall at the southern border. This contrasts startlingly with her succinct observation that “if you don’t want to drive working-class whites to be attracted to the likes of Limbaugh, stop insulting them.” In one particularly obtuse moment, she asserts: “Because I study social inequality, I know that even Malia and Sasha Obama will be disadvantaged by race, advantaged as they are by class.” She relies on dubious gender theories to explain why the majority of white women voted for Trump rather than for his unfairly maligned opponent. That Hillary Clinton epitomized every elite quality Williams has just spent more than a hundred pages explicating escapes her notice. Williams’s own reflexive retreat into identity politics is itself emblematic of our toxic divisions, but it does not invalidate the power of this astute book.
When music could not transcend evil
The story of European classical music under the Third Reich is one of the most squalid chapters in the annals of Western culture, a chronicle of collective complaisance that all but beggars belief. Without exception, all of the well-known musicians who left Germany and Austria in protest when Hitler came to power in 1933 were either Jewish or, like the violinist Adolf Busch, Rudolf Serkin’s father-in-law, had close family ties to Jews. Moreover, most of the small number of non-Jewish musicians who emigrated later on, such as Paul Hindemith and Lotte Lehmann, are now known to have done so not out of principle but because they were unable to make satisfactory accommodations with the Nazis. Everyone else—including Karl Böhm, Wilhelm Furtwängler, Walter Gieseking, Herbert von Karajan, and Richard Strauss—stayed behind and served the Reich.
The Berlin and Vienna Philharmonics, then as now Europe’s two greatest orchestras, were just as willing to do business with Hitler and his henchmen, firing their Jewish members and ceasing to perform the music of Jewish composers. Even after the war, the Vienna Philharmonic was notorious for being the most anti-Semitic orchestra in Europe, and it was well known in the music business (though never publicly discussed) that Helmut Wobisch, the orchestra’s principal trumpeter and its executive director from 1953 to 1968, had been both a member of the SS and a Gestapo spy.
The management of the Berlin Philharmonic made no attempt to cover up the orchestra’s close relationship with the Third Reich, no doubt because the Nazi ties of Karajan, who was its music director from 1956 until shortly before his death in 1989, were a matter of public record. Yet it was not until 2007 that a full-length study of its wartime activities, Misha Aster’s The Reich’s Orchestra: The Berlin Philharmonic 1933–1945, was finally published. As for the Vienna Philharmonic, its managers long sought to quash all discussion of the orchestra’s Nazi past, steadfastly refusing to open its institutional archives to scholars until 2008, when Fritz Trümpi, an Austrian scholar, was given access to its records. Five years later, the Viennese, belatedly following the precedent of the Berlin Philharmonic, added a lengthy section to their website called “The Vienna Philharmonic Under National Socialism (1938–1945),” in which the damning findings of Trümpi and two other independent scholars were made available to the public.
Now Trümpi has published The Political Orchestra: The Vienna and Berlin Philharmonics During the Third Reich, in which he tells how they came to terms with Nazism, supplying pre- and postwar historical context for their transgressions.1 Written in a stiff mixture of academic jargon and translatorese, The Political Orchestra is ungratifying to read. Even so, the tale that it tells is both compelling and disturbing, especially to anyone who clings to the belief that high art is ennobling to the spirit.
Unlike the Vienna Philharmonic, which has always doubled as the pit orchestra for the Vienna State Opera, the Berlin Philharmonic started life in 1882 as a fully independent, self-governing entity. Initially unsubsidized by the state, it kept itself afloat by playing a grueling schedule of performances, including “popular” non-subscription concerts for which modest ticket prices were levied. In addition, the orchestra made records and toured internationally at a time when neither was common.
These activities made it possible for the Berlin Philharmonic to develop into an internationally renowned ensemble whose fabled collective virtuosity was widely seen as a symbol of German musical distinction. Furtwängler, the orchestra’s principal conductor, declared in 1932 that the German music in which it specialized was “one of the very few things that actually contribute to elevating [German] prestige.” Hence, he explained, the need for state subsidy, which he saw as “a matter of [national] prestige, that is, to some extent a requirement of national prudence.” By then, though, the orchestra was already heavily subsidized by the city of Berlin, thus paving the way for its takeover by the Nazis.
The Vienna Philharmonic, by contrast, had always been subsidized. Founded in 1842 when the orchestra of what was then the Vienna Court Opera decided to give symphonic concerts on its own, it performed the Austro-German classics for an elite cadre of longtime subscribers. By restricting membership to local players and their pupils, the orchestra cultivated what Furtwängler, who spent as much time conducting in Vienna as in Berlin, described as a “homogeneous and distinct tone quality.” At once dark and sweet, it was as instantly identifiable—and as characteristically Viennese—as the strong, spicy bouquet of a Gewürztraminer wine.
Unlike the Berlin Philharmonic, which played for whoever would pay the tab and programmed new music as a matter of policy, the Vienna Philharmonic chose not to diversify either its haute-bourgeois audience or its conservative repertoire. Instead, it played Beethoven, Brahms, Haydn, Mozart, and Schubert (and, later, Bruckner and Richard Strauss) in Vienna for the Viennese. Starting in the ’20s, the orchestra’s recordings consolidated its reputation as one of the world’s foremost instrumental ensembles, but its internal culture remained proudly insular.
What the two orchestras had in common was a nationalistic ethos, a belief in the superiority of Austro-German musical culture that approached triumphalism. One of the darkest manifestations of this ethos was their shared reluctance to hire Jews. The Berlin Philharmonic employed only four Jewish players in 1933, while the Vienna Philharmonic contained only 11 Jews at the time of the Anschluss, none of whom was hired after 1920. To be sure, such popular Jewish conductors as Otto Klemperer and Bruno Walter continued to work in Vienna for as long as they could. Two months before the Anschluss, Walter led and recorded a performance of the Ninth Symphony of Gustav Mahler, his musical mentor and fellow Jew, who from 1897 to 1907 had been the director of the Vienna Court Opera and one of the Philharmonic’s most admired conductors. But many members of both orchestras were open supporters of fascism, and not a few were anti-Semites who ardently backed Hitler. By 1942, 62 of the 123 active members of the Vienna Philharmonic were Nazi party members.
The admiration that Austro-German classical musicians had for Hitler is not entirely surprising since he was a well-informed music lover who declared in 1938 that “Germany has become the guardian of European culture and civilization.” He made the support of German art, music very much included, a key part of his political program. Accordingly, the Berlin Philharmonic was placed under the direct supervision of Joseph Goebbels, who ensured the cooperation of its members by repeatedly raising their salaries, exempting them from military service, and guaranteeing their old-age pensions. But there had never been any serious question of protest, any more than there would be among the members of the Vienna Philharmonic when the Nazis gobbled up Austria. Save for the Jews and one or two non-Jewish players who were fired for reasons of internal politics, the musicians went along unhesitatingly with Hitler’s desires.
With what did they go along? Above all, they agreed to the scrubbing of Jewish music from their programs and the dismissal of their Jewish colleagues. Some Jewish players managed to escape with their lives, but seven of the Vienna Philharmonic’s 11 Jews were either murdered by the Nazis or died as a direct result of official persecution. In addition, both orchestras performed regularly at official government functions and made tours and other public appearances for propaganda purposes, and both were treated as gems in the diadem of Nazi culture.
As for Furtwängler, the most prominent of the Austro-German orchestral conductors who served the Reich, his relationship to Nazism continues to be debated to this day. He had initially resisted the firing of the Berlin Philharmonic’s Jewish members and protected them for as long as he could. But he was also a committed (if woolly-minded) nationalist who believed that German music had “a different meaning for us Germans than for other nations” and notoriously declared in an open letter to Goebbels that “we all welcome with great joy and gratitude . . . the restoration of our national honor.” Thereafter he cooperated with the Nazis, by all accounts uncomfortably but—it must be said—willingly. A monster of egotism, he saw himself as the greatest living exponent of German music and believed it to be his duty to stay behind and serve a cause higher than what he took to be mere party politics. “Human beings are free wherever Wagner and Beethoven are played, and if they are not free at first, they are freed while listening to these works,” he naively assured a horrified Arturo Toscanini in 1937. “Music transports them to regions where the Gestapo can do them no harm.”
Once the war was over, the U.S. occupation forces decided to enlist the Berlin Philharmonic in the service of a democratic, anti-Soviet Germany. Furtwängler and Herbert von Karajan, who succeeded him as principal conductor, were officially “de-Nazified” and their orchestra allowed to function largely undisturbed, though six Nazi Party members were fired. The Vienna Philharmonic received similarly privileged treatment.
Needless to say, there was more to this decision than Cold War politics. No one questioned the unique artistic stature of either orchestra. Moreover, the Vienna Philharmonic, precisely because of its insularity, was now seen as a living museum piece, a priceless repository of 19th-century musical tradition. Still, many musicians and listeners, Jews above all, looked askance at both orchestras for years to come, believing them to be tainted by Nazism.
Indeed they were, so much so that they treated many of their surviving Jewish ex-members in a way that can only be described as vicious. In the most blatant individual case, the violinist Szymon Goldberg, who had served as the Berlin Philharmonic’s concertmaster under Furtwängler, was not allowed to reassume his post in 1945 and was subsequently denied a pension. As for the Vienna Philharmonic, the fact that it made Helmut Wobisch its executive director says everything about its deep-seated unwillingness to face up to its collective sins.
Be that as it may, scarcely any prominent musicians chose to boycott either orchestra. Leonard Bernstein went so far as to affect a flippant attitude toward the morally equivocal conduct of the Austro-German artists whom he encountered in Europe after the war. Upon meeting Herbert von Karajan in 1954, he actually told his wife Felicia that he had become “real good friends with von Karajan, whom you would (and will) adore. My first Nazi.”
At the same time, though, Bernstein understood what he was choosing to overlook. When he conducted the Vienna Philharmonic for the first time in 1966, he wrote to his parents:
I am enjoying Vienna enormously—as much as a Jew can. There are so many sad memories here; one deals with so many ex-Nazis (and maybe still Nazis); and you never know if the public that is screaming bravo for you might contain someone who 25 years ago might have shot me dead. But it’s better to forgive, and if possible, forget. The city is so beautiful, and so full of tradition. Everyone here lives for music, especially opera, and I seem to be the new hero.
Did Bernstein sell his soul for the opportunity to work with so justly renowned an orchestra—and did he get his price by insisting that its members perform the symphonies of Mahler, with which he was by then closely identified? It is a fair question, one that does not lend itself to easy answers.
Even more revealing is the case of Bruno Walter, who never forgave Furtwängler for staying behind in Germany, informing him in an angry letter that “your art was used as a conspicuously effective means of propaganda for the regime of the Devil.” Yet Walter’s righteous anger did not stop him from conducting in Vienna after the war. Born in Berlin, he had come to identify with the Philharmonic so closely that it was impossible for him to seriously consider quitting its podium permanently. “Spiritually, I was a Viennese,” he wrote in Theme and Variations, his 1946 autobiography. In 1952, he made a second recording with the Vienna Philharmonic of Mahler’s Das Lied von der Erde, whose premiere he had conducted in 1911 and which he had recorded in Vienna 15 years earlier. One wonders what Walter, who had converted to Christianity but had been driven out of both his native lands for the crime of being Jewish, made of the text of the last movement: “My friend, / On this earth, fortune has not been kind to me! / Where do I go?”
As for the two great orchestras of the Third Reich, both have finally acknowledged their guilt and been forgiven, at least by those who know little of their past. It would occur to no one to decline on principle to perform with either group today. Such a gesture would surely be condemned as morally ostentatious, an exercise in what we now call virtue-signaling. Yet it is impossible to forget what Samuel Lipman wrote in 1993 in Commentary apropos the wartime conduct of Furtwängler: “The ultimate triumph of totalitarianism, I suppose it can be said, is that under its sway only a martyred death can be truly moral.” For the only martyrs of the Berlin and Vienna Philharmonics were their Jews. The orchestras themselves live on, tainted and beloved.
He knows what to reveal and what to conceal, understands the importance of keeping the semblance of distance between oneself and the story of the day, and comprehends the ins and outs of anonymous sourcing. Within days of his being fired by President Trump on May 9, for example, little green men and women, known only as his “associates,” began appearing in the pages of the New York Times and Washington Post to dispute key points of the president’s account of his dismissal and to promote Comey’s theory of the case.
“In a Private Dinner, Trump Demanded Loyalty,” the New York Times reported on May 11. “Comey Demurred.” The story was a straightforward narrative of events from Comey’s perspective, capped with an obligatory denial from the White House. The next day, the Washington Post reported, “Comey associates dispute Trump’s account of conversations.” The Post did not identify Comey’s associates, other than saying that they were “people who have worked with him.”
Maybe they were the same associates who had gabbed to the Times. Or maybe they were different ones. Who can tell? Regardless, the story these particular associates gave to the Post was readable and gripping. Comey, the Post reported, “was wary of private meetings and discussions with the president and did not offer the assurance, as Trump has claimed, that Trump was not under investigation as part of the probe into Russian interference in last year’s election.”
On May 16, Michael S. Schmidt of the Times published his scoop, “Comey Memo Says Trump Asked Him to End Flynn Investigation.” Schmidt didn’t see the memo for himself. Parts of it were read to him by—you guessed it—“one of Mr. Comey’s associates.” The following day, Robert Mueller was appointed special counsel to oversee the Russia investigation. On May 18, the Times, citing “two people briefed” on a call between Comey and the president, reported, “Comey, Unsettled by Trump, Is Said to Have Wanted Him Kept at a Distance.” And by the end of that week, Comey had agreed to testify before the Senate Intelligence Committee.
As his testimony approached, Comey’s people became more aggressive in their criticisms of the president. “Trump Should Be Scared, Comey Friend Says,” read the headline of a CNN interview with Brookings Institution fellow Benjamin Wittes. This “Comey friend” said he was “very shocked” when he learned that President Trump had asked Comey for loyalty. “I have no doubt that he regarded the group of people around the president as dishonorable,” Wittes said.
Comey, Wittes added, was so uncomfortable at the White House reception in January honoring law enforcement—the one where Comey lumbered across the room and Trump whispered something in his ear—that, as CNN paraphrased it, he “stood in a position so that his blue blazer would blend in with the room’s blue drapes in an effort for Trump to not notice him.” The integrity, the courage—can you feel it?
On June 6, the day before Comey’s prepared testimony was released, more “associates” told ABC that the director would “not corroborate Trump’s claim that on three separate occasions Comey told the president he was not under investigation.” And a “source with knowledge of Comey’s testimony” told CNN the same thing. In addition, ABC reported that, according to “a source familiar with Comey’s thinking,” the former director would say that Trump’s actions stopped short of obstruction of justice.
Maybe those sources weren’t as “familiar with Comey’s thinking” as they thought or hoped? To maximize the press coverage he already dominated, Comey had authorized the Senate Intelligence Committee to release his testimony ahead of his personal interview. That testimony told a different story than what had been reported by CNN and ABC (and by the Post on May 12). Comey had in fact told Trump the president was not under investigation—on January 6, January 27, and March 30. Moreover, the word “obstruction” did not appear at all in his written text. The senators asked Comey if he felt Trump obstructed justice. He declined to answer either way.
My guess is that Comey’s associates lacked Comey’s scalpel-like, almost Jesuitical ability to make distinctions, and therefore misunderstood what he was telling them to say to the press. Because it’s obvious Comey was the one behind the stories of Trump’s dishonesty and bad behavior. He admitted as much in front of the cameras in a remarkable exchange with Senator Susan Collins of Maine.
Comey said that, after Trump tweeted on May 12 that he’d better hope there aren’t “tapes” of their conversations, “I asked a friend of mine to share the content of the memo with a reporter. Didn’t do it myself, for a variety of reasons. But I asked him to, because I thought that might prompt the appointment of a special counsel. And so I asked a close friend of mine to do it.”
Collins asked whether that friend had been Wittes, known to cable news junkies as Comey’s bestie. Comey said no. The source for the New York Times article was “a good friend of mine who’s a professor at Columbia Law School,” Daniel Richman.
Every time I watch or read that exchange, I am amazed. Here is the former director of the FBI just flat-out admitting that, for months, he wrote down every interaction he had with the president of the United States because he wanted a written record in case the president ever fired or lied about him. And when the president did fire and lie about him, that director set in motion a series of public disclosures with the intent of not only embarrassing the president, but also forcing the appointment of a special counsel who might end up investigating the president for who knows what. And none of this would have happened if the president had not fired Comey or tweeted about him. He told the Senate that if Trump hadn’t dismissed him, he most likely would still be on the job.
Rarely, in my view, are high officials so transparent in describing how Washington works. Comey revealed to the world that he was keeping a file on his boss, that he used go-betweens to get his story into the press, that “investigative journalism” is often just powerful people handing documents to reporters to further their careers or agendas or even to get revenge. And as long as you maintain some distance from the fallout, and stick to the absolute letter of the law, you will come out on top, so long as you have a small army of nightingales singing to reporters on your behalf.
“It’s the end of the Comey era,” A.B. Stoddard said on Special Report with Bret Baier the other day. On the contrary: I have a feeling that, as the Russia investigation proceeds, we will be hearing much more from Comey. And from his “associates.” And his “friends.” And persons “familiar with his thinking.”
In April, COMMENTARY asked a wide variety of writers, thinkers, and broadcasters to respond to this question: Is free speech under threat in the United States? We received twenty-seven responses. We publish them here in alphabetical order.
Floyd Abrams
Free expression threatened? By Donald Trump? I guess you could say so.
When a president engages in daily denigration of the press, when he characterizes it as the enemy of the people, when he repeatedly says that the libel laws should be “loosened” so he can personally commence more litigation, when he says that journalists shouldn’t be allowed to use confidential sources, it is difficult even to suggest that he has not threatened free speech. And when he says to the head of the FBI (as former FBI director James Comey has said that he did) that Comey should consider “putting reporters in jail for publishing classified information,” it is difficult not to take those threats seriously.
The harder question, though, is this: How real are the threats? Or, as Michael Gerson put it in the Washington Post: Will Trump “go beyond mere Twitter abuse and move against institutions that limit his power?” Some of the president’s threats against the institution of the press, wittingly or not, have been simply preposterous. Surely someone has told him by now that neither he nor Congress can “loosen” libel laws; while each state has its own libel law, there is no federal libel law and thus nothing for him to loosen. What he obviously takes issue with is the impact that the Supreme Court’s 1964 First Amendment opinion in New York Times v. Sullivan has had on state libel laws. The case determined that public officials who sue for libel may not prevail unless they demonstrate that the statements made about them were false and were made with actual knowledge or suspicion of that falsity. So his objection to the rules governing libel law is to nothing less than the application of the First Amendment itself.
In other areas, however, the Trump administration has far more power to imperil free speech. We live under an Espionage Act, adopted a century ago, which is both broad in its language and uncommonly vague in its meaning. As such, it remains a half-open door through which an administration that is hostile to free speech might walk. Such an administration could initiate criminal proceedings against journalists who write about defense- or intelligence-related topics on the basis that classified information was leaked to them by present or former government employees. No such action has ever been commenced against a journalist. Press lawyers and civil-liberties advocates have strong arguments that the law may not be read so broadly and still be consistent with the First Amendment. But the scope of the Espionage Act and the impact of the First Amendment upon its interpretation remain unknown.
A related area in which the attitude of an administration toward the press may affect the latter’s ability to function as a check on government relates to the ability of journalists to protect the identity of their confidential sources. The Obama administration prosecuted more Espionage Act cases against sources of information to journalists than all prior administrations combined. After a good deal of deserved press criticism, it agreed to expand the internal guidelines of the Department of Justice designed to limit the circumstances under which such source revelation is demanded. But the guidelines are none too protective and are, after all, simply guidelines. A new administration is free to change or limit them or, in fact, abandon them altogether. In this area, as in so many others, it is too early to judge the ultimate treatment of free expression by the Trump administration. But the threats are real, and there is good reason to be wary.
Floyd Abrams is the author of The Soul of the First Amendment (Yale University Press, 2017).
Ayaan Hirsi Ali
Freedom of speech is being threatened in the United States by a nascent culture of hostility to different points of view. As political divisions in America have deepened, a conformist mentality of “right thinking” has spread across the country. Increasingly, American universities, where no intellectual doctrine ought to escape critical scrutiny, are some of the most restrictive domains when it comes to asking open-ended questions on subjects such as Islam.
Legally, speech in the United States is protected to a degree unmatched in almost any industrialized country. The U.S. has avoided unpredictable Canadian-style restrictions on speech, for example. I remain optimistic that as long as we have the First Amendment in the U.S., any attempt at formal legal censorship will be vigorously challenged.
Culturally, however, matters are very different in America. The regressive left is the forerunner threatening free speech on any issue that is important to progressives. The current pressure coming from those who call themselves “social-justice warriors” is unlikely to lead to successful legislation to curb the First Amendment. Instead, censorship is spreading in the cultural realm, particularly at institutions of higher learning.
The way activists of the regressive left achieve silence or censorship is by creating a taboo, and one of the most pernicious taboos in operation today is the word “Islamophobia.” Islamists are similarly motivated to rule any critical scrutiny of Islamic doctrine out of order. There is now a university center (funded by Saudi money) in the U.S. dedicated to monitoring and denouncing incidences of “Islamophobia.”
The term “Islamophobia” is used against critics of political Islam, but also against progressive reformers within Islam. The term implies an irrational fear that is tainted by hatred, and it has had a chilling effect on free speech. In fact, “Islamophobia” is a poorly defined term. Islam is not a race, and it is very often perfectly rational to fear some expressions of Islam. No set of ideas should be beyond critical scrutiny.
To push back in this cultural realm—in our universities, in public discourse—those favoring free speech should focus more on the message of dawa, the set of ideas that the Islamists want to promote. If the aims of dawa are sufficiently exposed, ordinary Americans and Muslim Americans will reject it. The Islamist message is a message of divisiveness, misogyny, and hatred. It’s anachronistic and wants people to live by tribal norms dating from the seventh century. The best antidote to Islamic extremism is the revelation of what its primary objective is: a society governed by Sharia. This is the opposite of censorship: It is documenting reality. What is life like in Saudi Arabia, Iran, the northern Nigerian states? What is the true nature of Sharia law?
Islamists want to hide the true meaning of Sharia, Jihad, and the implications for women, gays, religious minorities, and infidels under the veil of “Islamophobia.” Islamists use “Islamophobia” to obfuscate their vision and imply that any scrutiny of political Islam is hatred and bigotry. The antidote to this is more exposure and more speech.
As pressure on freedom of speech increases from the regressive left, we must reject the notions that only Muslims can speak about Islam, and that any critical examination of Islamic doctrines is inherently “racist.”
Instead of contorting Western intellectual traditions so as not to offend our Muslim fellow citizens, we need to defend the Muslim dissidents who are risking their lives to promote the human rights we take for granted: equality for women, tolerance of all religions and orientations, our hard-won freedoms of speech and thought.
It is by nurturing and protecting such speech that progressive reforms can emerge within Islam. By accepting the increasingly narrow confines of acceptable discourse on issues such as Islam, we do dissidents and progressive reformers within Islam a grave disservice. For truly progressive reforms within Islam to be possible, full freedom of speech will be required.
Ayaan Hirsi Ali is a research fellow at the Hoover Institution, Stanford University, and the founder of the AHA Foundation.
Lee C. Bollinger
I know it is too much to expect that political discourse mimic the measured, self-questioning, rational, footnoting standards of the academy, but there is a difference between robust political debate and political debate infected with fear or panic. The latter introduces a state of mind that is visceral and irrational. In the realm of fear, we move beyond the reach of reason and a sense of proportionality. When we fear, we lose the capacity to listen and can become insensitive and mean.
Our Constitution is well aware of this fact about the human mind and of its negative political consequences. In the First Amendment jurisprudence established over the past century, we find many expressions of the problematic state of mind that is produced by fear. Among the most famous and potent is that of Justice Brandeis in Whitney v. California in 1927, one of the many cases involving aggravated fears of subversive threats from abroad. “It is the function of [free] speech,” he said, “to free men from the bondage of irrational fears.” “Men feared witches,” Brandeis continued, “and burned women.”
Today, our “witches” are terrorists, and Brandeis’s metaphorical “women” include the refugees (mostly children) and displaced persons, immigrants, and foreigners whose lives have been thrown into suspension and doubt by policies of exclusion.
The same fears of the foreign that take hold of a population inevitably infect our internal interactions and institutions, yielding suppression of unpopular and dissenting voices, victimization of vulnerable groups, attacks on the media, and the rise of demagoguery, with its disdain for facts, reason, expertise, and tolerance.
All of this poses a very special obligation on those of us within universities. Not only must we make the case in every venue for the values that form the core of who we are and what we do, but we must also live up to our own principles of free inquiry and fearless engagement with all ideas. This is why recent incidents on a handful of college campuses disrupting and effectively censoring speakers are so alarming. Such acts not only betray a basic principle but also inflame a rising prejudice against the academic community, and they feed efforts to delegitimize our work, at the very moment when it’s most needed.
I do not for a second support the view that this generation has an unhealthy aversion to engaging differences of opinion. That is a modern trope of polarization, as is the portrayal of universities as hypocritical about academic freedom and political correctness. But now, in this environment especially, universities must be at the forefront of defending the rights of all students and faculty to listen to controversial voices, to engage disagreeable viewpoints, and to make every effort to demonstrate our commitment to the sort of fearless and spirited debate that we are simultaneously asking of the larger society. Anyone with a voice can shout over a speaker; but being able to listen to and then effectively rebut those with whom we disagree—particularly those who themselves peddle intolerance—is one of the greatest skills our education can bestow. And it is something our democracy desperately needs more of. That is why, I say to you now, if speakers who are being denied access to other campuses come here, I will personally volunteer to introduce them, and listen to them, however much I may disagree with them. But I will also never hesitate to make clear why I disagree with them.
Lee C. Bollinger is the 19th president of Columbia University and the author of Uninhibited, Robust, and Wide-Open: A Free Press for a New Century. This piece has been excerpted from President Bollinger’s May 17 commencement address.
Richard A. Epstein
Today, the greatest threat to the constitutional protection of freedom of speech comes from campus rabble-rousers who invoke this very protection. In their book, the speech of people like Charles Murray and Heather Mac Donald constitutes a form of violence, bordering on genocide, that receives no First Amendment protection. Enlightened protestors are both bound and entitled to shout them down, by force or other disruptive actions, if their universities are so foolish as to extend them an invitation to speak. Any indignant minority may take the law into its own hands to eradicate the intellectual cancer before it spreads on their own campus.
By such tortured logic, a new generation of vigilantes distorts the First Amendment doctrine: Speech becomes violence, and violence becomes heroic acts of self-defense. The standard First Amendment interpretation emphatically rejects that view. Of course, the First Amendment doesn’t let you say what you want when and wherever you want to. Your freedom of speech is subject to the same limitations as your freedom of action. So you have no constitutional license to assault other people, to lie to them, or to form cartels to bilk them in the marketplace. But folks such as Murray, Mac Donald, and even Yiannopoulos do not come close to crossing into that forbidden territory. They are not using, for example, “fighting words,” rightly limited to words or actions calculated to provoke immediate aggression against a known target. Fighting words are worlds apart from speech that provokes a negative reaction in those who find your speech offensive solely because of the content of its message.
This distinction is central to the First Amendment. Fighting words have to be blocked by well-tailored criminal and civil sanctions lest some people gain license to intimidate others from speaking or peaceably assembling. The remedy for mere offense is to speak one’s mind in response. But it never gives anyone the right to block the speech of others, lest everyone be able to unilaterally increase his sphere of action by getting really angry about the beliefs of others. No one has the right to silence others by working himself into a fit of rage.
Obviously, it is intolerable to let mutual animosity generate factional warfare, whereby everyone can use force to silence rivals. To avoid this war of all against all, each side claims that only its actions are privileged. These selective claims quickly degenerate into a form of viewpoint discrimination, which undermines one of the central protections that traditional First Amendment law erects: a wall against each and every group out to destroy the level playing field on which robust political debate rests. Every group should be at risk for having its message fall flat. The new campus radicals want to upend that understanding by shutting down their adversaries if their universities do not. Their aggression must be met, if necessary, by counterforce. Silence in the face of aggression is not an acceptable alternative.
Richard A. Epstein is the Laurence A. Tisch Professor of Law at the New York University School of Law.
David French
We’re living in the midst of a troubling paradox. At the exact same time that First Amendment jurisprudence has arguably never been stronger and more protective of free expression, millions of Americans feel they simply can’t speak freely. Indeed, talk to Americans living and working in the deep-blue confines of the academy, Hollywood, and the tech sector, and you’ll get a sense of palpable fear. They’ll explain that they can’t say what they think and keep their jobs, their friends, and sometimes even their families.
The government isn’t cracking down or censoring; instead, Americans are using free speech to destroy free speech. For example, a social-media shaming campaign is an act of free speech. So is an economic boycott. So is turning one’s back on a public speaker. So is a private corporation firing a dissenting employee for purely political reasons. Each of these actions is largely protected from government interference, and each one represents an expression of the speaker’s ideas and values.
The problem, however, is obvious. The goal of each of these kinds of actions isn’t to persuade; it’s to intimidate. The goal isn’t to foster dialogue but to coerce conformity. The result is a marketplace of ideas that has been emptied of all but the approved ideological vendors—at least in those communities that are dominated by online thugs and corporate bullies. Indeed, this mindset has become so prevalent that in places such as Portland, Berkeley, Middlebury, and elsewhere, the bullies and thugs have crossed the line from protected—albeit abusive—speech into outright shout-downs and mob violence.
But there’s something else going on, something that’s insidious in its own way. While politically correct shaming still has great power in deep-blue America, its effect in the rest of the country is to trigger a furious backlash, one characterized less by a desire for dialogue and discourse than by its own rage and scorn. So we’re moving toward two Americas—one that ruthlessly (and occasionally illegally) suppresses dissenting speech and the other that is dangerously close to believing that the opposite of political correctness isn’t a fearless expression of truth but rather the fearless expression of ideas best calculated to enrage your opponents.
The result is a partisan feedback loop where right-wing rage spurs left-wing censorship, which spurs even more right-wing rage. For one side, a true free-speech culture is a threat to feelings, sensitivities, and social justice. The other side waves high the banner of “free speech” to sometimes elevate the worst voices to the highest platforms—not so much to protect the First Amendment as to infuriate the hated “snowflakes” and trigger the most hysterical overreactions.
The culturally sustainable argument for free speech is something else entirely. It reminds the cultural left of its own debt to free speech while reminding the political right that a movement allegedly centered around constitutional values can’t abandon the concept of ordered liberty. The culture of free speech thrives when all sides remember their moral responsibilities—to both protect the right of dissent and to engage in ideological combat with a measure of grace and humility.
David French is a senior writer at National Review.
Pamela Geller
The real question isn’t whether free speech is under threat in the United States, but rather, whether it’s irretrievably lost. Can we get it back? Not without war, I suspect, as is evidenced by the violence at colleges whenever there’s the shamefully rare event of a conservative speaker on campus.
Free speech is the soul of our nation and the foundation of all our other freedoms. If we can’t speak out against injustice and evil, those forces will prevail. Freedom of speech is the foundation of a free society. Without it, a tyrant can wreak havoc unopposed, while his opponents are silenced.
With that principle in mind, I organized a free-speech event in Garland, Texas. The world had recently been rocked by the murder of the Charlie Hebdo cartoonists. My version of “Je Suis Charlie” was an event here in America to show that we can still speak freely and draw whatever we like in the Land of the Free. Yet even after jihadists attacked our event, I was blamed—by Donald Trump among others—for provoking Muslims. And if I tried to hold a similar event now, no arena in the country would allow me to do so—not just because of the security risk, but because of the moral cowardice of all intellectual appeasers.
Under what law is it wrong to depict Muhammad? Under Islamic law. But I am not a Muslim, I don’t live under Sharia. America isn’t under Islamic law, yet for standing for free speech, I’ve been:
- Prevented from running our advertisements in every major city in this country. We have won free-speech lawsuits all over the country, which officials circumvent by prohibiting all political ads (while making exceptions for ads from Muslim advocacy groups);
- Shunned by the right, shut out of the Conservative Political Action Conference;
- Shunned by Jewish groups at the behest of terror-linked groups such as the Council on American-Islamic Relations;
- Blacklisted from speaking at universities;
- Prevented from publishing books, for security reasons and because publishers fear shaming from the left;
- Banned from Britain.
A Seattle court accused me of trying to shut down free speech after we merely tried to run an FBI poster on global terrorism, because authorities had banned all political ads in other cities to avoid running ours. Seattle blamed us for that, which was like blaming a woman for being raped because she was wearing a short skirt.
This kind of vilification and shunning is key to the left’s plan to shut down all dissent from its agenda—they make legislation restricting speech unnecessary.
The same refusal to allow our point of view to be heard has manifested itself elsewhere. The foundation of my work is individual rights and equality for all before the law. These are the foundational principles of our constitutional republic. That is now considered controversial. Truth is the new hate speech. Truth is going to be criminalized.
The First Amendment doesn’t only protect ideas that are sanctioned by the cultural and political elites. If “hate speech” laws are enacted, who would decide what’s permissible and what’s forbidden? The government? The gunmen in Garland?
There has been an inversion of the founding premise of this nation. No longer is it the subordination of might to right, but right to might. History is repeatedly deformed with the bloody consequences of this transition.
Pamela Geller is the editor in chief of the Geller Report and president of the American Freedom Defense Initiative.
Jonah Goldberg
Of course free speech is under threat in America. Frankly, it’s always under threat in America because it’s always under threat everywhere. Ronald Reagan was right when he said in 1961, “Freedom is never more than one generation away from extinction. We didn’t pass it on to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same.”
This is more than political boilerplate. Reagan identified the source of the threat: human nature. God may have endowed us with a right to liberty, but he didn’t give us all a taste for it. As with most finer things, we must work to acquire a taste for it. That is what civilization—or at least our civilization—is supposed to do: cultivate attachments to certain ideals. “Cultivate” shares the same Latin root as “culture,” cultus, and properly understood they mean the same thing: to grow, nurture, and sustain through labor.
In the past, threats to free speech have taken many forms—nationalist passion, Comstockery (both good and bad), political suppression, etc.—but the threat to free speech today is different. It is less top-down and more bottom-up. We are cultivating a generation of young people to reject free speech as an important value.
One could mark the beginning of the self-esteem movement with Nathaniel Branden’s 1969 paper, “The Psychology of Self-Esteem,” which claimed that “feelings of self-esteem were the key to success in life.” This understandable idea ran amok in our schools and in our culture. When I was a kid, Saturday-morning cartoons were punctuated with public-service announcements telling kids: “The most important person in the whole wide world is you, and you hardly even know you!”
The self-esteem craze was just part of the cocktail of educational fads. Other ingredients included multiculturalism, the anti-bullying crusade, and, of course, that broad phenomenon known as “political correctness.” Combined, they’ve produced a generation that rejects the old adage “sticks and stones can break my bones but words can never harm me” in favor of the notion that “words hurt.” What we call political correctness has been on college campuses for decades. But it lacked a critical mass of young people who were sufficiently receptive to it to make it a fully successful ideology. The campus commissars welcomed the new “snowflakes” with open arms; truly, these are the ones we’ve been waiting for.
“Words hurt” is a fashionable concept in psychology today. (See Psychology Today: “Why Words Can Hurt at Least as Much as Sticks and Stones.”) But it’s actually a much older idea than the “sticks and stones” aphorism. For most of human history, it was a crime to say insulting or “injurious” things about aristocrats, rulers, the Church, etc. That tendency didn’t evaporate with the Divine Right of Kings. Jonathan Haidt has written at book length about our natural capacity to create zones of sanctity, immune from reason.
And that is the threat free speech faces today. Those who inveigh against “hate speech” are in reality fighting “heresy speech”—ideas that do “violence” to sacred notions of self-esteem, racial or gender equality, climate change, and so on. Put whatever label you want on it, contemporary “social justice” progressivism acts as a religion, and it has no patience for blasphemy.
When Napoleon’s forces converted churches into stables, the clergy did not object on the grounds that regulations regarding the proper care and feeding of animals had been violated. They complained of sacrilege and blasphemy. When Charles Murray or Christina Hoff Sommers visits college campuses, the protestors are behaving like the zealous acolytes of St. Jerome. Appeals to the First Amendment have as much power over the “antifa” fanatics as appeals to Odin did to champions of the New Faith.
That is the real threat to free speech today.
Jonah Goldberg is a senior editor at National Review and a fellow at the American Enterprise Institute.
KC Johnson
In early May, the Washington Post urged universities to make clear that “racist signs, symbols, and speech are off-limits.” Given the extraordinarily broad definition of what constitutes “racist” speech at most institutions of higher education, this demand would single out most right-of-center (and, in some cases, even centrist and liberal) discourse on issues of race or ethnicity. The editorial provided the highest-profile example of how hostility to free speech, once confined to the ideological fringe on campus, has migrated to the liberal mainstream.
The last few years have seen periodic college protests—featuring claims that significant amounts of political speech constitute “violence,” thereby justifying censorship—followed by even more troubling attempts to appease the protesters. After the mob scene that greeted Charles Murray upon his visit to Middlebury College, for instance, the student government criticized any punishment for the protesters, and several student leaders wanted to require that future speakers conform to the college’s “community standard” on issues of race, gender, and ethnicity. In the last few months, similar attempts to stifle the free exchange of ideas in the name of promoting diversity occurred at Wesleyan, Claremont McKenna, and Duke. Offering an extreme interpretation of this point of view, one CUNY professor recently dismissed dialogue as “inherently conservative,” since it reinforced the “relations of power that presently exist.”
It’s easy, of course, to dismiss campus hostility to free speech as affecting only a small segment of American public life—albeit one that trains the next generation of judges, legislators, and voters. But, as Jonathan Chait observed in 2015, denying “the legitimacy of political pluralism on issues of race and gender” has broad appeal on the left. It is only most apparent on campus because “the academy is one of the few bastions of American life where the political left can muster the strength to impose its political hegemony upon others.” During his time in office, Barack Obama generally urged fellow liberals to support open intellectual debate. But the current campus environment previews the position of free speech in a post-Obama Democratic Party, increasingly oriented around identity politics.
Waning support on one end of the ideological spectrum for this bedrock American principle should provide a political opening for the other side. The Trump administration, however, seems poorly suited to make the case. Throughout his public career, Trump has rarely supported free speech, even in the abstract, and has periodically embraced legal changes to facilitate libel lawsuits. Moreover, the right-wing populism that motivates Trump’s base has a long tradition of ideological hostility to civil liberties of all types. Even in campus contexts, conservatives have defended free speech inconsistently, as seen in recent calls that CUNY disinvite anti-Zionist fanatic Linda Sarsour as a commencement speaker.
In a sharply polarized political environment, awash in dubiously sourced information, free speech is all the more important. Yet this same environment has seen both sides, most blatantly elements of the left on campuses, demand restrictions on their ideological foes’ free speech in the name of promoting a greater good.
KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.
Laura Kipnis
I find myself with a strange-bedfellows problem lately. Here I am, a left-wing feminist professor invited onto the pages of Commentary—though I’d be thrilled if it were still 1959—while fielding speaking requests from right-wing think tanks and libertarians who oppose child-labor laws.
Somehow I’ve ended up in the middle of the free-speech-on-campus debate. My initial crime was publishing a somewhat contentious essay about campus sexual paranoia that put me on the receiving end of Title IX complaints. Apparently I’d created a “hostile environment” at my university. I was investigated (for 72 days). Then I wrote up what I’d learned about these campus inquisitions in a second essay. Then I wrote about it all some more, in a book exposing the kangaroo-court elements of the Title IX process—and the extra-legal gag orders imposed on everyone caught in its widening snare.
I can’t really comment on whether more charges have been filed against me over the book. I’ll just say that writing about being a Title IX respondent could easily become a life’s work. I learned, shortly after writing this piece, that I and my publisher were being sued for defamation, among other things.
Is free speech under threat on American campuses? Yes. We know all about student activists who wish to shut down talks by people with opposing views. I got smeared with a bit of that myself, after a speaking invitation at Wellesley—some students made a video protesting my visit before I arrived. The talk went fine, though a group of concerned faculty circulated an open letter afterward also protesting the invitation: My views on sexual politics were too heretical, and might have offended students.
I didn’t take any of this too seriously, even as right-wing pundits crowed, with Wellesley as their latest outrage bait. It was another opportunity to mock student activists, and the fact that I was myself a feminist rather than a Charles Murray or a Milo Yiannopoulos made them positively gleeful.
I do find myself wondering where all my new free-speech pals were when another left-wing professor, Steven Salaita, was fired (or if you prefer euphemism, “his job offer was withdrawn”) from the University of Illinois after he tweeted criticism of Israel’s Gaza policy. Sure, the tweets were hyperbolic, but hyperbole and strong opinions are protected speech, too.
I guess free speech is easy to celebrate until it actually challenges something. Funny, I haven’t seen Milo around lately—so beloved by my new friends when he was bashing minorities and transgender kids. Then he mistakenly said something authentic (who knew he was capable of it!), reminiscing about an experience a lot of gay men have shared: teenage sex with older men. He tried walking it back—no, no, he’d been a victim, not a participant—but his fan base was shrieking about pedophilia and fleeing in droves. Gee, they were all so against “political correctness” a few minutes before.
It’s easy to be a free-speech fan when your feathers aren’t being ruffled. No doubt what makes me palatable to the anti-PC crowd is having thus far failed to ruffle them enough. I’m just going to have to work harder.
Laura Kipnis’s latest book is Unwanted Advances: Sexual Paranoia Comes to Campus.
Eugene Kontorovich
The free and open exchange of views—especially politically conservative or traditionally religious ones—is being challenged. This is taking place not just at college campuses but throughout our public spaces and cultural institutions. James Watson was fired from the lab he had led since 1968 and could not speak at New York University because of petty, censorious students who would not know DNA from LSD. Our nation’s founders and heroes are being “disappeared” from public commemoration, like Trotsky from a photograph of Soviet rulers.
These attacks on “free speech” are not the result of government action. They are not what the First Amendment protects against. The current methods—professional and social shaming, exclusion, and employment termination—are more inchoate, and their effects are multiplied by self-censorship. A young conservative legal scholar might find himself thinking: “If the late Justice Antonin Scalia can posthumously be deemed a ‘bigot’ by many academics, what chance have I?”
Ironically, artists and intellectuals have long prided themselves on being the first defenders of free speech. Today, it is the institutions of both popular and high culture that are the censors. Is there one poet in the country who would speak out for Ann Coulter?
The inhibition of speech at universities is part of a broader social phenomenon of making longstanding, traditional views and practices sinful overnight. Conservatives have not put up much resistance to this. To paraphrase Martin Niemöller’s famous dictum: “First they came for Robert E. Lee, and I said nothing, because Robert E. Lee meant nothing to me.”
The situation with respect to Israel and expressions of support for it deserves separate discussion. Even as university administrators give political power to favored ideologies by letting them create “safe spaces” (safe from opposing views), Jews find themselves and their state at the receiving end of claims of apartheid—modern-day blood libels. It is not surprising if Jewish students react by demanding that they get a safe space of their own. It is even less surprising if their parents, paying $65,000 a year, want their children to have a nicer time of it. One hears Jewish groups frequently express concern about Jewish students feeling increasingly isolated and uncomfortable on campus.
But demanding selective protection from the new ideological commissars is unlikely to bring the desired results. First, this new ideology, even if it can be harnessed momentarily to give respite to harassed Jews on campus, is ultimately illiberal and will be controlled by “progressive” forces. Second, it is not so terrible for Jews in the Diaspora to feel a bit uncomfortable. It has been the common condition of Jews throughout the millennia. The social awkwardness that Jews at liberal arts schools might feel in being associated with Israel is of course one of the primary justifications for the Jewish State. Facing the snowflakes incapable of hearing a dissonant view—but who nonetheless, in the grip of intersectional ecstasy, revile Jewish self-determination—Jewish students should toughen up.
Eugene Kontorovich teaches constitutional law at Northwestern University and heads the international law department of the Kohelet Policy Forum in Jerusalem.
Nicholas Lemann
There’s an old Tom Wolfe essay in which he describes being on a panel discussion at Princeton in 1965 and provoking the other panelists by announcing that America, rather than being in crisis, was in the middle of a “happiness explosion.” He was arguing that the mass effects of 20 years of post–World War II prosperity made for a larger phenomenon than the Vietnam War, the racial crisis, and the other primary concerns of intellectuals at the time.
In the same spirit, I’d say that we are in the middle of a free-speech explosion, because of 20-plus years of the Internet and 10-plus years of social media. If one understands speech as disseminated individual opinion, then surely we live in the free-speech-est society in the history of the world. Anybody with access to the unimpeded World Wide Web can say anything to a global audience, and anybody can hear anything, too. All threats to free speech should be understood in the context of this overwhelming reality.
It is a comforting fantasy that a genuine free-speech regime will empower mainly “good,” but previously repressed, speech. Conversely, repressive regimes that are candid enough to explain their anti-free-speech policies usually say that they’re not against free speech, just “bad” speech. We have to accept that more free speech probably means, in the aggregate, more bad speech, and also a weakening of the power, authority, and economic support for information professionals such as journalists. Welcome to the United States in 2017.
I am lucky enough to live and work on the campus of a university, Columbia, that has been blessedly free of successful attempts to repress free speech. Just in the last few weeks, Charles Murray and Dinesh D’Souza have spoken here without incident. But, yes, the evidently growing popularity of the idea that “hate speech” shouldn’t be permitted on campuses is a problem, especially, it seems, at small private liberal-arts colleges. We should all do our part, and I do, by frequently and publicly endorsing free-speech principles. Opposing the BDS movement falls squarely into that category.
It’s not just on campuses that free-speech vigilance is needed, though. The number-one threat to free speech, to my mind, is that the wide-open Web has been replaced by privately owned platforms such as Facebook and Google as the way most people experience the public life of the Internet. These companies are committed to banning “hate speech,” and they are eager to operate freely in countries, like China, that don’t permit free political speech. That makes for a constrained environment far more consequential than any campus’s speech code.
Also, Donald Trump regularly engages in presidentially unprecedented rhetoric demonizing people who disagree with him. He seems to think this is all in good fun, but, as we have already seen at his rallies, not everybody hears it that way. The place where Trumpism will endanger free speech isn’t in the center—the White House press room—but at the periphery, for example in the way that local police handle bumptious protestors and the journalists covering them. This is already happening around the country. If Trump were as disciplined and knowledgeable as Vladimir Putin or Recep Tayyip Erdogan, which so far he seems not to be, then free speech could be in even more serious danger from government, which in most places is its usual main enemy.
Nicholas Lemann is a professor at Columbia Journalism School and a staff writer for the New Yorker.
Michael J. Lewis
Free speech is a right, but it is also a habit, and where the habit shrivels so will the right. If free speech today is in headlong retreat—everywhere threatened by regulation, organized harassment, and even violence—it is in part because our political culture allowed the practice of persuasive oratory to atrophy. The process began in 1973, an unforeseen side effect of Roe v. Wade. Legislators were delighted to learn that by relegating this divisive matter of public policy to the Supreme Court and adopting a merely symbolic position, they could sit all the more safely in their safe seats.
Since then, one crucial question of public policy after another has been punted out of the political realm and into the judicial. Issues that might have been debated with all the rhetorical agility of a Lincoln and a Douglas, and then subjected to a process of negotiation, compromise, and voting, have instead been settled by decree: e.g., Chevron, Kelo, Obergefell. The consequences for speech have been pernicious. Since the time of Pericles, deliberative democracy has been predicated on the art of persuasion, which demands the forceful clarity of thought and expression without which no one has ever been persuaded. But a legislature that delegates its authority to judges and regulators will awaken to discover its oratorical culture has been stunted. When politicians, rather than seeking to convince and win over, prefer to project a studied and pleasant vagueness, debate withers into tedious defensive performance. It has been decades since any presidential debate has seen any sustained give and take over a matter of policy. If there is any suspense at all, it is only the possibility that a fatigued or peeved candidate might blurt out that tactless shard of truth known as a gaffe.
A generation accustomed to hearing platitudes smoothly dispensed from behind a teleprompter will find the speech of a fearless extemporaneous speaker to be startling, even disquieting; unfamiliar ideas always are. Unhappily, they have been taught to interpret that disquiet as an injury done to them, rather than as a premise offered to them to consider. All this would not have happened—certainly not to this extent—had not our deliberative democracy decided a generation ago that it preferred the security of incumbency to the risks of unshackled debate. The compulsory contraction of free speech on college campuses is but the logical extension of the voluntary contraction of free speech in our political culture.
Michael J. Lewis’s new book is City of Refuge: Separatists and Utopian Town Planning (Princeton University Press).
Heather Mac Donald
The answer to the symposium question depends on how powerful the transmission belt is between academia and the rest of the country. On college campuses, violence and brute force are silencing speakers who challenge left-wing campus orthodoxies. These totalitarian outbreaks have been met with listless denunciations by college presidents, followed by . . . virtually nothing. As of mid-May, the only discipline imposed for 2017’s mass attacks on free speech at UC Berkeley, Middlebury, and Claremont McKenna College was a letter of reprimand inserted—sometimes only temporarily—into the files of several dozen Middlebury students, accompanied by a brief period of probation. Previous outbreaks of narcissistic incivility, such as the screaming-girl fit at Yale and the assaults on attendees of Yale’s Buckley program, were discreetly ignored by college administrators.
Meanwhile, the professoriate unapologetically defends censorship and violence. After the February 1 riot in Berkeley to prevent Milo Yiannopoulos from speaking, Déborah Blocker, associate professor of French at UC Berkeley, praised the rioters. They were “very well-organized and very efficient,” Blocker reported admiringly to her fellow professors. “They attacked property but they attacked it very sparingly, destroying just enough University property to obtain the cancellation order for the MY event and making sure no one in the crowd got hurt” (emphasis in original). (In fact, perceived Milo and Donald Trump supporters were sucker-punched and maced; businesses downtown were torched and vandalized.) New York University’s vice provost for faculty, arts, humanities, and diversity, Ulrich Baer, displayed Orwellian logic by claiming in a New York Times op-ed that shutting down speech “should be understood as an attempt to ensure the conditions of free speech for a greater group of people.”
Will non-academic institutions take up this zeal for outright censorship? Other ideological products of the left-wing academy have been fully absorbed and operationalized. Racial victimology, which drives much of the campus censorship, is now standard in government and business. Corporate diversity trainers counsel that bias is responsible for any lack of proportional racial representation in the corporate ranks. Racial disparities in school discipline and incarceration are universally attributed to racism rather than to behavior. Public figures have lost jobs for violating politically correct taboos.
Yet Americans possess an instinctive commitment to the First Amendment. Federal judges, hardly an extension of the Federalist Society, have overwhelmingly struck down campus speech codes. It is hard to imagine that they would be any more tolerant of the hate-speech legislation so prevalent in Europe. So the question becomes: At what point does the pressure to conform to the elite worldview curtail freedom of thought and expression, even without explicit bans on speech?
Social stigma against conservative viewpoints is not the same as actual censorship. But the line can blur. The Obama administration used regulatory power to impose a behavioral conformity on public and private entities. School administrators may have technically still possessed the right to dissent from novel theories of gender, but they had to behave as if they were fully on board with the transgender revolution when it came to allowing boys to use girls’ bathrooms and locker rooms.
Had Hillary Clinton been elected president, the federal bureaucracy would have mimicked campus diversocrats with even greater zeal. That threat, at least, has been avoided. Heresies against left-wing dogma may still enter the public arena, if only by the back door. The mainstream media have lurched even further left in the Trump era, but the conservative media, however mocked and marginalized, are expanding (though Twitter and Facebook’s censorship of conservative speakers could be a harbinger of more official silencing).
Outside the academy, free speech is still legally protected, but its exercise requires ever greater determination.
Heather Mac Donald is a fellow at the Manhattan Institute and the author of The War on Cops.
John McWhorter
There is a certain mendacity, as Brick put it in Cat on a Hot Tin Roof, in our discussion of free speech on college campuses. Namely, none of us genuinely wish that absolutely all issues be aired in the name of education and open-mindedness. To insist so is to pretend that civilized humanity makes nothing we could call advancement in philosophical consensus.
I doubt we need “free speech” on issues such as whether slavery and genocide are okay, whether it has been a mistake to view women as men’s equals, or whether to revive the antique idea that whites are a master race while other peoples represent a lower rung on the Darwinian scale. With all due reverence for John Stuart Mill’s advocacy of the regular airing of even noxious views in order to reinforce clarity on why they were rejected, we are also human beings with limited time. A commitment to the Enlightenment justifiably will decree that certain views are, indeed, no longer in need of discussion.
However, our modern social-justice warriors are claiming that this no-fly zone of discussion is vaster than any conception of logic or morality justifies. We are being told that questions regarding the modern proposals about cultural appropriation, about whether even passing infelicitous statements constitute racism in the way that formalized segregation and racist disparagement did, or about whether social disparities can be due to cultural legacies rather than structural impediments, are as indisputably egregious, backwards, and abusive as the benighted views of the increasingly distant past.
That is, the new idea is not only that discrimination and inequality still exist, but that to even question the left’s utopian expectation on such matters justifies the same furious, sloganistic and even physically violent resistance that was once levelled against those designated heretics by a Christian hegemony.
Of course the protesters in question do not recognize themselves in a portrait as opponents of something called heresy. They suppose that Galileo’s opponents were clearly wrong but that they, today, are actually correct in a way that no intellectual or moral argument could coherently deny.
As such, we have students allowed to decree college campuses “racist” when they are the least racist spaces on the planet—because they are, predictably given the imperfection of humans, not perfectly free of passingly unsavory interactions. Thinkers from the right rather than the left, invited to talk for a portion of an hour, have dinner with a few people, and fly home, are treated as if they were reanimated Hitlers. The student of color who hears a few white students venturing polite questions about the leftist orthodoxy is supported in fashioning these questions as “racist” rhetoric.
The people on college campuses who openly and aggressively spout this new version of Christian (or even Islamist) crusading—ironically justifying it as a barricade against “fascist” muzzling of freedom when the term applies ominously well to the regime they are fostering—are a minority. However, the sawmill spinning blade of their rhetoric has succeeded in rendering opposition as risky as espousing pedophilia, such that only those natively open to violent criticism dare speak out. The latter group is small. The campus consensus thereby becomes, if only at moralistic gunpoint à la the ISIS victim video, a strangled hard-leftism.
Hence freedom of speech is indeed threatened on today’s college campuses. I have lost count of how many of my students, despite being liberal Democrats (many of whom sobbed at Hillary Clinton’s loss last November), have told me that they are afraid to express their opinions about issues that matter, despite the fact that their opinions are ones that any liberal or even leftist person circa 1960 would have considered perfectly acceptable.
Something has shifted of late, and not in a direction we can legitimately consider forwards.
John McWhorter teaches linguistics, philosophy, and music history at Columbia University and is the author of The Language Hoax, Words on the Move, and Talking Back, Talking Black.
Kate Bachelder Odell
It’s 2021, and Harvard Square has devolved into riots: Some 120 people are injured in protests, and the carnage includes fire-consumed cop cars and smashed-in windows. The police discharge canisters of tear gas, and, after apprehending dozens of protesters, enforce a 1:45 A.M. curfew. Anyone roaming the streets after hours is subject to arrest. About 2,000 National Guardsmen are prepared to intervene. Such violence and disorder are also roiling Berkeley and other elite and educated areas.
Oh, that’s 1970. The details are from the Harvard Crimson’s account of “anti-war” riots that spring. The episode is instructive in considering whether free speech is under threat in the United States. Almost daily, there’s a new YouTube installment of students melting down over viewpoints of speakers invited to one campus or another. Even amid speech threats from government—for example, the IRS’s targeting of political opponents—nothing has captured the public’s attention like the end of free expression at America’s institutions of higher learning.
Yet disruption, confusion, and even violence are not new campus phenomena. And it’s hard to imagine that young adults who deployed brute force in the 1960s and ’70s were deeply committed to the open and peaceful exchange of ideas.
There may also be reason for optimism. The rough and tumble on campus in the 1960s and ’70s produced a more even-tempered ’80s and ’90s, and colleges are probably heading for another course correction. In covering the ruckuses at Yale, Missouri, and elsewhere, I’ve talked to professors and students who are figuring out how to respond to the illiberalism, even if the reaction is delayed. The University of Chicago put out a set of free-speech principles last year, and other schools such as Princeton and Purdue have endorsed them.
The NARPs—Non-Athletic Regular People, as they are sometimes known on campus—still outnumber the social-justice warriors, who appear to be overplaying their hand. Case in point is the University of Missouri, which experienced a precipitous drop in enrollment after instructor Melissa Click and her ilk stoked racial tensions last spring. The college has closed dorms and trimmed budgets. Which brings us to another silver lining: The economic model of higher education (exorbitant tuition to pay ever more administrators) may blow up traditional college before the fascists can.
Note also that the anti-speech movement is run by rich kids. A Brookings Institution analysis from earlier this year discovered that “the average enrollee at a college where students have attempted to restrict free speech comes from a family with an annual income $32,000 higher than that of the average student in America.” Few rank higher in average income than those at Middlebury College, where students evicted scholar Charles Murray in a particularly ugly scene. (The report notes that Murray was received respectfully at Saint Louis University, “where the median income of students’ families is half Middlebury’s.”) The impulses of over-adulated 20-year-olds may soon be tempered by the tyranny of having to show up for work on a daily basis.
None of this is to suggest that free speech is enjoying some renaissance either on campus or in America. But perhaps as the late Wall Street Journal editorial-page editor Robert Bartley put it in his valedictory address: “Things could be worse. Indeed, they have been worse.”
Kate Bachelder Odell is an editorial writer for the Wall Street Journal.
Jonathan Rauch
Is free speech under threat? The one-syllable answer is “yes.” The three-syllable answer is: “Yes, of course.” Free speech is always under threat, because it is not only the single most successful social idea in all of human history, it is also the single most counterintuitive. “You mean to say that speech that is offensive, untruthful, malicious, seditious, antisocial, blasphemous, heretical, misguided, or all of the above deserves government protection?” That seemingly bizarre proposition is defensible only on the grounds that the marketplace of ideas turns out to be the most powerful engine of knowledge, prosperity, liberty, social peace, and moral advancement that our species has had the good fortune to discover.
Every new generation of free-speech advocates will need to get up every morning and re-explain the case for free speech and open inquiry—today, tomorrow, and forever. That is our lot in life, and we just need to be cheerful about it. At discouraging moments, it is helpful to remember that the country has made great strides toward free speech since 1798, when the Adams administration arrested and jailed its political critics; and since the 1920s, when the U.S. government banned and burned James Joyce’s great novel Ulysses; and since 1954, when the government banned ONE, a pioneering gay journal. (The cover article was a critique of the government’s indecency censors, who censored it.) None of those things could happen today.
I suppose, then, the interesting question is: What kind of threat is free speech under today? In the present age, direct censorship by government bodies is rare. Instead, two more subtle challenges hold sway, especially, although not only, on college campuses. The first is a version of what I called, in my book Kindly Inquisitors, the humanitarian challenge: the idea that speech that is hateful or hurtful (in someone’s estimation) causes pain and thus violates others’ rights, much as physical violence does. The other is a version of what I called the egalitarian challenge: the idea that speech that denigrates minorities (again, in someone’s estimation) perpetuates social inequality and oppression and thus also is a rights violation. Both arguments call upon administrators and other bureaucrats to defend human rights by regulating speech rights.
Both doctrines are flawed to the core. Censorship harms minorities by enforcing conformity and entrenching majority power, and it no more ameliorates hatred and injustice than smashing thermometers ameliorates global warming. If unwelcome words are the equivalent of bludgeons or bullets, then the free exchange of criticism—science, in other words—is a crime. I could go on, but suffice it to say that the current challenges are new variations on ancient themes—and they will be followed, in decades and centuries to come, by many, many other variations. Memo to free-speech advocates: Our work is never done, but the really amazing thing, given the proposition we are tasked to defend, is how well we are doing.
Jonathan Rauch is a senior fellow at the Brookings Institution and the author of Kindly Inquisitors: The New Attacks on Free Thought.
Nicholas Quinn Rosenkranz
Speech is under threat on American campuses as never before. Censorship in various forms is on the rise. And this year, the threat to free speech on campus took an even darker turn, toward actual violence. The prospect of Milo Yiannopoulos speaking at Berkeley provoked riots that caused more than $100,000 worth of property damage on the campus. The prospect of Charles Murray speaking at Middlebury led to a riot that put a liberal professor in the hospital with a concussion. Ann Coulter’s speech at Berkeley was cancelled after the university determined that none of the appropriate venues could be protected from “known security threats” on the date in question.
The free-speech crisis on campus is caused, at least in part, by a more insidious campus pathology: the almost complete lack of intellectual diversity on elite university faculties. At Yale, for example, the number of registered Republicans in the economics department is zero; in the psychology department, there is one. Overall, there are 4,410 faculty members at Yale, and the total number of those who donated to a Republican candidate during the 2016 primaries was three.
So when today’s students purport to feel “unsafe” at the mere prospect of a conservative speaker on campus, it may be easy to mock them as “delicate snowflakes,” but in one sense, their reaction is understandable: If students are shocked at the prospect of a Republican behind a university podium, perhaps it is because many of them have never before laid eyes on one.
To see the connection between free speech and intellectual diversity, consider the recent commencement speech of Harvard President Drew Gilpin Faust:
Universities must be places open to the kind of debate that can change ideas. . . . Silencing ideas or basking in intellectual orthodoxy independent of facts and evidence impedes our access to new and better ideas, and it inhibits a full and considered rejection of bad ones. . . . We must work to ensure that universities do not become bubbles isolated from the concerns and discourse of the society that surrounds them. Universities must model a commitment to the notion that truth cannot simply be claimed, but must be established—established through reasoned argument, assessment, and even sometimes uncomfortable challenges that provide the foundation for truth.
Faust is exactly right. But, alas, her commencement audience might be forgiven a certain skepticism. After all, the number of registered Republicans in several departments at Harvard—e.g., history and psychology—is exactly zero. In those departments, the professors themselves may be “basking in intellectual orthodoxy” without ever facing “uncomfortable challenges.” This may help explain why some students will do everything in their power to keep conservative speakers off campus: They notice that faculty hiring committees seem to do exactly the same thing.
In short, it is a promising sign that true liberal academics like Faust have started speaking eloquently about the crucial importance of civil, reasoned disagreement. But they will be more convincing on this point when they hire a few colleagues with whom they actually disagree.
Nicholas Quinn Rosenkranz is a professor of law at Georgetown. He serves on the executive committee of Heterodox Academy, which he co-founded, on the board of directors of the Federalist Society, and on the board of directors of the Foundation for Individual Rights in Education (FIRE).
Ben Shapiro
In February, I spoke at California State University, Los Angeles. Before my arrival, professors informed students that a white supremacist would be descending on the school to preach hate; threats of violence soon prompted the administration to cancel the event. I vowed to show up anyway. One hour before the event, the administration backed down and promised to guarantee that the event could go forward, but police officers were told not to stop the 300 students, faculty, and outside protesters who blocked and assaulted those who attempted to attend the lecture. We ended up trapped in the auditorium, with the authorities telling students not to leave for fear of physical violence. I was rushed from campus under armed police guard.
Is free speech under assault?
Of course it is.
On campus, free speech is under assault thanks to a perverse ideology of intersectionality that claims victim identity is of primary value and that views are merely a secondary concern. As a corollary, if your views offend someone who outranks you on the intersectional hierarchy, your views are treated as violence—threats to identity itself. Statements that offend an individual’s identity have been treated as “microaggressions”—actual aggressions against another, ostensibly worthy of violence. Words, students have been told, may not break bones, but they will prompt sticks and stones, and rightly so.
Thus, protesters around the country—leftists who see verbiage as violence—have, in turn, used violence in response to ideas they hate. Leftist local authorities then use the threat of violence as an excuse to discriminate ideologically against conservatives. This means public intellectuals like Charles Murray being run off campus and his leftist professorial cohort viciously assaulted; it means Ann Coulter being targeted for violence at Berkeley; it means universities preemptively banning me and Ayaan Hirsi Ali and Condoleezza Rice and even Jason Riley.
The campus attacks on free speech are merely the most extreme iteration of an ideology that spans from left to right: the notion that your right to free speech ends where my feelings begin. Even Democrats who say that Ann Coulter should be allowed to speak at Berkeley say that nobody should be allowed to contribute to a super PAC (unless you’re a union member, naturally).
Meanwhile, on the right, the president’s attacks on the press have convinced many Republicans that restrictions on the press wouldn’t be altogether bad. A Vanity Fair/60 Minutes poll in late April found that 36 percent of Americans thought freedom of the press “does more harm than good.” Undoubtedly, some of that is due to the media’s obvious bias. CNN’s Jeff Zucker has targeted the Trump administration for supposedly quashing journalism, but he was silent when the Obama administration’s Department of Justice cracked down on reporters from the Associated Press and Fox News, and when hacks like Deputy National Security Adviser Ben Rhodes openly sold lies regarding Iran. But for some on the right, the response to press falsities hasn’t been to call for truth, but to instead echo Trumpian falsehoods in the hopes of damaging the media. Free speech is only important when people seek the truth. Leftists traded truth for tribalism long ago; in response, many on the right seem willing to do the same. Until we return to a common standard under which facts matter, free speech will continue to rest on tenuous grounds.
Ben Shapiro is the editor in chief of The Daily Wire and the host of The Ben Shapiro Show.
Judith Shulevitz
It’s tempting to blame college and university administrators for the decline of free speech in America, and for years I did just that. If the guardians of higher education won’t inculcate the habits of mind required for serious thinking, I thought, who will? The unfettered but civil exchange of ideas is the basic operation of education, just as addition is the basic operation of arithmetic. And universities have to teach both the unfettered part and the civil part, because arguing in a respectful manner isn’t something anyone does instinctively.
So why change my mind now? Schools still cling to speech codes, and there still aren’t enough deans like the one at the University of Chicago who declared his school a safe-space-free zone. My alma mater just handed out prizes for “enhancing race and/or ethnic relations” to two students caught on video harassing the dean of their residential college, one screaming at him that he’d created “a space for violence to happen,” the other placing his face inches away from the dean’s and demanding, “Look at me.” All this because they deemed a thoughtful if ill-timed letter about Halloween costumes written by the dean’s wife to be an act of racist aggression. Yale should discipline students who behave like that, even if they’re right on the merits (I don’t think they were, but that’s not the point). They certainly don’t deserve awards. I can’t believe I had to write that sentence.
But in abdicating their responsibilities, the universities have enabled something even worse than an attack on free speech. They’ve unleashed an assault on themselves. There’s plenty of free speech around; we know that because so much bad speech—low-minded nonsense—tests our constitutional tolerance daily, and that’s holding up pretty well. (As Nicholas Lemann observes elsewhere in this symposium, Facebook and Google represent bigger threats to free speech than students and administrators.) What’s endangered is good speech.
Universities have set themselves up to be used. Provocateurs exploit the atmosphere on campus to goad overwrought students, then gleefully trash the most important bastion of our crumbling civil society. Higher education and everything it stands for—logical argument, the scientific method, epistemological rigor—start to look illegitimate. Voters perceive tenure and research and higher education itself as hopelessly partisan and unworthy of taxpayers’ money.
The press is a secondary victim of this process of delegitimization. If serious inquiry can be waved off as ideology, then facts won’t be facts and reporting can’t be trusted. All journalism will be equal to all other journalism, and all journalists will be reduced to pests you can slam to the ground with near impunity. Politicians will be able to say anything and do just about anything and there will be no countervailing authority to challenge them. I’m pretty sure that that way lies Putinism and Erdoganism. And when we get to that point, I’m going to start worrying about free speech again.
Judith Shulevitz is a critic in New York.
Harvey Silverglate
Free speech is, and has always been, threatened. The title of Nat Hentoff’s 1993 book Free Speech for Me – but Not for Thee is no less true today than at any time, even as the Supreme Court has accorded free speech a more absolute degree of protection than in any previous era.
Since the 1980s, the high court has decided most major free-speech cases in favor of speech, with most of the major decisions being unanimous or nearly so.
Women’s-rights advocates were turned back by the high court in 1986 when they sought to ban the sale of printed materials that, because deemed pornographic by some, were alleged to promote violence against women. Censorship in the name of gender-based protection thus failed to gain traction.
Despite the demands of civil-rights activists, the Supreme Court in 1992 declared cross-burning to be a protected form of expression in R.A.V. v. City of St. Paul, a decision later refined to strengthen a narrow exception for when cross-burning occurs primarily as a physical threat rather than merely an expression of hatred.
Other attempts at First Amendment circumvention have been met with equally decisive rebuff. When the Reverend Jerry Falwell sued Hustler magazine publisher Larry Flynt for defamation growing out of a parody depicting Falwell’s first sexual encounter as a drunken tryst with his mother in an outhouse, a unanimous Supreme Court lectured on the history of parody as a constitutionally protected, even if cruel, form of social and political criticism.
When the South Boston Allied War Veterans, sponsor of Boston’s Saint Patrick’s Day parade, sought to exclude a gay veterans’ group from marching under its own banner, the high court unanimously held that as a private entity, even though marching in public streets, the Veterans could exclude any group marching under a banner conflicting with the parade’s socially conservative message, notwithstanding public-accommodations laws. The gay group could have its own parade but could not rain on that of the conservatives.
Despite such legal clarity, today’s most potent attacks on speech are coming, ironically, from liberal-arts colleges. Ubiquitous “speech codes” limit speech that might insult, embarrass, or “harass,” in particular, members of “historically disadvantaged” groups. “Safe spaces” and “trigger warnings” protect purportedly vulnerable students from hearing words and ideas they might find upsetting. Student demonstrators and threats of violence have forced the cancellation of controversial speakers, left and right.
It remains unclear how much campus censorship results from politically correct faculty, control-obsessed student-life administrators, or students socialized and indoctrinated into intolerance. My experience suggests that the bureaucrats are primarily, although not entirely, to blame. When sued, colleges either lose or settle, pay a modest amount, and then return to their censorious ways.
This trend threatens the heart and soul of liberal education. Eventually it could infect the entire society as these students graduate and assume influential positions. Whether a resulting flood of censorship ultimately overcomes legal protections and weakens democracy remains to be seen.
Harvey Silverglate, a Boston-based lawyer and writer, is the co-author of The Shadow University: The Betrayal of Liberty on America’s Campuses (Free Press, 1998). He co-founded the Foundation for Individual Rights in Education in 1999 and is on FIRE’s board of directors. He spent some three decades on the board of the ACLU of Massachusetts, two of those years as chairman. Silverglate taught at Harvard Law School for a semester during a sabbatical he took in the mid-1980s.
Christina Hoff Sommers
When Heather Mac Donald’s “blue lives matter” talk was shut down by a mob at Claremont McKenna College, the president of neighboring Pomona College sent out an email defending free speech. Twenty-five students shot back a response: “Heather Mac Donald is a fascist, a white supremacist . . . classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live.”
Some blame the new campus intolerance on hypersensitive, over-trophied millennials. But the students who signed that letter don’t appear to be fragile. Nor do those who recently shut down lectures at Berkeley, Middlebury, DePaul, and Cal State LA. What they are is impassioned. And their passion is driven by a theory known as intersectionality.
Intersectionality is the source of the new preoccupation with microaggressions, cultural appropriation, and privilege-checking. It’s the reason more than 200 colleges and universities have set up Bias Response Teams. Students who overhear potentially “otherizing” comments or jokes are encouraged to make anonymous reports to their campus BRTs. A growing number of professors and administrators have built their careers around intersectionality. What is it exactly?
Intersectionality is a neo-Marxist doctrine that views racism, sexism, ableism, heterosexism, and all forms of “oppression” as interconnected and mutually reinforcing. Together these “isms” form a complex arrangement of advantages and burdens. A white woman is disadvantaged by her gender but advantaged by her race. A Latino is burdened by his ethnicity but privileged by his gender. According to intersectionality, American society is a “matrix of domination,” with affluent white males in control. Not only do they enjoy most of the advantages, they also determine what counts as “truth” and “knowledge.”
But marginalized identities are not without resources. According to one of intersectionality’s leading theorists, Patricia Hill Collins (former president of the American Sociological Association), disadvantaged groups have access to deeper, more liberating truths. To find their voice, and to enlighten others to the true nature of reality, they require a safe space—free of microaggressive put-downs and imperious cultural appropriations. Here they may speak openly about their “lived experience.” Lived experience, according to intersectional theory, is a better guide to the truth than self-serving Western and masculine styles of thinking. So don’t try to refute intersectionality with logic or evidence: That only proves that you are part of the problem it seeks to overcome.
How could comfortably ensconced college students be open to a convoluted theory that describes their world as a matrix of misery? Don’t they flinch when they hear intersectional scholars like bell hooks refer to the U.S. as an “imperialist, white-supremacist, capitalist patriarchy”? Most take it in stride because such views are now commonplace in high-school history and social studies texts. And the idea that knowledge comes from lived experience rather than painstaking study and argument is catnip to many undergrads.
Silencing speech and forbidding debate is not an unfortunate by-product of intersectionality—it is a primary goal. How else do you dismantle a lethal system of oppression? As the protesting students at Claremont McKenna explained in their letter: “Free speech . . . has given those who seek to perpetuate systems of domination a platform to project their bigotry.” To the student activists, thinkers like Heather Mac Donald and Charles Murray are agents of the dominant narrative, and their speech is “a form of violence.”
It is hard to know how our institutions of higher learning will find their way back to academic freedom, open inquiry, and mutual understanding. But as long as intersectional theory goes unchallenged, campus fanaticism will intensify.
Christina Hoff Sommers is a resident scholar at the American Enterprise Institute. She is the author of several books, including Who Stole Feminism? and The War Against Boys. She also hosts The Factual Feminist, a video blog. @Chsommers
John Stossel
Yes, some college students do insane things. Some called police when they saw “Trump 2016” chalked on sidewalks. The vandals at Berkeley and the thugs who assaulted Charles Murray are disgusting. But they are a minority. And these days people fight back.
Someone usually videotapes the craziness. Yale’s “Halloween costume incident” drove away two sensible instructors, but videos mocking Yale’s snowflakes, like “Silence U,” make such abuse less likely. Groups like Young America’s Foundation (YAF) publicize censorship, and the Foundation for Individual Rights in Education (FIRE) sues schools that restrict speech.
Consciousness has been raised. On campus, the worst is over. Free speech has always been fragile. I once took cameras to Seton Hall law school right after a professor gave a lecture on free speech. Students seemed to get the concept. Sean, now a lawyer, said, “Protect freedom for thought we hate; otherwise you never have a society where ideas clash, and we come up with the best idea.” So I asked, “Should there be any limits?” Students listed “fighting words,” “shouting fire in a theater,” malicious libel, etc.—reasonable court-approved exceptions. But then they went further. Several wanted bans on “hate” speech. “No value comes out of hate speech,” said Javier. “It inevitably leads to violence.”
No, it doesn’t, I argued. “Also, doesn’t hate speech bring ideas into the open, so you can better argue about them, bringing you to the truth?”
“No,” replied Floyd. “With hate speech, more speech is just violence.”
So I pulled out a big copy of the First Amendment and wrote, “exception: hate speech.”
Two students wanted a ban on flag desecration “to respect those who died to protect it.”
One wanted bans on blasphemy:
“Look at the gravity of the harm versus the value in blasphemy—the harm outweighs the value.”
Several wanted a ban on political speech by corporations because of “the potential for large corporations to improperly influence politicians.”
Finally, Jillian, also now a lawyer, wanted hunting videos banned.
“It encourages harm down the road.”
I asked her, incredulously, “You’re comfortable locking up people who make a hunting film?”
“Oh, yeah,” she said. “It’s unnecessary cruelty to feeling and sentient beings.”
So, I picked up my copy of the Bill of Rights again. After “no law . . . abridging freedom of speech,” I added: “Except hate speech, flag burning, blasphemy, corporate political speech, depictions of hunting . . . ”
That embarrassed them. “We may have gone too far,” said Sean. Others agreed. One said, “Cross out the exceptions.” Free speech survived, but it was a close call. Respect for unpleasant speech will always be thin. Then-Senator Hillary Clinton wanted violent video games banned. John McCain and Russ Feingold tried to ban political speech. Donald Trump wants new libel laws, and if you burn a flag, he tweeted, consequences might be “loss of citizenship or a year in jail!” Courts or popular opinion killed those bad ideas.
Free speech will survive, assuming those of us who appreciate it use it to fight those who would smother it.
John Stossel is a FOX News/FOX Business Network Contributor.
Warren Treadgold
Even citizens of dictatorships are free to praise the regime and to talk about the weather. The only speech likely to be threatened anywhere is the sort that offends an important and intolerant group. What is new in America today is a leftist ideology that threatens speech precisely because it offends certain important and intolerant groups: feminists and supposedly oppressed minorities.
So far this new ideology is clearly dominant only in colleges and universities, where it has become so strong that most controversies concern outside speakers invited by students, not faculty speakers or speakers invited by administrators. Most academic administrators and professors are either leftists or have learned not to oppose leftism; otherwise they would probably never have been hired. Administrators treat even violent leftist protestors with respect and are ready to prevent conservative and moderate outsiders from speaking rather than provoke protests. Most professors who defend conservative or moderate speakers argue that the speakers’ views are indeed noxious but say that students should be exposed to them to learn how to refute them. This is very different from encouraging a free exchange of ideas.
Although the new ideology began on campuses in the ’60s, it gained authority outside them largely by means of several majority decisions of the Supreme Court, from Roe (1973) to Obergefell (2015). The Supreme Court decisions that endanger free speech are based on a presumed consensus of enlightened opinion that certain rights favored by activists have the same legitimacy as rights explicitly guaranteed by the Constitution—or even more legitimacy, because the rights favored by activists are assumed to be so fundamental that they need no grounding in specific constitutional language. The Court majorities found restricting abortion rights or homosexual marriage, as large numbers of Americans wish to do, to be constitutionally equivalent to restricting black voting rights or interracial marriage. Any denial of such equivalence therefore opposes fundamental constitutional rights and can be considered hate speech, advocating psychological and possibly physical harm to groups like women seeking abortions or homosexuals seeking approval. Such speech may still be constitutionally protected, but acting upon it is not.
This ideology of forbidding allegedly offensive speech has spread to most of the Democratic Party and the progressive movement. Rather than seeing themselves as taking one side in a free debate, progressives increasingly argue (for example) that opposing abortion is offensive to women and supporting the police is offensive to blacks. Some politicians object so strongly to such speech that despite their interest in winning votes, they attack voters who disagree with them as racists or sexists. Expressing views that allegedly discriminate against women, blacks, homosexuals, and various other minorities can now be grounds for a lawsuit.
Speech that supposedly offends women or minorities has already cost some people their careers, their businesses, and their opportunities to deliver or hear speeches. Such intimidation is the intended result of an ideology that threatens free speech.
Warren Treadgold is a professor of history at Saint Louis University.
Matt Welch
Like a sullen zoo elephant rocking back and forth from leg to leg, there is an oversized paradox we’d prefer not to see standing smack in the sightlines of most of our policy debates. Day by day, even minute by minute, America simultaneously gets less free in the laboratory, but more free in the field. Individuals are constantly expanding the limits and applications of their own autonomy, even as government transcends prior restraints on how far it can reach into our intimate business.
So it is that the Internal Revenue Service can charge foreign banks with collecting taxes on U.S. citizens (thereby causing global financial institutions to shun many of the estimated 6 million-plus Americans who live abroad), even while blockchain virtuosos make illegal transactions wholly undetectable to authorities. It has never been easier for Americans to travel abroad, and it’s never been harder to enter the U.S. without showing passports, fingerprints, retinal scans, and even social-media passwords.
What’s true for banking and tourism is doubly true for free speech. Social media has given everyone not just a platform but a megaphone (as unreadable as our Facebook timelines have all become since last November). At the same time, the federal government during this unhappy 21st century has continuously ratcheted up prosecutorial pressure against leakers, whistleblowers, investigative reporters, and technology companies.
A hopeful bulwark against government encroachment unique to the free-speech field is the Supreme Court’s very strong First Amendment jurisprudence in the past decade or two. Donald Trump, like Hillary Clinton before him, may prattle on about locking up flag-burners, but Antonin Scalia and the rest of SCOTUS protected such expression back in 1990. Barack Obama and John McCain (and Hillary Clinton—she’s as bad as any recent national politician on free speech) may lament the Citizens United decision, but it’s now firmly legal to broadcast unfriendly documentaries about politicians without fear of punishment, no matter the electoral calendar.
But in this very strength lies what might be the First Amendment’s most worrying vulnerability. Barry Friedman, in his 2009 book The Will of the People, made the persuasive argument that the Supreme Court typically ratifies, post facto, where public opinion has already shifted. Today’s culture of free speech could be tomorrow’s legal framework. If so, we’re in trouble.
For evidence of free-speech slippage, just read around you. When both major-party presidential nominees react to terrorist attacks by calling to shut down corners of the Internet, and when their respective supporters are actually debating the propriety of sucker punching protesters they disagree with, it’s hard to escape the conclusion that our increasingly shrill partisan sorting is turning the very foundation of post-1800 global prosperity into just another club to be swung in our national street fight.
In the eternal cat-and-mouse game between private initiative and government control, the former is always advantaged by the latter’s fundamental incompetence. But what if the public willingly hands government the power to muzzle? It may take a counter-cultural reformation to protect this most noble of American experiments.
Matt Welch is the editor at large of Reason.
Adam J. White
Free speech is indeed under threat on our university campuses, but the threat did not begin there and it will not end there. Rather, the campus free-speech crisis is a particularly visible symptom of a much more fundamental crisis in American culture.
The problem is not that some students, teachers, and administrators reject traditional American values and institutions, or even that they are willing to menace or censor others who defend those values and institutions. Such critics have always existed, and they can be expected to use the tools and weapons at their disposal. The problem is that our country seems to produce too few students, teachers, and administrators who are willing or able to respond to them.
American families produce children who arrive on campus unprepared for, or uninterested in, defending our values and institutions. For our students who are focused primarily on their career prospects (if on anything at all), “[c]ollege is just one step on the continual stairway of advancement,” as David Brooks observed 16 years ago. “They’re not trying to buck the system; they’re trying to climb it, and they are streamlined for ascent. Hence they are not a disputatious group.”
Meanwhile, parents bear incomprehensible financial burdens to get their kids through college, without a clear sense of precisely what their kids will get out of these institutions in terms of character formation or civic virtue. With so much money at stake, few can afford for their kids to pursue more than career prospects.
Those problems are not created on campus, but they are exacerbated there, as too few college professors and administrators see their institutions as cultivators of American culture and republicanism. Confronted with activists’ rage, they offer no competing vision of higher education—let alone a compelling one.
Ironically, we might borrow a solution from the Left. Where progressives would leverage state power in service of their health-care agenda, we could do the same for education. State legislatures and governors, recognizing the present crisis, should begin to reform and renegotiate the fundamental nature of state universities. By making state universities more affordable, more productive, and more reflective of mainstream American values, they will attract students—and create incentives for competing private universities to follow suit.
Let’s hope they do it soon, for what’s at stake is much more than just free speech on campus, or even free speech writ large. In our time, as in Tocqueville’s, “the instruction of the people powerfully contributes to the support of a democratic republic,” especially “where instruction which awakens the understanding is not separated from moral education which amends the heart.” We need our colleges to cultivate—not cut down—civic virtue and our capacity for self-government. “Republican government presupposes the existence of these qualities in a higher degree than any other form,” Madison wrote in Federalist 55. If “there is not sufficient virtue among men for self-government,” then “nothing less than the chains of despotism” can restrain us “from destroying and devouring one another.”
Adam J. White is a research fellow at the Hoover Institution.
Cathy Young
A writer gets expelled from the World Science Fiction Convention for criticizing the sci-fi community’s preoccupation with racial and gender “inclusivity” while moderating a panel. An assault on free speech, or an exercise of free association? How about when students demand the disinvitation of a speaker—or disrupt the speech? When a critic of feminism gets banned from a social-media platform for unspecified “abuse”?
Such questions are at the heart of many recent free-speech controversies. There is no censorship by government; but how concerned should we be when private actors effectively suppress unpopular speech? Even in the freest society, some speech will—and should—be considered odious and banished to unsavory fringes. No one weeps for ostracized Holocaust deniers or pedophilia apologists.
But shunned speech needs to remain a narrow exception—or acceptable speech will inexorably shrink. As current Federal Communications Commission chairman Ajit Pai cautioned last year, First Amendment protections will be hollowed out unless undergirded by cultural values that support a free marketplace of ideas.
Sometimes, attacks on speech come from the right. In 2003, an Iraq War critic, reporter Chris Hedges, was silenced at Rockford College in Illinois by hecklers who unplugged the microphone and rushed the stage; some conservative pundits defended this as robust protest. Yet the current climate on the left—in universities, on social media, in “progressive” journalism, in intellectual circles—is particularly hostile to free expression. The identity-politics left, fixated on subtle oppressions embedded in everyday attitudes and language, sees speech-policing as the solution.
Is hostility to free-speech values on the rise? New York magazine columnist Jesse Singal argues that support for restrictions on public speech offensive to minorities has remained steady, and fairly high, since the 1970s. Perhaps. But the range of what qualifies as offensive—and which groups are to be shielded—has expanded dramatically. In our time, a leading liberal magazine, the New Republic, can defend calls to destroy a painting of lynching victim Emmett Till because the artist is white and guilty of “cultural appropriation,” and a feminist academic journal can be bullied into apologizing for an article on transgender issues that dares to mention “male genitalia.”
There is also a distinct trend of “bad” speech being squelched by coercion, not just disapproval. That includes the incidents at Middlebury College in Vermont and at Claremont McKenna in California, where mobs not only prevented conservative speakers—Charles Murray and Heather Mac Donald—from addressing audiences but physically threatened them as well. It also includes the use of civil-rights legislation to enforce goodthink in the workplace: Businesses may face stiff fines if they don’t force employees to call a “non-binary” co-worker by the singular “they,” even when talking among themselves.
These trends make a mockery of liberalism and enable the kind of backlash we have seen with Donald Trump’s election. But the backlash can bring its own brand of authoritarianism. It’s time to start rebuilding the culture of free speech across political divisions—a project that demands, above all, genuine openness and intellectual consistency. Otherwise it will remain, as the late, great Nat Hentoff put it, a call for “free speech for me, but not for thee.”
Cathy Young is a contributing editor at Reason.
Robert J. Zimmer
Free speech is not a natural feature of human society. Many people are comfortable with free expression for views they agree with but would withhold this privilege from those they deem offensive. People justify such restrictions by various means: the appeal to moral certainty, political agendas, demand for change, opposing change, retaining power, resisting authority, or, more recently, not wanting to feel uncomfortable. Moral certainty about one’s views or a willingness to indulge one’s emotions makes it easy to assert that others are doing true damage or creating unacceptable offense simply by presenting a fundamentally different perspective.
The resulting challenges to free expression may come in the form of laws, threats, pressure (whether societal, group, or organizational), or self-censorship in the face of a prevailing consensus. Specific forms of challenge may be more or less pronounced as circumstances vary. But the widespread temptation to consider the silencing of “objectionable” viewpoints as acceptable implies that the challenge to free expression is always present.
The United States today is no exception. We benefit from the First Amendment, which asserts that the government shall make no law abridging the freedom of speech. However, fostering a society supporting free expression involves matters far beyond the law. The ongoing and increasing demonization of one group by another creates a political and social environment conducive to suppressing speech. Even violent acts opposing speech can become acceptable or encouraged. Such behavior is evident at both political rallies and university events. Our greatest current threat to free expression is the emergence of a national culture that accepts the legitimacy of suppression of speech deemed objectionable by a segment of the population.
University and college campuses present a particularly vivid instance of this cultural shift. There have been many well-publicized episodes of speakers being disinvited or prevented from speaking because of their views. However, the problem is much deeper, as there is significant self-censorship on many campuses. Both faculty and students sometimes find themselves silenced by social and institutional pressures to conform to “acceptable” views. Ironically, the very mission of universities and colleges to provide a powerful and deeply enriching education for their students demands that they embrace and protect free expression and open discourse. Failing to do so significantly diminishes the quality of the education they provide.
My own institution, the University of Chicago, through the words and actions of its faculty and leaders since its founding, has asserted the importance of free expression and its essential role in embracing intellectual challenge. We continue to do so today as articulated by the Chicago Principles, which strongly affirm that “the University’s fundamental commitment is to the principle that debate or deliberation may not be suppressed because the ideas put forth are thought by some or even by most members of the University community to be offensive, unwise, immoral, or wrong-headed.” It is only in such an environment that universities can fulfill their own highest aspirations and provide leadership by demonstrating the value of free speech within society more broadly. A number of universities have joined us in reinforcing these values. But it remains to be seen whether the faculty and leaders of many institutions will truly stand up for these values, and in doing so provide a model for society as a whole.
Robert J. Zimmer is the president of the University of Chicago.