COMMENTARY recently sent the following inquiry to a number
of American intellectuals of varying political views:
The 80’s are more and more coming to be characterized by journalists, historians, and intellectuals as a costly if not a disastrous decade for America. At home, it is charged, the economic and social policies of the Reagan administration stimulated greed on Wall Street and in the business community, encouraged a general mood of selfishness, and were responsible for an actual increase of poverty (as symbolized in particular by the plight of the homeless). Abroad, the indictment continues, indiscriminate bellicosity on a global scale succeeded mainly in provoking a dangerous arms race, needlessly exacerbating and prolonging conflicts with adversaries, alienating friends, and sowing the seeds of future mistrust of American intentions.
Do you accept these characterizations? If so, how would you account for the worldwide triumph by the end of the decade of the prevailing ideas and policies of the American 80’s? If not, how do you account for the fact that so many people in America itself have been condemning those ideas and policies?
The responses—nineteen in all—follow.
This symposium is sponsored by the Harry Elson COMMENTARY Fund.
Jeane J. Kirkpatrick:
Will history give Ronald Reagan credit for his accomplishments? I have been asked this question many times by relatively nonpolitical American Reagan supporters.
COMMENTARY’s symposium reminds us that there is a more basic question: will the very existence of these accomplishments be acknowledged, or will they be ignored and obscured by an interpretation of events that jumps from Jimmy Carter’s human-rights policy to Carter monitoring elections in Nicaragua in 1990; or from the end of negotiations on SALT II to George Bush signing a new START treaty?
Will the collapse of Communism be ascribed simply to internal Soviet weaknesses—as if U.S. and Western policies played no significant role?
All this could happen. The writing of history is the job of a largely adversary professoriate and a largely adversary media, not much affected by the events of the past year.
It is possible that the collapse of Communism as a world revolutionary movement, its elimination as a framework for the critical analysis of capitalism and as an alternative focus of loyalty, may eventually have an effect on the relationship of the American Left with the United States, but I doubt it. I believe the Left’s attitudes toward America have influenced its attitudes toward Communism, rather than the reverse. These attitudes toward the United States are deeply rooted, and shielded from evidence because they are not based on evidence.
I believe one of the distinctive attributes of the American Left is a broad, though not universal, alienation from the dominant American society and culture which constitutes a cognitive screen through which new information about the world must pass before it is assimilated. This creates the negative evaluation of the 80’s confronted in this symposium.
What kind of alienation am I talking about? An example at hand is in a recent column by William Pfaff on Canada in the International Herald Tribune. He writes:
We shot up our West—murdered the Indians, overgrazed and overplowed the land into dust. Now we are de-treeing it. You sent your Redcoats to police and bring order to the West, singing—we were led to suppose—a neat baritone while going about it. How impressive to have won the West your way.
This excerpt (part of a longer, equally prejudiced and prejudicial comparison) has no special significance. But it is typical of the tendency on the Left toward sweeping condemnation of U.S. national character and practices and easy praise of others.
Let me be clear. I understand that not everyone on the American Left is alienated from the United States. Like all broad political movements the American Left contains a number of potentially incompatible tendencies. Traditional Democrats, including many blacks and most of the labor movement, are not alienated from America. There are also some fairly traditional Marxists and quasi-Marxists for whom alienation is not central. But there are also denizens of the counterculture who contribute to the American Left a broad streak of romanticism, anarchism, and alienation largely missing from the serious social-democratic and Communist parties of Europe (though this streak has surfaced in Germany’s “Greens” and on the edges of Britain’s Labor party).
The anti-establishment, anti-bourgeois romantic Left has had in common with democratic-socialist and major Marxist parties the rejection of capitalism. But serious socialists reject capitalism because they think socialism can provide a better life and a higher standard of living for more people, not because they reject bourgeois styles of life or are contemptuous of bourgeois values. Persons whose politics are motivated chiefly by the desire to promote the best life for the most people change their minds and their policies on the basis of new evidence, and in recent years, many democratic-socialist parties and some non-democratic ones have abandoned key socialist policies in favor of market strategies on the basis of accumulating evidence of socialism’s failure and capitalism’s economic success. But the American Left has had special problems assimilating these facts.
I believe the reason is that many have been attracted to socialism less as an economic system that would work than as a system which disdained capitalism and capitalist society.
I believe they disdain capitalism for much the same reason they disdain John Wayne and Ronald Reagan—because it seems so American. Americans like Horatio Alger, the American dream, the homely virtues, the promise of reward for hard work. The heroes of the counterculture also work hard and live by the rules—and lose their farms to greedy bankers who divert depositors’ funds to support Ronald Reagan and his ilk. In this world of the counterculture, the good guys are all anti-establishment. Economics becomes a subfield in an ongoing morality play. In that morality play it matters less that socialists are unproductive than that they are anti-capitalist.
The sociologist Seymour Martin Lipset has recently described the process by which one Western country after another has abandoned key socialist strategies after their economies foundered, and adopted market mechanisms instead. But not American Democrats. In attempting to explain why “the national Democrats have been more disposed to adhere to a redistributionist, progressive tax, anti-business orientation than most social-democratic parties elsewhere,” Lipset returns—as I think we must—to the thesis of Richard Hofstadter and Lionel Trilling that for American intellectuals, “the attachment [to Marxism] has been inspired and sustained more by a desire to be anti-establishment, to be adversarial toward bourgeois and national patriotic values, than by a concern to implement specific political and social programs.” I rebel, therefore I am.
These aspects of the American Left’s relationship to its society are present in speeches at Democratic conventions and in Democratic primaries: the America they depict is one of debt and depression, of decline, homelessness, and hopelessness, of insecurity and intolerance, of neglected children and neglected aged. Ronald Reagan’s America—as they depict it—is a callous, careless society with polluted air and water and drug-ridden, violent cities in which the homeless are left to freeze on winter streets. It is an America of greed and need.
This crime-ridden America projects its lawlessness abroad. It is described as the “world’s biggest deadbeat,” as a great “international lawbreaker.” At the Democratic convention of 1988, Ronald Reagan’s administrations were decried as irresponsible in economic policy, callous in social policy, and trigger-happy in foreign policy.
Do I exaggerate? No. They exaggerate.
The alienation of the American Left is manifest in the eager magnification of American failings and the reluctance to acknowledge American achievements. It is clear in the enthusiasm for such interesting “rights” as flag burning and in the eager condemnation of such “crimes” as John Poindexter’s. This alienation is nowhere more obvious than in the tendency to blame America first—for almost everything almost anywhere. It has cost the national Democrats votes and has been recognized by that party as a problem.
How can people who see the American economy in terms of depression, deficit, and decline accept the news that others find it the most promising system of all?
How can people who believed Ronald Reagan’s description of the Soviet Union as an “Evil Empire” to be dangerous and ludicrous accept the news from people who lived under it that it was a repressive regime in which alien rule was sustained by force?
How can they bear the pain of learning definitively that Eastern-bloc socialism had never won the allegiance of those who lived under it? Only by denial. Suddenly, all the old arguments have been settled: virtually no one chooses to live under Communism. Given a chance, people choose free governments and free markets. The speed with which Poland, Hungary, East Germany, and Czechoslovakia seized the opportunity to dump Communism and Communist rulers, transform their regimes, and loosen ties to the Warsaw Pact resolved once and for all the outstanding questions about whether those people of Eastern Europe had “chosen” socialism.
Worse, from the point of view of the Left, must be the widely acknowledged failure of socialist strategies even in a democratic context and the subsequent abandonment of central planning, redistributionist tax policies, and systems of comprehensive social insurance. “From Scandinavia to the South Pacific, socialist and other left-wing parties have taken the road back to capitalism,” writes Lipset.
This decision to travel “the road back” from socialism to capitalism rests on three conclusions—each of which contradicts central aspects of Marxist and quasi-Marxist perspectives: first, that socialism is an inefficient, unproductive economic system; second, that capitalism is a more effective, more productive system; and third, that it is possible to “go backward” in history. Of all these, I think the news most difficult for the American Left has been capitalism’s success, because accepting it means conceding America’s success. It means conceding that the Left got it all wrong: in fact there was no increase in poverty, no sudden explosion of greed, no adventurism in foreign policy, no increased misery of the poor in the 80’s.
There was a dramatic increase in drug addiction and in certain kinds of social ills for which the counterculture and liberal extremists bear a special responsibility. Homelessness is the very best example of an ill dramatically exacerbated by “liberals” who push out of mental hospitals those too sick to remember to take the medicine that makes them relatively sane, and who keep in the streets in the name of freedom people who have long since become slaves to addiction.
The alienated leftists make it clear day after day that they would rather curse the society than help the people in it cope. Only recently, for example, the New York Times reported new findings concerning the biological basis of addiction which, it said, “will result in an entirely new strategy for fighting drug addiction.” One might have thought this good news, but no. . . .
“I object to seeing the vulnerability in the person rather than in their poverty,” said a University of Colorado sociologist, apparently unmoved by the prospective good the new medicine might do. “Even if the scientific evidence turns out to be strong, people have the right to refuse being tested or given medication,” added the director of an “advocacy group for the mentally ill,” less concerned about the misery than the “right” to be ill.
Our homeless problem does reflect a failure of Ronald Reagan: his failure to solve the problem of “liberal” extremism which is ready to sacrifice persons to abstractions.
It matters that there are influential, highly educated Americans who think that the United States is a cold, careless society in debt and decline. But their views are not a conclusion based on evidence, and thus cannot be disproved by evidence. Too bad, because the evidence is now so abundantly at hand.
Robert B. Reich:
How can the decade just ended exemplify both the failure of American capitalism and, simultaneously, the triumph of American capitalism? There are two plausible explanations. The second is more convincing.
1. The warts-and-all hypothesis. For all its obvious shortcomings, American capitalism is still so superior to Communism that it will dramatically improve the lives of Eastern Europeans, Soviets, and others around the world nonetheless, and they know it.
Yes, certain blemishes were revealed during the last decade. American productivity has risen at a snail’s pace; the nation went from being the largest creditor to being the largest debtor in the world; it has lost market share in many of the technologies of the future; American companies are being bought up by non-Americans; the average real income of Americans has stagnated; American schools are falling apart, aptitude tests are down, and one out of every five young American adults is functionally illiterate; the rates of infant mortality in many of America’s major cities rival those of Third World nations; and a growing portion of the American population—including one out of every four children under the age of six—is impoverished, at least according to the way Americans define poverty.
But these are quibbles. In the world economy, everything is relative. During the same decade, Communist systems fared much worse. In fact, the gap between the two systems—American-style capitalism, and Soviet-style central planning—became so wide that the latter could not endure.
The relative collapse of central planning was due largely to new demands placed on all economic systems by rapidly changing information technologies. Central planning is up to the task of producing masses of identical objects, like steel ingots. Indeed, Soviet steel production increased by a brisk 9 percent a year between 1945 and 1960—enabling Nikita Khrushchev credibly to boast in 1959 that at the rate his economy was growing the Soviets would overtake America within twenty years. But when it comes to smaller runs of more intricate things that must be carefully synchronized with one another and continuously adapted to new opportunities and circumstances, central planning is hopeless. Bureaucratic coordination is impossible; there are too many variables. For designing and producing networks of computers, software, semiconductors, and fiber optics, and utilizing them to accomplish all sorts of tasks, there is no substitute for a decentralized price system that constantly signals how and where new ideas can be put to their best uses.
2. The it’s-not-American-capitalism-they-want-anyway hypothesis. Granted all of the above about the inevitable failure of central planning, there are capitalist alternatives other than the American Way. And the record of the 80’s suggests that the alternatives might be superior.
If you happen to live near Japan, South Korea, Taiwan, or Singapore you are probably considering adopting some variant on their highly successful forms of neo-mercantilist capitalism. To this end, take the following measures: hold down consumption; push exports; subsidize high technologies; pour huge amounts of money into education, training, and research and development; avoid the bureaucratic inefficiencies of central planning but keep some of its scale economies by consolidating your firms into giant industrial groups, each with its own lead bank; keep everyone on his toes by forcing the groups to compete vigorously with one another; but also create among your business, labor, and government elites a tight coordinating system, so that whoever wins from economic change can quickly compensate whoever loses. Most importantly, don’t be overly enamored of the price system. Use it to signal growth opportunities, but then rapidly mobilize the nation’s resources to capture them.
If you live near Western Europe, on the other hand, you’re probably considering adopting social-democratic capitalism. Western European productivity soared throughout the 80’s, and the anticipated gain from economic union in 1992 has summoned a deluge of global capital. How to institute social-democratic capitalism? Easy: rely on free markets, but establish public health insurance and generous unemployment insurance; set up a highly progressive income tax; give organized labor a major say in corporate governance; have your big banks, labor unions, and finance ministries steer macroeconomic policy; so long as a strong social safety net is in place, tolerate unemployment for the sake of rapid productivity growth; pool public and private resources for major research-and-development projects; invest heavily in public works; provide workers with an abundance of apprenticeship, training, and retraining programs, and help them find jobs and relocate themselves. Also, consider sacrificing some political sovereignty in order to create economic opportunities that originate across your borders.
None of this is meant to suggest that the average Pole, Hungarian, Russian, Vietnamese, or Thai is carefully weighing and balancing these possibilities. Their overwhelming desire, presumably, is to get food on their tables and VCR’s on their television sets, and they care little about what the economic system is called or exactly how it is run. The point, rather, is that the alternative to central planning is not necessarily American capitalism. Those who are emerging from Communism are more likely to be attracted by a wartless variety.
If the recent past is any guide, it can be expected that the future ideological battleground will not be between Communism and capitalism, but among these three fundamentally different kinds of capitalism: East Asian neo-mercantilist, European social-democratic, and American. If they stick to their paths and we stick to ours, the differences between the American kind and the others are likely to be far greater by the year 2000 than they are today, and the relative unpopularity of our errant variety, far more apparent.
Free at last! Let us rejoice at the end of the 80’s. Even as I write, a cleansing wave of retrospective rage sweeps through the media at the greed and venality, jingoism and philistinism that emerged in America with the election of Ronald Reagan to the presidency. A wave of euphoria engulfs us as Americans celebrate more inspiring leaders, leftist leaders—Time’s Man of the Decade Mikhail Gorbachev and the media’s man of the millennium Nelson Mandela.
Mandela opens the 90’s by proclaiming his creed of racial harmony from Yankee Stadium to the U.S. Congress, hailing such pacific prophets as Yasir Arafat, Muammar Qaddafi, the PLO, the IRA, and the Puerto Rican redeemers who shot five U.S. Congressmen in 1954. In contrast to the bellicose Reagan, Gorbachev now pursues a policy of peace, raising defense spending by 20 percent as a share of Soviet output (from 10 to 12 percent of GNP).
Free at last! The American intelligentsia now revels in the prospect of higher taxes and higher spending and looks longingly toward environmental catastrophe. Taxes will bring a new spirit of “equal sacrifice” to our “plutocratic” and “undertaxed” nation in which government now spends a “niggling” one-third of GNP (without the help of socialized medicine that balloons the totals of other Western nations). Environmental disasters will put politicians (and intellectuals) back into control once and for all, and end the business bid for primacy in our national life.
Crime rates, too, seem likely to improve soon, as the clogged court system prepares to end “wilding” sprees in Washington and New York through the incarceration of John Poindexter and Michael Milken. Faced with the threat to public health and safety posed by these men, the government refreshingly eschewed the folderol of specific criminal charges and is putting these men away merely as symbols of 80’s excess.
Now in the 90’s, the U.S. can turn away from the “sterile” conflicts of the cold war and focus instead on the real threats to America—Israel and Japan. In a bold response to the threat of high-quality Japanese imports, the U.S. has already moved to dissuade the Japanese from working more than forty hours a week. Perhaps in time the Israelis can be persuaded to stop defending themselves and the West from terrorism so effectively, as this distresses potential American allies such as Iran, Syria, Iraq, Libya, Cuba, and Uganda.
It is easy to make fun of the fashionable attitudes toward the 80’s. But the question remains: why does the memory of the past decade make all these highbrows ache and furrow and the media mouths froth and sneer? The reason is obvious. For most of the American intelligentsia, the 80’s were nearly a total disaster.
The 80’s were the era when the leftist dreams all collapsed in travesty. The mock-heroic youth of the 60’s emerged from schools full of self-importance, sure that the world was evil and owed them a living for their moral superiority, and capable of making no contribution to society except teaching their crippling creeds to future generations. The 80’s taught them the unwanted lesson that Marxist slogans, a sense of grievance, and a rhetoric of rights are economically useless. Disdainful of science, enterprise, and other practical learning, they moved into law, teaching, and politics. Incapable of performing any useful task for a business, they thronged into the environmental movement where they could harass businesses from a moral pinnacle without submitting to the humbling discipline of serving customers. The youth of the 60’s crowded onto the pulpits of the media and the academy in such numbers that a reporter or professor made less money than a garbage man. Seething at such obvious inequities of capitalism, they castigated the prosperous for “greed” and “workaholism.”
The 80’s were the decade when socialism died and left nothing but a bristling carcass of weapons pointed toward the West. It was the decade when tax rates were cut in 55 nations and revenues dropped only in nations that raised their rates. It was the era when capitalism at last demonstrated conclusively its superiority as an economic system. It was the era when U.S. economic growth rates, long lagging behind the rest of the world, surged ahead of Europe, Africa, and Latin America, and nearly caught up with Japan’s for the first time since the early 50’s.
The 80’s also saw the longest peacetime expansion on record, continuing today, with the highest rates of investment in capital equipment and the highest rates of manufacturing productivity growth of any postwar recovery. During the 80’s, the U.S. increased its share of global manufacturing output, global exports, and global GNP. Contrary to thousands of reports to the contrary, U.S. balance sheets mostly improved, with debt as a share of assets dropping drastically for both businesses and households, as equity and real-estate values rose far more rapidly than indebtedness. Even government debt, as a share of GNP or in relation to real national assets, remained modest by historic and international standards.
After tax-rate reductions in the early 80’s took effect in 1983 and 1984, revenues rose some 9 percent a year in real terms, far faster than during the high-tax 70’s. During the 80’s recovery, industrial output rose nearly 40 percent, personal income 20 percent, and all segments of American society benefited from the creation of 22 million new jobs at rising real wages. Black employment rose 30.3 percent and Hispanic employment nearly 50 percent.
Why then all the gloom-mongering? Chiefly because, judged by the critical index of marital stability, which is what matters most, the 80’s brought tragedy to millions of American families and their children. Contrary to all the claims of the Left, female-headed families are mostly a disaster, incapable of disciplining boys or of escaping poverty. Poverty, however, is not the key problem. Census figures showed that the poorest Americans spent some three times more money than they reported as income. Their problem was not chiefly monetary but a breakdown of the moral codes of civilized society.
Although not economic in nature, this persistent problem of family breakdown cast a pall of failure on all the economic triumphs of the 80’s. Both sought and desired by the Left as a form of liberation, family breakdown accounted for most of the crime, drug, and other “poverty” problems widely blamed on Reagan policy. Without paternal discipline and role models, teenage boys run amok in nearly all societies.
Most of all, family dissolution was crucial to the single greatest propaganda triumph of the Left since the Great Depression: the homeless who haunt all the proud towers of 80’s prosperity. Like the Great Depression, which was caused by massive tax and tariff hikes, the homeless problem is a harvest of failed government policies: favoring divorce and illegitimacy over marriage, deinstitutionalizing the mentally ill, rescinding vagrancy laws, and stifling cheap housing with regulations, codes, and controls. But in the morbid feedback loops of liberalism, the answer is always more government subsidies for homelessness and family breakdown and thus more propaganda for the enemies of America. As the most gullible of Americans, many intellectuals blamed the homeless on the capitalist successes of the 80’s.
A more basic reason so many intellectuals resent the 80’s is that they uphold an adversary culture, and the decade saw the triumph of mainstream values. American intellectuals prefer socialist regimes where intellectuals seem to rule rather than an America where power and status accrue chiefly to the providers of useful goods and services to others. Many intellectuals prefer a so-called “society of poets” where most people live in fear and famine (Nicaragua or Cuba) to a society where capitalists surpass intellectuals in income and status.
This preference of the intellectuals became obvious in the debates over taxes. Objectively the debate should be over and the Left in rout. It was predicted in Congress that the Reagan tax cuts would reduce the share of real taxes paid by the “rich”—those earning over $200,000—to 7.8 percent of total income-tax receipts; the same was predicted for the relatively poor, those earning under $15,000. Instead, by 1987 (the latest available detailed data), payments by the rich had soared to nearly 20 percent of all income-tax revenues and payments by the poor plummeted to 2.8 percent. As supply-siders predicted, the rich did pay a lower share of their incomes than previously, but they earned and declared more income, thus increasing both their absolute and proportionate tax contribution. This healthy and desirable result, which meant that American elites were working harder and investing more effectively, makes the New Republic, Kevin Phillips, and other levelers seethe.
In the long run, it is impossible to get more money out of the rich unless they earn and produce more. More rich people and more production are the essence of economic growth, which is indispensable both to any enduring improvement of the lot of the poor and to any enduring increase in government revenues.
The fact is that people in America with enough gumption to work hard and keep their families intact thrived during the 80’s. The share of all families with real incomes over $50,000 rose from 20 percent to 26 percent and the share earning more than $35,000 rose from 41 percent to 46 percent.
In capitalism, the rich and well-off usually deserve their money, not as a prize for virtue or a celebration of consumption, but for their superiority as investors. Capitalism works because people who demonstrate their ability to create wealth also win the right to reinvest it. The experience of the 80’s around the world showed the effectiveness of this system both in increasing government revenues and expanding opportunity and mobility for the poor.
High tax rates do nothing to stop people from being rich; if you are already rich, you can always manipulate your funds and properties in ways that avoid taxation. British aristocrats and Swedish tycoons have demonstrated this for decades. High tax rates, however, are very effective in preventing poor and middle-class people from getting rich by working harder and more resourcefully than the classes above them. Thus these high rates impoverish entire societies. The 80’s forced the leftist intellectuals to make clear that they prefer equally distributed poverty to the enriching inequalities of capitalism.
The greatest triumph of the 80’s, however, was not fully evident in any economic data. It was the computer revolution, entirely a product of relentless discipline and creative genius in capitalist nations. Computer-industry revenues more than quadrupled, unit sales rose by a factor of hundreds, and computer cost effectiveness rose ten-thousandfold. At the end of the decade, U.S. companies still hold some two-thirds of the world market, and in critical software and leading-edge microchips their market share is above 70 percent and growing. In particular, the U.S. leads in using personal computers, with well over half of the world’s 100 million PC’s located in the U.S. in 1990. The U.S. has three times as much computer power per capita as the Japanese.
This development, which impelled most of the world’s economic growth during the decade, was also disastrous for the Left. The Left has always pinned its hopes on politics. The converging technologies of computers and telecommunications are radically reducing the power of politicians. An ever-increasing share of world wealth assumes the mobile form of information technologies which, unlike the massive industrial systems of the past, are difficult to measure, capture, or tax. The computer age is an age of mind, elusive and hard to control. This ascent of mind is devaluing all the entrenchments of material resources and geography within the ken and command of politicians. As Gorbachev himself has observed, the computer revolution was critical to the crisis of Communism: “We were among the last to understand that in the age of information sciences the most valuable asset is knowledge, springing from human imagination and creativity. We will be paying for our mistake for many years to come.”
Perhaps Gorbachev was the leftist most visibly shaken by the experience of the 80’s. But all suffered a similar discomfiture and few enjoyed it. In the coming decade, politicians will fight back with an array of seductive appeals to intellectuals. Obsolete notions of national autonomy and self-determination—favoring the ruling elites and bureaucracies against the forces of global capitalism—will ring out at the UN, in every trade-policy forum, and on every campus. Using trade-gap data that are meaningless in an information age, politicians will try to divide business with protectionist demagoguery against Japanese rivals. “Balanced trade” will never again be seen on earth, and it can be pursued only by a globally impoverishing expansion of government power over international flows of goods and investments. A key issue for the future of America is which side the intellectuals will now take: the side of materialist envy and political prejudice, or the side of opportunity and intellect.
In the conflict between matter and mind, the euphoria of the 90’s poses a real threat to the U.S. no less serious than cold-war Communism. Abetted by nihilist and terrorist forces lashing out against the wealth they can neither earn nor produce, and exploiting the vast destructive powers of modern weaponry, the enemies of civilization are more dangerous than ever. Of all the Western nations, only Israel fully comprehends this menace and is appropriately resolute and resourceful in resisting it. Thus the new socialist assault, with its nihilist edge, will often target Israel as the world’s leading obstacle to “peace.” Israel’s ability to resist these pressures ironically will depend on its own capacity to transcend the socialist forces within its own borders that have turned what should be one of the world’s most prosperous nations into a dependent of American charity.
In the 90’s, intellectuals must understand that the reason to defend free societies is not their legal rectitude in international disputes or any obsolete shibboleth of national self-determination in little despotic satrapies around the globe. The reason to defend free nations is their role as the capitalist bearers of civilization and human progress, freedom and technology, in a world that will sink into unspeakable horror without their leadership.
The characterization of American failures in the 80’s offered by the editors of COMMENTARY seems to me excessively mild. For it’s not just that Wall Street has gone piratical and that buccaneers have looted the savings-and-loan (S & L) banks and that the 80’s turn out to have been, as ought to be noted, probably the most corrupt business decade in American history. Government, too, has descended further into corruption than at any time in living memory. Nor is it just that, in regard to foreign affairs, American bellicosity has had a bad effect on national interests. The election of the Reagan administration at the start of the decade produced horrific massacres in El Salvador, not without CIA complicity, and led to the first death squads in Honduras. In Nicaragua, tens of thousands of people have been killed as a result of our policies, and for no purpose that would not better have been served by following the peaceful advice of that sagacious social democrat, Oscar Arias of Costa Rica—whose own administration was, by the way, shamelessly undercut and subverted by White House shenanigans during the period in question. So there has been a mortal cost as well as costs that are moral and monetary.
There has been a cost to democracy in Washington itself, since the 80’s were marked by still another major Republican political scandal (after McCarthyism and Watergate), showing once again how, in America, the greatest threat to liberty comes from fanaticism of the Right. On a national scale, the 80’s have seen such a cheapening of electoral debate that Eastern European democrats, when they picture the kind of political life they would like to encourage, make a point of citing the recent American elections as the sort of failing that ought to be avoided. And all of these developments have come on top of a shift of wealth so vast and undemocratic that, as the editors correctly remind us through a quiet reference to homelessness, walking to work in a city like New York means stepping over the diseased, addicted bodies of the most abject persons that anyone can imagine.
How to account, then, for “the worldwide triumph . . . of the prevailing ideas and policies of the American 80’s”? The question is delusionary. It is like the stuttering airplane passenger in Salman Rushdie’s Satanic Verses who flaps his arms every time his plane rushes down the runway, and after take-off turns to his seat-mate and says: “Wowoworks every time.” Yes, the federal budget grew amid a lot of rhetoric affirming otherwise; social welfare was diminished; law, conscience, justice, and Congress were flouted; wars were fanned. And sure enough, some happy events soon occurred, namely, the world democracy movement, the collapse of Communism plus that of two or three fascistic dictatorships on the Right. But the prior existence of A does not indicate a causal relation to B.
In one respect, though, a broad if not exactly worldwide “triumph” of the “prevailing ideas and policies of the American 80’s” may indeed have occurred. There has been a public-relations triumph. In many parts of the world, something called “Reaganism” has come to be admired. This can be explained by two factors. The United States still plays an emancipatory role in many parts of the world and especially in Europe—the role that was described most powerfully by Henry Adams a century ago. Morally you might suppose that the American ideal would long ago have drowned in seas of blood and injustice. But the ideal is unsinkable; not even the worst and most reactionary of decades can overwhelm it. The thing bobs back into view, and people look at America from far distances and keep noticing that political liberty and individual freedom and the possibility of self-improvement are not the ideological mirages that sometimes they are said to be—no matter how grim may be America’s other problems. And since those faraway people cannot easily distinguish between the ancient American ideal and a given ephemeral administration or era, they are likely to assign to that ideal whatever name appears most current. If some recent elections had gone a little differently, that name might well have been “Carterism,” “Mondaleism,” or “Dukakisism.”
Instead “Reaganism” enjoys a wide vogue. (No one will ever speak of “Bushism”; a man requires character to become a doctrine.) But apart from an identification with America, what does Reaganism (“the prevailing ideas and policies of the American 80’s”) mean? In this country, Reaganism means, I would say: laissez-faire and supply-side economics, diminished social welfare, diminished concern for the rights of minorities, militarism, anti-Communist bombast, religion (especially fundamentalism), cultural conservatism, and a populist tolerance of such tendencies as the anti-Darwin movement in education. Reaganism’s meaning abroad is not the same. An Eastern European who uses the term is likely to have in mind “liberalism” in the European sense (market economics, secularism, enlightenment)—combined with sympathy for American styles that belong more to the Left than to the Right. An Eastern European “Reaganite” might well favor unregulated market economics, bebop, rock-and-roll, and “We Shall Overcome” as an all-purpose anthem. The students of Prague, who tend to admire Ronald Reagan personally, sang precisely that non-Reaganite song in Czech and in English as they marched through the streets on the day that the 1989 revolution broke out.
If we are to bandy about the relative triumphs of American decades, I would argue, in fact, for the radical 60’s, whose global successes would make a fine theme for a further symposium. The role of the American (and Anglo-American) 60’s in spreading democratic ideas turns out to have been enormous. Anyone can verify that by walking past the scruffy idealistic guitar-twangers of Prague’s Wenceslas Square. The enduring popularity of certain aspects of 60’s radicalism, stripped of some political and chemical errors of that era, may prove to be one of the main ways in which America’s emancipatory role continues to be performed. Imagine! (Lyrics from “Imagine,” the radical utopian song by John Lennon, are anti-Communist graffiti in Prague. And why not?) Of course, in the matter of decades, the principal credit for the miraculous events of 1989 still belongs to the 1770’s and 1780’s. History moves a lot slower than we have sometimes thought it would.
The editors of COMMENTARY have addressed the right issues, but they have put these issues cart foremost. They speak of the failings of the American 80’s as a debatable point (“Do you accept these characterizations?”). But they speak of the “worldwide triumph” of the “prevailing ideas and policies” of that decade as a simple reality. This reversing of the debatable and the factual is an example, I think, of the American habit of dousing ourselves with celebratory champagne in order not to notice the evidence of capitalist horrors that sprawls nightly across our own doorsteps. Even so, the editors’ question has the virtue of at least noticing, if only to dismiss its significance, a possible contrast between American failings and successes. That contrast has always been the great tragedy of our national life. If today there has been in some respect an American triumph (no matter whether we ascribe that success to the American 80’s or to other, grander, more radical factors), the contrast, therefore the tragedy, is only the greater. What are we going to do about that?
Did the Reagan administration sponsor a culture of greed and selfishness? Is there a culture of greed and selfishness? Oddly, the answer to the first question is “yes, to some extent,” even though the answer to the second is probably “not really.”
Kevin Phillips has handed up the most unrelenting indictment of the Reagan administration in his recent book, The Politics of Rich and Poor. Much as I disagree with his economic analysis, I think Phillips is right about the political ambiance. The Reagan administration often seemed grandly indifferent to the ethics of limited government.
My own exposure to this indifference involved the policy debates about poverty, welfare, and the underclass. There were bright spots. The second echelon of officials included some people who thought deeply and cared passionately about these problems. Sometimes they managed to get the right paragraph into a presidential speech; sometimes they managed to get the right provision into a legislative proposal. At the cabinet level, William J. Bennett was a splendid exception, doing for the educational debate what needed doing for the welfare debate. But the dominant impression conveyed by the Reagan White House was that an expanding economy would minimize poverty, the social safety net would handle the residual problems, and that the best way to cut the domestic budget was by getting rid of all that fraud and waste in programs like Medicaid and food stamps. When it came right down to it, one suspected, the top people in the administration didn’t lie awake nights worrying about the poor.
This would not have been so bad—people who lie awake nights worrying about the poor have done much mischief—except that the indifference coexisted with the Reagan administration’s failure to confront the central standard of fairness involved in reducing the size of government: what’s sauce for the goose also must be sauce for the gander.
Take, for example, the administration’s attitude toward cutting the budget. Large sums of money can be cut from the domestic side of the budget if one is willing to yank the middle class and the corporations from their many federal teats, politics be damned. In Reagan’s first term, several idealistic officials in the administration were eager to do just that. They were largely rebuffed, however, either by more senior officials in the administration or by Congress. They needed leadership from the top. Only Ronald Reagan had the stature and the podium to articulate a coherent program for cutting domestic spending in which everyone would lose some of his special privileges, but the payoff would be balanced budgets, less intrusive government, and simpler, fairer rules for everyone. He didn’t give that leadership. The implied message was that the Reagan administration was comfortable with government benefits if they went to the right people.
The S & L fiasco is another example of the way in which the administration exhibited the ethical carelessness that lends credibility to the greed-and-selfishness indictment. The blame for the fiasco is complicated and diffuse, much of it antedating 1981. But a bedrock principle of Reaganism (as I understand Reaganism) is that people must bear the consequences of their actions. Giving S & L lenders license to make risky loans without incurring commensurate risk themselves should have caused instant consternation in every administration official who understood the specifics of the deregulation. Apparently it didn’t. Whatever the excuses, the result was that the Reagan administration permitted precisely the same error—government-created incentives for irresponsible behavior—concerning affluent S & L bankers that it regarded with such horror when it saw criminals let off without punishment, teenage girls rewarded for having babies, or students given diplomas without having to study.
So I will not defend the Reagan administration against accusations that it encouraged an “I’m all right, Jack” mentality that in turn encouraged greed and selfishness. That said, it still remains unclear to me how much real effect this mentality had on the nation, and how much of the hoopla about the greed and selfishness of the 80’s described a reality. In this brief commentary, I will limit myself to two observations.
The first is that you can’t have sustained, sizable economic growth without producing something very like the phenomena that have gotten journalists so upset about greed and selfishness. Economic growth means that a lot of people get rich. When people get rich, some of them are going to buy fancy new houses and splashy jewelry and big boats. A look back in American history reminds one of the late 19th-century Gilded Age, the Roaring 20’s, or the post-World War II prosperity that prompted Vance Packard to write The Status Seekers. Kevin Phillips refers to this conspicuous consumption as the “capitalist blowout” and sees it as something that could have been prevented. I’m not convinced. If you want exuberant economic growth, you’re also going to get nouveaux riches and buccaneers and con artists and splendiferous camp followers as part of the package. It is one thing to argue that this will eventually produce a political counter-swing (which might well be true), and another to say that it was a product of public policy.
Preventing such effects in the 80’s would have been especially difficult because of the increasing size of the economic base. Measured in proportional terms, growth in Gross National Product from the end of the recession in 1983 to Reagan’s last year in 1988 has been matched by many other episodes in American history. But as the base gets larger, even modest percentage increases in GNP can produce unprecedented raw quantities of new wealth. The increase in just the five years from 1983 to 1988 was larger than the increase during the entire three decades from 1870 to 1900 that constituted the Gilded Age. It was larger than the increase from 1900 through the next three decades, to the peak of the boom in 1929. And by “larger,” I refer to per-capita changes in real GNP, controlling for both inflation and population size.
The 80’s were also exceptional in terms of the size of the newly affluent population. This is no place for a lengthy analysis of income distribution, but this fundamental fact about the Reagan years has been obscured by the rhetoric about the rich getting richer at the expense of the poor: using the most basic government data on distribution of wealth, money income of households, constant 1988 dollars, and the Bureau of the Census’s breakdown into nine income brackets, the proportion of households in every income bracket up to $35,000 decreased during the Reagan years. The proportion in the low-income brackets (up to $14,999) decreased from 29.4 to 27.3 percent of households from 1980 to 1988, while the proportion in the working- to middle-income brackets ($15,000 to $34,999) decreased from 37.5 to 34.6 percent.
This simple set of facts leaves much to be debated. It may be argued that the proportions of people in the lowest brackets decreased trivially, for example, or that the improvements reflect an increase in two-income families, not increases in wages. On the other side, it may be argued that household-income figures, which include welfare households, understate the true improvement among families who remained in the labor market. But it cannot be argued that during the Reagan years the poor got poorer or even that working-class households got poorer. They didn’t. They got somewhat less poor, though not dramatically so. Meanwhile, the big change occurred in the income brackets from $50,000 on up, which increased from 15.8 percent of households to 20.8 percent. This translates into a remarkable 6.3 million new households who moved above the $50,000 level during the Reagan years. It seems reasonable to infer that this newly prosperous group contributed to an atmosphere in which people seemed to be unusually preoccupied with material possessions. Affluent readers of this article may remind themselves of their spending behavior when they first found themselves in possession of discretionary income, even if that happy event occurred before the selfish and greedy 80’s.
In sum: the Reagan boom involved the creation of widespread wealth and sometimes great wealth, accompanied by the predictable side-effects. Radically redistributionist tax policies could have prevented these effects, but also would surely have killed the economic growth.
My second observation is that the behaviors that get in the newspapers don’t necessarily have much to do with the day-to-day life of most people. Even given the large numbers of people who became more prosperous during the 80’s, and granting that some of them behaved as the tabloids breathlessly describe, it is not obvious that many people behaved in ways that fit the stereotype. My question (I don’t have the answer) is this: what proportion of the American population experienced at first hand an increase in greed and selfishness, either in themselves or their neighbors, in the 80’s? Maybe my experience is atypical, but somehow I missed out. For me, the 80’s were the polar opposite of the image of the decade, a time for sinking roots and thinking more, not less, about obligations to family and community. As I mentally run down my list of friends, I can think of one couple who became noticeably more prosperous during the 80’s, but they, along with all the others, seem to have gotten less acquisitive, less self-absorbed, more concerned with others during the supposedly greedy and selfish decade.
There is a catch in this exercise, of course: I was in my forties for most of the decade, and so were most of my friends. I suggest that one final reason the decade was so commonly characterized as acquisitive and self-absorbed is that the Baby Boomers weren’t in their forties like me, but in their thirties. It is not a novel argument, but it bears repeating: ever since the 60’s, the Baby Boomers have defined the decade’s Zeitgeist—rebellion in the 60’s (the modal Baby Boomers were in their teens), narcissism in the 70’s (they were in their twenties), and acquisitiveness in the 80’s (they were in their thirties). I’m at a loss to explain precisely why this bulge in the population distribution so decisively affects the popular culture—they comprise only a modest proportion of the total population—but they have had that effect for three decades and there is no reason to think it will change during the next three.
Even as I write, the modal Baby Boomer is approaching his fortieth birthday and the characteristic attitudes of middle age. One may, therefore, confidently predict that no matter what happens in the White House, the 90’s will be a decade that celebrates family and community and traditional values. The Zeitgeist will change, and commentators on the culture will need to have explanations. Presidents are convenient for this purpose. (Could any presidential couple be a more convenient explanation for a return to family values than the quintessentially Dad-like George Bush and Mom-like Barbara Bush?) But that will not tell us how much American life has really changed, or how much the Bush administration has had to do with it—just as it is not clear how much American life changed during the 80’s, or how much the Beverly Hills Reagans had to do with it.
The collapse of Communism represents a victory, of sorts, for the West—but not much of a victory for the United States and certainly not for the “prevailing ideas and policies of the American 80’s.” The real winners, as everybody knows, are the West Germans and the Japanese, who owe their power and prosperity to a combination of circumstances having nothing to do with the “policies of the American 80’s.” While the United States and the Soviet Union were exhausting themselves in the production of armaments, West Germany and Japan, unburdened by competition in the arms race, rebuilt their shattered economies and cultivated the arts of peace. Unlike the United States, those countries learned something from the devastating experiences of the 30’s and 40’s. Whereas Americans learned only the dangers of “appeasement,” the West Germans and Japanese made a serious attempt to come to terms with their recent past. They gave up atavistic dreams of imperial conquest and racial destiny. Having been drawn more reluctantly into the modern world than other industrial nations, they embraced modernity, after World War II, with the enthusiasm of converts. They invested heavily in the modernization of infrastructure, replaced technologies geared to mass production with technologies geared to production for specialized markets, perfected first-rate systems of education that assured a trained and disciplined work force, built up an efficient civil service, and taxed their citizens in order to maintain elaborate health and welfare services. They drew on traditions of solidarity and collective discipline, inherited from the pre-modern past, and put them to work in the service of enlightened, secular objectives—peace, prosperity, health, social welfare.
These are not very exalted goals, and the success of West Germany and Japan illustrates not only the benefits of successful modernization but the price paid in the form of spiritual shallowness and a shallow concept of democracy—one that stresses the equitable distribution of material comforts rather than the character-forming effects of civic participation. But the point is that the “ideas and policies of the American 80’s” do not explain the success of West Germany and Japan or the revolutions in Eastern Europe. It is presumably the West German model, not the American or Soviet model, that attracts the populations of Eastern Europe, together with the prospect of membership in a European community that will give economically backward nations access to the benefits of development. Many people in Eastern Europe still think of the United States, no doubt, as a land of fabulous wealth; but further exposure will correct that impression and teach them to regard America as an example, if anything, of social and economic stagnation. A country that lacks many of the social services and amenities taken for granted in other advanced-industrial countries—national health insurance, good schools, efficient systems of public transportation, civic order—is unlikely to serve as a model or inspiration for Eastern Europe.
None of this is meant to deny that the collapse of the Soviet empire adds up to a diplomatic, military, and ideological victory for the West and even, in a limited sense, for the United States. Many years ago, George Kennan held out the hope that containment would eventually lead to the “breakup or mellowing of Soviet power.” He turns out to have been right: the United States forced the Soviet Union into an arms race that has wrecked its economy, perpetuated a corrupt and inefficient bureaucracy, and retarded the development of democratic institutions. Gorbachev’s resignation from the cold war is an admission of defeat, no doubt about it. It is above all an ideological defeat. Socialism can no longer claim to be the wave of the future. The hope that sustained a generation of leftists—that “actually existing socialism” would evolve toward “socialism with a human face,” while social-democratic regimes in the West would move toward a more thoroughgoing form of socialism—has suffered a blow from which it is not likely to recover. The removal of the socialist alternative from political debate (at least in its classic form) will have profound effects both in the East and in the West.
Still, it is too much to claim a victory for the type of free-market ideology promoted by Thatcher and Reagan. Britain and the United States are declining powers, and the privatization of public services has not only failed to arrest their decline but contributed to it. The ideological victory, if there is one, belongs to proponents of the “mixed economy” we used to hear so much about in the 50’s and 60’s. That we don’t hear much about the “mixed economy” today does not mean that it has disappeared, only that American liberals have lost control of the political agenda. Elsewhere in the industrial world, it remains the dominant polity—uninspiring though it may be.
The victory of the West, then, is not a victory for the free-market ideology that seeks to privatize everything in sight. Nor is it a vindication of American diplomacy, except in a highly provisional sense. We are witnessing the mellowing and possibly the breakup of Soviet power, but these developments may lead to destabilizing effects we will come to regret. In any case, we have already paid a heavy price for anything we can be said to have won. The containment policy, as critics pointed out from the beginning, required the creation of a global network of client states; and the need to sustain its credibility as the protector of those clients forced the United States into police actions, and finally into the disastrous war in Vietnam, that were inconsistent with its national interests. The cold war, moreover, caused serious distortions in the American economy. Military spending created jobs and promoted economic growth in the short run, but in the long run it deflected investment from plant expansion and modernization, making the United States weak in exports and more and more vulnerable to imports.
Besides undermining America’s position in world markets, the cold war brought about an institutional interpenetration of the corporations, government, and universities. It thus contributed to the centralization of economic and political power and widened the gap between the affluent, heavily subsidized sector of the American economy, which rests on technologies originally developed in connection with national defense, and the impoverished, technologically backward sector for which public subsidies are unavailable. The flexible technologies required by advanced societies—technologies compatible with decentralized control over production, a high degree of workers’ skill and initiative, and efficient use of energy (as opposed to heavy reliance on nonrenewable resources)—are not the kind of technologies that were encouraged by an economy based on the production of weapons. At a time when our competitors were beginning to perfect more sophisticated technologies, the defense economy forced the United States to adopt technologies suited only for mass production—technologies that enforced a rigid separation between the design and the execution of work, eradicated every vestige of workers’ control, and left the work force demoralized and apathetic.
Preoccupation with external affairs, during the long years of the cold war, led to the neglect of domestic reforms, even of basic services. Medical services, public health, and education are notoriously backward in this country. In the matter of education, the most dramatic instance of American decline, obsession with the Soviet threat led to ill-conceived reforms designed to educate a scientific elite, at the expense of an intelligent, enterprising, and politically knowledgeable work force. Here again, the Soviet threat proved to be far less serious than the economic threat from nations that understood the importance of education and rapidly outstripped both superpowers in the quality of their schools.
To list all the bad effects of the cold war would take more space than I have here. The development of a secret police, the erosion of civil liberties, the stifling of political debate in the interest of bipartisan consensus, the concentration of decision-making in the executive branch, the sheer growth of the executive and its declining accountability, the secrecy surrounding executive actions, the lying that has come to be accepted as routine in American politics—all these things derive either directly or indirectly from the cold war. Their worst effect has been to undermine confidence in government, to weaken our public culture, and to destroy the delicate fabric of trust on which civic life depends. If the West can be said to have won the cold war, the United States can hardly be said to have shared in the fruits of that victory. It would be closer to the truth to say that in the course of their long rivalry, the Soviet Union and the United States have destroyed each other as major powers, just as many critics of the cold war predicted. No doubt we have a long way to go before we reach Soviet levels of economic inefficiency, political apathy and cynicism, and bureaucratic intrigue; but that is pretty clearly the road we are traveling. We can rejoice that Eastern Europe has been delivered from Soviet control. For ourselves, however, there is little to celebrate.
Ronald Reagan is the first Republican I ever voted for for President, and I did it not once but twice. In 1976 I voted for Jimmy Carter, the man who should go down in history as the first President to use a blow dryer in the White House, but I couldn’t bring myself to do it a second time. A photograph of Carter, collapsed while jogging, weak in the knees, held up by secret-service men, provided a symbolism I felt I could not ignore. A collapsed jogger with a hot-comb hairdo and a big smile—that seemed to me precisely the condition of the United States under the man who preferred to be called Jimmy.
I am not at all sorry about having voted for Ronald Reagan, except perhaps only for the fact that having done so prevented me from scoring some fairly easy jokes off him. He made, let us face it, an easy target. A former and not particularly adept actor, a man entirely without intellectual interests or even pretensions (though consistently grammatical and well-spoken), married to a woman who seemed a damned icy little proposition—plenty of room here for the expression of casual contempt. Acquaintances, not under the burden of having voted for Reagan, were regularly telling me how “stupid” he was. “What,” I used to say in defense of my man, “has intelligence got to do with running the United States?”
That, I now realize, was only a half-joke. I don’t know how intelligent Ronald Reagan may be, though in ways that matter he is, in my view, more intelligent than either his predecessor (a graduate of Annapolis) or the man who succeeded him (Yale, Phi Beta Kappa). More important than intelligence, Reagan had, I think, correct beliefs. He believed (rightly) that the United States was slipping badly in almost every regard; he believed (again rightly) that Americans had lost confidence in themselves as a nation; he believed (bingo!) that Communism was no good thing and therefore was not to be encouraged by weakness or accorded undue respect; and he believed (more dubiously) that a common-sense approach could solve most contemporary problems. Ronald Reagan not only believed these things but he believed them deeply and absolutely, and, given the state of demoralization of the nation at large when he came into office, it was a good and useful thing that he did so believe them.
Most journalists, however, saw it quite differently. The Worst Years of Our Lives is how Barbara Ehrenreich, a heavy contributor to Mother Jones and Ms., titles her recent book of essays on the Reagan years; “Irreverent Notes from a Decade of Greed” runs her subtitle. Miss Ehrenreich bangs down on the melodeon of clichés with thick fingers. She blames Ronald Reagan for everything short of earthquakes that she doesn’t like in the United States. In her introduction, she tells us that her father, who has Alzheimer’s disease, when asked, as a test of mental competency, who was President of the country, would snort back, “Reagan, that dumb son of a bitch.” The little vegetables in nouvelle cuisine seem to Miss Ehrenreich the last word in decadence. Yuppie is a word that, in her moral lexicon, is right up there with fascist. With the thick-fingered method, you never have to worry about missing a note; this is accomplished by hitting all the keys at once:
Greed, the ancient lubricant of commerce, was declared a wholesome stimulant [during the Reagan years]. Nancy Reagan observed the deep recession of ’82 and ’83 by redecorating the White House, and continued with this Marie Antoinette theme while advising the underprivileged, the alienated, and the addicted to “say no.” Young people, mindful of their elders’ Wall Street capers, abandoned the study of useful things for finance banking and other occupations derived, ultimately, from three-card Monte. While the poor donned plastic outerwear and cardboard coverings, the affluent ran nearly naked through the streets, working off power meals of goat cheese, walnut oil, and crème fraîche.
Bet you didn’t know that you had lived through such hellish times. But, then, at Ms. and Mother Jones the tendency is not to accentuate the positive.
Barbara Ehrenreich’s clichés are not restricted to her or to her favorite publications. When it comes to writing about the Reagan years, most journalists sidle up to the melodeon. Here, for a notable example, is the New York Times columnist Russell Baker, bemoaning the increased expense of life on Nantucket, where he has had a summer home for more than twenty years: “During the Reagan years the island experienced an onset of decamillionaires; that is, people so rich they could spend a million dollars decorating a house that in 1965 might have sold for $25,000.” A tricky business complaining about the wealth of others, especially when one has one of the better jobs in journalism and a few best-sellers under one’s own belt. But let that pass. The larger point is that the chief clichés about the Reagan years, which are now beginning to harden, have to do with unexampled greed: decamillionaires jogging off their nouvelle cuisine while chewing up the countryside.
Or consider the Yuppie, the principal cliché villain of the 80’s (though one heard a bit less about him as the decade drew to its close). “It’s one of those goddamn Yuppie restaurants,” was a sentence I seem often to have heard during the 80’s. “The Yuppies have moved in and wrecked the neighborhood,” was another. So widespread was the contempt connected with the term that it wasn’t easy to find anyone ready to own up to being one, though they were usually readily enough spotted: a Yuppie was generally the other fellow. And yet what was so heinous about being a Yuppie? If the Yuppies can be said truly to have existed, then they were young men and women who worked twelve- and fourteen-hour days in order to indulge their penchant for designer clothes, BMW’s and Saab’s, gorgeous grub, and minor appliances. The world, surely, has known worse villains, yet the Yuppie, as a social type, a real Reagan phenomenon, could certainly get people worked up into a frenzy. Perhaps it was because they weren’t as spiritual as you and I—or at least I.
Even though I twice voted for Ronald Reagan, I am not one of those decamillionaires, or even close to a unimillionaire. I drove into the 80’s in a 1978 Chevy Malibu and drove out of the decade in a 1988 Oldsmobile Cutlass Ciera. Not much evidence of taking advantage of such things as oil-depletion allowances and leveraged buyouts here. During these years, my taxes seemed regularly to go up, my expenses never down. Two children in private universities kept me typing much faster than I thought I could. At least I never felt the need to jog, having spent so much of the decade, financially, running in place. No doubt I was not well-positioned to take advantage of the famous greed of the 80’s. But then, those of us mired in the middle class are used to finding ourselves just there, smack in the middle—and where better to feel the squeeze?
Yet I am far from certain that I, and people in similar conditions, would have fared better under a Democratic presidency. Certainly we didn’t under the effable Jimmy Carter. I have the most limited economic knowledge, but I am, at least in a general way, for reducing government spending, not out of any high principle but mainly because, in my experience, it is almost invariably inefficient spending. But whenever the Reagan administration attempted to reduce federal spending, out came the journalistic clichés. Two of the great cliché phrases invoked endlessly during the Reagan years were “savage cuts,” which the administration was supposed to be attempting in all realms but that of defense—and “savage cuts,” as everyone surely knows, result in (all together now) “chilling effects.” When, for example, the Reagan administration was confronted with the awkward fact that a great many people were defaulting on repayment of their federal student loans, and consequently made student aid of this kind more difficult to obtain and to evade repaying, a chorus of journalistic shock went up over these “savage cuts,” which could only have (how did you guess?) “chilling effects.” You would think that once, just once, the writers on the New York Times would discover a chilling cut that had a savage effect, but if they ever did I missed it.
The paramount economic cliché about the Reagan years is that under Reagan’s economic policies only the rich got richer. Kevin Phillips, who is usually advertised as writing from a conservative standpoint, recently reiterated this cliché in the New York Times Magazine, writing: “It was the truly wealthy, more than anyone else, who flourished under Reagan.” Phillips adds: “Meanwhile, everyone knew there was pain in society’s lower ranks, from laid-off steelworkers to foreclosed farmers. A disproportionate number of female, black, Hispanic, and young Americans lost ground in the 1980’s, despite the progress of upscale minorities in each category.” Now, doing a turn on this cliché by reversing it, Forbes went to a putatively liberal Harvard economist named Lawrence Lindsey, who reports that, in his view, owing to the Reagan tax cuts, the budget will be balanced by the middle of the 1990’s and the American economy will be generating enormous surpluses. Even more tax cuts are required, according to Professor Lindsey, “to preserve the incentive and to avoid giving the politicians money for pork.” So there you have it: a conservative announces that everything bad you have ever heard about the Reagan economic policies is true, and a liberal reports something like the reverse. What’s a simple-minded fellow in a 1988 Olds Cutlass Ciera to think?
From behind the tilt wheel and tinted glass of that Olds, it begins to look as if economists of all shades of political opinion are not that smart. “If You’re So Smart, How Come You Ain’t Rich?” is the subtitle of an essay by Donald N. McCloskey (“The Limits of Expertise,” American Scholar, Summer 1988) that discusses, in an amusing and penetrating way, the extreme limits of the predictive powers of economists. (Some economists, it is true, are rich, but generally from giving advice and not from taking their own.) Along with not being very good at predicting the economy, economists are not especially good at describing it, either, which makes it very difficult to know what is going on at any moment. On this morning’s radio news, for example, I learned two perfectly and typically contradictory economic facts: that the American economy continues to show expansion and that the consumer confidence index is well down. Go, as they say, figure.
Again from my windshield’s-eye view, it strikes me that Reagan’s economic policies were successful at reducing inflation and unemployment, neither a small thing. As for these same policies encouraging unprecedented greed, I suspect that there is no act of greed without plenty of precedent. Moreover, I was interested to read in George Russell’s recent “Trashing Wall Street” (COMMENTARY, July 1990) that it was a Democratic Congress that paved the way for the great junk-bond fiasco by passing legislation freeing S & L institutions to invest in high-risk enterprises. Reagan’s economic policy is blamed, too, for the creation of the homeless. AIDS is also often blamed on what are felt to be Reagan’s rotten economic priorities. Dubious stuff, but then, as they used to say in the 60’s, if you’re not part of the solution, you’re part of the problem. All such accusations, though, it seems to me, lend to policy a greater prestige, and reality, than it perhaps merits.
Permit me here to bring in my own rather drab experience, this with government policy over a good portion of the past decade in the arts. In 1984 I was nominated by the Reagan White House to be a member, for a six-year term, of something called the National Council of the National Endowment for the Arts. The Council, which meets quarterly, is an advisory group to the Endowment and its chairman, and for the most part discusses past and present and plans for future policy. Everyone currently on the Council is either a Reagan or Bush appointee, which, one would think, might make them conservative in outlook. Not quite so. A clichémeister such as Robert Brustein might write, in the New Republic, of “Reagan’s thin-lipped dismissal of the arts” (as opposed, one wonders, to whose fat-lipped acceptance?), but in fact the arts budget grew consistently during the Reagan years, and the spirit of the Council, as it was when I first joined it, remains preponderantly liberal. Most members have never met a work of art they didn’t like, and such policy as the Council creates, such public positions as it takes, tend to be very far from anything anyone would be likely to describe as conservative. Such, then, is the policy-enforcing power of modern Presidents.
If enforcing policy in one’s own country is a tough enough job in a democracy, obviously it is even more difficult to estimate the effect of one’s foreign policy on other nations. And yet there is small doubt that United States foreign policy under Ronald Reagan—and insofar as this same policy was continued under George Bush—had a good deal to do with the astonishing collapse of several Eastern European Communist regimes. A less intransigent-sounding foreign policy than Reagan’s would, I suspect, have delayed this collapse, might even have postponed it indefinitely. In Central America, similarly, a less hostile policy toward the Sandinistas on our part would likely have put off for years the free election that removed the Sandinista government from power at the behest of the Nicaraguan people.
Liberal journalists used to get a hearty laugh out of what they felt was Ronald Reagan’s archaic, not to say troglodytic, anti-Communism, making many a smug little joke about his view of the Soviet Union as the “Evil Empire.” But as everyone now knows—and ought to have known all along—the people who were forced to live under Soviet rule viewed it essentially as Ronald Reagan did. Reagan’s foreign policy, by keeping up a high degree of pressure on the Soviet Union and other Communist regimes, and by its implicit encouragement of the hopes of adversaries forced to live under these regimes, contributed greatly to the devastation of this, yes, “Evil Empire.” There are many other reasons, in my less than perfectly disinterested view, for feeling good about having voted for Ronald Reagan, but this is surely chief among them.
John B. Judis:
As a political era, the 80’s began in the 70’s, if not earlier—as a response to the antiwar and civil-rights movements, the slowdown of the American economy, and the decline of American power overseas. The reaction reached a climax in Ronald Reagan’s first term and then began to dissipate, though not to disappear. We are still in the 80’s.
The reaction was fundamentally conservative, rooted in a kind of imperial nostalgia—a sense that the United States, once the unchallenged leader of world capitalism, was in decline, and that to halt, if not reverse, that decline, it was necessary to go back to what had worked in the past, ranging from free-market individualism and the virtue of the Founding Fathers to cold-war preparedness and the sexual mores of Muncie.
In politics, the reaction first emerged in Barry Goldwater’s and George Wallace’s 1964 presidential campaigns. Its political success—meaning its ability to forge a majority coalition of Republicans and erstwhile Democrats—depended on two factors: first, the growing racial backlash among white Southern and urban working-class Democrats; second, the transformation by conservative politicians and intellectuals of 50’s-style radical rightism into a majority, governing philosophy.
By the last two years of the Carter administration, the reaction already dominated politics and policy. Just as liberals took charge of the Nixon administration’s domestic agenda, conservatives called the tune in the Carter administration’s last years. Carter’s tax reform became a capital-gains cut; labor-law reform was defeated; and the airlines were deregulated. Even on foreign policy, Carter, by 1980, had capitulated to his conservative critics: he shelved SALT II, and his projected military budget increases from FY 1981 through 1985 were comparable to those achieved during the Reagan administration.
Reagan carried these conservative initiatives forward and added some of his own—appointing a judiciary opposed to the post-1964 civil-rights rulings and removing restraints on S & L’s and mergers—but his primary innovations were ideological rather than substantive. Where Carter had come to stand for American defeat, Reagan stood for the possibility of victory—whether over the Soviet Union or scarce oil supplies. In years to come, Reagan’s first term and the 1984 “morning in America” election will be seen as a kind of ideological Indian summer—the final time in which American goodness and purity would be highlighted against the shadow of the “Evil Empire” to the east. Indeed, by 1986, Reagan and Secretary of State George Shultz had sharply turned course on foreign policy and were steering America out of the cold war and—to that extent—out of the 80’s.
How one evaluates this era of conservative reaction depends on how one defines its objectives. For instance, if one sees the U.S. in the late 70’s as having fallen woefully behind the Soviet Union militarily and strategically, if one sees Western Europe as having been on the verge of “Finlandization,” and if one’s primary goal was to remove this looming Soviet threat, then the program of conservative reaction was an enormous success, and the 80’s a time of American triumph.
On the other hand, if one believes that in 1979 the window of strategic vulnerability was a canard and that the Soviet Union posed little threat to the U.S. and Western Europe, then, from a standpoint of American national security, much of the military buildup of the early 80’s was wasted. What it may have accomplished was to accelerate the collapse of Soviet Communism and of the Soviet empire in Eastern Europe and to speed the Soviet withdrawal from Afghanistan. If this is so, then the buildup did benefit peoples under the Soviet thumb. This is a signal achievement, but its direct benefit to Americans is highly debatable.
This is particularly true if one’s measure of success is economic and social. By this measure, the period from 1971 through the present has been one of decline and disintegration, particularly in the 80’s. From 1980 to 1988, the U.S. share of world exports in automobiles dropped 46 percent, computers 36 percent, microelectronics 26 percent, and machine tools 17 percent. In 1980, the U.S. controlled 60 percent of the world market in semiconductors; by 1988, it controlled 38 percent and the Japanese 50 percent. As the political scientist Chalmers Johnson has remarked, the Soviet Union lost the cold war, but Japan won it.
Decline in these industries means that the American standard of living will sink as American workers are stuck with the less productive and lower-paying jobs within the international division of labor. If the decline continues, America will have the same relation economically to Japan and Germany that Britain or even Brazil has at present to the U.S. And the U.S. will become like Manhattan writ large, divided between a wealthy, parasitic banking class, beholden to foreign capital, and an increasingly unemployed and unemployable working class.
This decline would have occurred no matter who won the elections in 1980 and 1984. It is based less on immediate policies than on broader structural factors. As microelectronic-era manufacturing has required greater capital and long-term planning, the U.S. has been at a disadvantage because of the historic antagonism between industry and government. Japan and Western Europe have used government-industry consortia to pull even with or ahead of the U.S. in chip manufacturing, high-speed railroads, and now even commercial aircraft.
As manufacturing and services have required greater education and teamwork, U.S. industry has also been hampered by frayed relations between capital and labor and by a deteriorating educational system. In the 80’s, for instance, U.S. automakers discovered belatedly that the secret of Japan’s auto-making success was not its robots, but its greater reliance on team production and worker innovation.
In addition, the U.S., as the leader of world capitalism after World War II, incurred military and economic obligations that forced it to divert scarce fiscal and scientific resources to the military and to adhere to the canons of free trade even while other countries ignored them. Japan’s meteoric rise from 1965 to 1971 was largely due to demand created by the Vietnam war and to its protectionist trade and industrial strategies.
Thus, considerable damage had already been done by the late 70’s, when conservative policies began to prevail, but there is no question that these policies—the hostility toward government, the attitude toward labor exemplified by the reaction to the air-traffic-controllers’ strike, the neglect of the human and physical infrastructure, the increase in military spending, and the refusal to protect vital American industries—reinforced these structural causes of economic decline. In his book Trading Places, former Reagan-administration Commerce Department official Clyde V. Prestowitz, Jr. offers eloquent testimony to how the Reagan administration’s blind adherence to free trade allowed the U.S. semiconductor industry—perhaps the most important industry of the 21st century—to be gutted by the Japanese “dumping” chips at below cost in the U.S. market.
In addition, the Reagan administration contributed to an ongoing fiscal crisis that has severely limited government’s ability to boost economic growth. On one side of the ledger, the administration not only increased military spending, but through its reckless deregulatory policies, laid the foundation for the multibillion-dollar S & L bailout; on the other side, its crusade against government has encouraged opposition—to the point of paranoia—to any tax increases. Even if the Bush administration wanted to undertake a significant public-works or educational program, it would have difficulty doing so.
During the 80’s, the only bright spot in economic policy came because either massive political pressure or the imperatives of the cold war overrode the government’s commitment to laissez-faire economics. This occurred during the Carter administration’s bailout of Chrysler, which saved the company and made money for the government, and the Reagan administration’s belated funding of Sematech, a semiconductor consortium in Austin, Texas. But as the cold war has ended, conservatives in the Bush administration have abandoned any effort to protect and foster key American industries. In economic policy, the Bush administration is probably more conservative than the Reagan administration. (The Bush administration’s policy-makers, looking at Eastern Europe, even interpret the collapse of totalitarian planning as a victory for Friedmanite economics—a mistake that the Western Europeans and the Japanese are unlikely to make.)
During this era, liberal Democrats have not presented a viable alternative to conservative reaction, but have in effect either justified it or been part of it. In the 70’s, for instance, liberal jurists, backed by civil-rights organizations, virtually invited a white backlash by pressing for busing as a means of school integration. Liberal economic programs, like the Humphrey-Hawkins Full Employment Bill, conceived of the state as a gigantic version of the post office, dispensing jobs to the unemployed. With few exceptions, liberal policy on trade, foreign investment, and multinationals was the same as conservative policy. In 1984 and 1988, for instance, there were no significant differences between the presidential candidates on these issues.
As the U.S. enters the 90’s, it is necessary to move beyond both the liberalism of the 60’s and the conservatism of the 80’s. In economic policy, it will be important to recognize that the U.S. is no longer the unchallenged leader of world capitalism and, in important respects, has fallen behind both the Japanese and Germans. As the U.S. did in the 19th century and as the Japanese did after World War II, America will have to use the power of government to protect and nourish key industries—not so much to be number one, but simply to be part of the action in the most advanced industrial sectors. If the U.S. doesn’t do this—if it allows its economic future to be dictated by financial speculators or foreign lobbyists—it will continue to decline.
In foreign policy, the U.S. must begin reconceptualizing its role in the world. Crucial to this, of course, is the recognition that economics has once more become primary—not only in relation to former adversaries, but also in relation to Latin American countries whose debt to U.S. banks has imperiled our trade balance. In the 80’s, America’s share of Latin American markets increased but its total exports dropped precipitously. If American exports to Latin America during the 80’s had increased at the same rate as they had increased for the prior three decades, the U.S. would not have had a trade deficit.
The U.S. will also have to recognize that America’s alliances will shift—that it may well become in America’s interest to strengthen rather than weaken the Soviet Union as a counterweight internationally to Germany and Japan and to promote stability through aid in those regions where traditional ethnic rivalries, if allowed to fester, could return the world to the situation it faced in 1914. The Bush administration, to its credit, has moved in this direction, sometimes over opposition from both liberal Democrats and conservative Republicans.
For the purpose of this symposium, however, the most important point is that if the program of conservative reaction did confer benefits upon the U.S. and the world, these have now been entirely exhausted. There is no longer any reason whatsoever for seeking the destruction of the Soviet Union. On the contrary, the U.S. now has a distinct interest in preventing the Soviet Union from descending into chaos. Nor is there any reason to continue deregulating business and finance and defunding American inner cities. The question for the next decades will not be how to get government off people’s backs, but how to use it so that Americans can once more stand straight.
The 80’s, it is true, did indeed end with a bang of anti-Americanism even louder than the usual thumping of the American intelligentsia. To be fair, however, New York City was its epicenter, and those who live not quite so close to ground zero were perhaps less perturbed. New York is the headquarters for only one kind of anti-Americanism, which blames the system. There is another traditional critique of the country, which blames individuals. The meaning of redemption in the two cases is radically different, for one is Marxist, the other Christian. Both partake of a persistent theme of our history—the juxtaposition of the country’s noble goals and our inherent personal unworthiness. And there is much more in our cultural and religious tradition which warns of the pitfalls of too much wealth, power, and glory.
One cannot blame the Left for its sputtering and near-hysteria at the end of the 80’s. It had been a tough decade for the Left, its worst ever. At home and abroad, in economics and politics, in strategy and diplomacy, the Left had been dispersed and wholly discredited. And there is worse to come. The Soviet government’s embrace of the Right’s traditional reading of Soviet history and practice was bad enough. Once the archives of the former Communist satellites begin to yield their assorted facts, even the worst of “right-wing anti-Communist paranoia” will come to appear as sweet naiveté.
Though the Left has thus been routed, it still retains its hegemony in matters cultural, so it is natural to expect a redoubling of its efforts there. And ironically—or better yet, dialectically—it is all possible because 80’s capitalist intrigues brought about the further embedding of the leftist view of things in the great telecommunications-amplification machinery in New York, and in the now relatively few publications/communications/entertainment combines that really matter. Growing economic concentration and monopolization in this sector—the result of 80’s “greed,” junk bonds, insider trading, and whatnot—have placed the Left in firm control, ideologically speaking, of these enormous combines.
The Left also benefited from the good fortune that Reaganesque “greed” produced elsewhere. The universities, the foundations, and other places which generate the ideas that trickle down, or up, into the plebeian culture, were all enriched by the Reagan, now Bush, bull market. It increased endowments and incomes enormously. The Left is now camped out in its version of Bohemian Grove, not quite clear about how or why it has become unimaginably rich, but knowing only that it deserves to be, and is thus all the more able to be spoiled, self-indulgent, and dissociated from reality in the blessed way of those who do not have to work.
Yet it is not obvious what any of this means. Even more than the rest of America, those sectors of society which sustain the Left became rich indeed. But as all the novels remind us, just because you’re rich doesn’t mean you’re happy. Is this An American Tragedy? As the Left’s real income and power seemed to grow, its psychic income can only have declined. We may have one-party control of our major media and of our cultural, intellectual, and academic life—but so did East Germany. And, on the one hand, while our one party—let us for convenience call it the Anti-America party (or AAP), because this has nothing to do with being a mere Republican or Democrat—seemed to be tightening its control intellectually, it could hardly be pleased by its problems politically. Was anybody listening?
The Anti-America party intervened visibly and vociferously in all three presidential campaigns in the 80’s, and in all three it failed. The 1988 fiasco and its aftermath were especially galling. The AAP had done everything it could for Michael Dukakis; it had pulled out all the stops. Walter Mondale had had the good grace to disappear, but Dukakis had turned out to be not even an Adlai Stevenson who could be mythologically inflated in defeat. Instead, Dukakis shrank to the point where not even those who most ardently promoted him felt much other than embarrassment. Meanwhile, the Bush presidency itself supplies nonstop frustration for the AAP which worked so energetically, and nastily, to bring about his defeat. First, it turns out that eight years of anti-Reagan indoctrination did not prevent what was, in effect, a vote for a third term for the “ex-actor.” As for Bush himself, he has become hugely popular, principally because times are good—better, in fact, than they have ever been, and everybody knows it—but also because Americans really do prefer old-fashioned Georges and Barbaras to newfangled Mikes and Kittys. Again, it turns out that a decade’s attack on the traditional family, and the traditional male role and all that, merely reminds people of how desperately they want traditional families and traditional men.
Still and all, this is only the minor part of the Left’s discomfort. Our homegrown Anti-America party has been stripped of its global pretensions and connections. It has turned out to be the American Century after all. The present ascendancy of the United States—strategic, economic, and ideological—is really quite breathtaking. No single country—not even Britain at its height—has ever held an equivalent position. We who have been taught to be magnanimous and humble in times of triumph can therefore be grateful that the noisy petulance of the American Left has drowned out any unseemly gloating about America’s successes.
What can explain the dichotomy between the AAP’s seeming successes and its real failures, between our familiarity with every detail of its thinking about every conceivable subject, and the triumph of things of which it has tried to keep us ignorant? The explanation might be as simple as the propensity of almost all people to sort through the junk and reject descriptions of reality which diverge radically from reality as they know it. In particular, the hateful anti-Americanism which suffuses public discourse is rejected by Americans to the degree that they know something about the subject under discussion. Besides, even citizens of a free society have learned the techniques of passive resistance to attempts at thought control. Consider, for example, what is permissible in public discourse about some of our domestic problems, what people really think about those same problems, and how they really discuss them among themselves. Do they flee Park Avenue apartments because they fear marauding bands of students from the Dalton School? In this respect, there is a certain harmonic resonance between the concepts of “workers’ paradise” and “gorgeous mosaic,” for they are joined by the propensity of sane people to flee from the true nature of both. People have learned from Lenin’s famous question “who/whom?”—they are seldom if ever confused about who is doing what to them.
One likes to think that this is nature’s way of mitigating the seeming cultural hegemony of the Left. And one also likes to think that those loci of the Left’s power not normally answerable either electorally or economically—the university professoriate, the federal judiciary, the civil service—will also be impressed, even though, formally speaking, they are immune to the normal disciplines imposed by our society. Nor will the kind of anti-Americanism propounded by our intellectual classes persuade many people in a world which has experienced the political and philosophical revolutions of the 80’s. Indeed, this mode of American academic and intellectual and journalistic discourse has entered the genre of self-parody, dotty and idiosyncratic when it is not just plain loony. More and more, people will react to it as they would react if North Korea’s Kim Il Sung were to resume his old practice of buying full-page advertisements in the New York Times.
Then, too, there is the 80’s transformation of the communications industry into a relative handful of enormous commercial organizations. True, the people who actually own or run these businesses dread denunciation for philistinism or illiberalism by their hired help, and do not much bother these employees when business is good. But this is still capitalism, and when television networks continue to lose viewers, when film investors start to lose money, and when the returns from publishing are not what they need to be, then all the old bets may be off. Time, for example, has lost hundreds of thousands of readers. So far, at least, the increasingly bizarre ideological bent of the magazine’s editorial content has not been identified, in public anyway, as a possible culprit, but Time-Warner, Inc. may soon be driven to try anything, even normality.
Yet one cannot be matter of fact, or overly analytical, or in any way cavalier about the enormous damage inflicted by the application of Left doctrines, or about the enormous effort required to repair the damage. It is easy enough to hope that the damage will be repaired sooner or later, but that phrase can encompass a long, painful, and frustrating period for those who have paid the real costs. “Sooner or later,” for example, we will follow the lead of Leipzig and do away with the East German-like social and economic system which, in the name of progress and justice, the Left has managed to foist on our inner cities. This system, like socialism in general, contrives to make people poorer than they once were and then to keep them that way, all the while suppressing their capacities for normal civic, social, and cultural life.
The contribution of the 80’s, then, is to make the repairs seem possible. For unlike many earlier decades which we remember as times of muddle, the 80’s were the decade of clarity. As Kenneth Minogue of the London School of Economics has pointed out, there has never been a comparable test case in social, economic, and political life which was so plainly decided in favor of one side—the side of democracy and capitalism, our side. The decade, moreover, saw more freedom, more opportunity, more hope, and more prosperity for more people than any other ten-year period in the history of the world. That much of this was inspired and presided over by the United States validates our civic creed and vindicates the efforts of our citizens. It was a decade filled with accomplishments in this country and in the world on a scale so great that the mere memory of it will remind people of what can be achieved.
Meanwhile, the 90’s have begun and the new decade already wants to know what the 80’s have done for it lately. In praising the clarifying power of the 80’s, however, one need not lose sight of other instructive eras. Winston Churchill, we remember, wrote a six-volume history of World War II. He inscribed a theme in each one; the last, Triumph and Tragedy, was written, he said, to recount how the great democracies triumphed and were thereby free to resume the follies which had nearly cost them their lives. It is always possible that the 90’s will be a time for the resumption of such folly. But no one will be able to blame Ronald Reagan for that.
Have the ideas and policies prevailing in the United States in the 80’s triumphed worldwide?
The coming of liberal democracy to Eastern Europe and the unraveling of Communism in the Soviet Union vindicate enduring principles of the West—those of open societies, competitive economies, and democratic political life with guarantees of individual liberty. It is wishful thinking, or at least premature, to talk of those ideas triumphing worldwide. Communism remains entrenched in East Asia; Islamic fundamentalism convulses nations from North Africa through South Asia; regimes in Africa and Latin America continue to be wracked by intractable social problems and political instability. The ultimate political destiny of the Soviet Union itself remains dark and uncertain.
But if it is an error to take the past year’s progress for a consolidated victory and the Soviet bloc for the world, it is worse to confuse the common traditions of the West with particular policies of the Reagan administration. Liberals and conservatives alike have reason to celebrate the advance of principles that unite us. But the policies of the Reagan era that divide us have had no similar triumph, not among our allies and not at home.
I have no idea whether the Reagan administration’s economic and social policies “stimulated greed on Wall Street” or “encouraged a general mood of selfishness.” Greed and selfishness have never been in short supply on Wall Street or anywhere else. But the policies of the last decade have unquestionably left us with a colossal financial burden. The S & L fiasco is perhaps the most graphic illustration of the national costs of misconceived deregulation. The 1981 tax legislation helped to stimulate a wave of mergers and acquisitions that converted equity to debt on a massive scale and left many companies excessively leveraged and vulnerable to collapse, as the fall of the house of Drexel itself illustrated. Federal deficits at previously unimaginable levels added more to the national debt than in all of our previous history. Interest expenses now represent the third biggest item in the federal budget. As interest costs and the defense budget absorbed an increased share of federal spending, productive public investment—the share of government spending that goes to infrastructure and other productivity-improving purposes—declined to the lowest levels in four decades. Supply-side policies were supposed to increase the private savings rate; they failed to do so. America as a whole consumed 3 percent of GNP a year more than it produced; we made up the difference by borrowing from foreigners, and we shall be paying them back for years. All of this will be part of the long financial hangover from the 80’s.
So, a costly decade, yes; a disastrous one, no. Relative to our Gross National Product, the federal debt is still smaller than it was after World War II. And while the proportion of that debt owed to foreigners is higher, the burden is manageable. So, too, are the staggering costs of the S & L bailout. But to say burdens are manageable is not to say we should celebrate or continue the policies that brought them upon us. It was not these policies that gave the United States, much less the West as a whole, a victory over Soviet Communism. Soviet power is collapsing because of the deep and endemic problems of Communism. The liberal democracies of the West are now gaining ground because they have discovered the means of reconciling initiative and innovation with political and economic stability. The deep strength of our system is forgiving: it permits us to make errors in policy, and to recover. But the errors made by the United States have a cost to our power and prosperity. Relative to Western Europe and Japan, the United States has slipped—because we have been unwise, and because they have graduated from the junior position in the alliance that they accepted for nearly a half century after World War II. Only in that restricted, relative sense is it possible to speak of American “decline.”
In a larger sense, the United States faces extraordinary opportunities. Advances in science and technology are changing the basic relations of time and space in the economy and promoting a global expansion that should enable us to continue to grow and to cope with the problems of the environment, an aging society, and the persistence of hard-core poverty. The end of the cold war should enable us to shift resources, such as scientific talent, from the military to more economically productive uses. Nothing, of course, guarantees that we will succeed in capitalizing on those opportunities, but fate has not decreed that we go the way of the British empire.
Internally the picture is highly uneven, socially and regionally. Whether you think the 80’s were, from an economic standpoint, a good decade or a bad one depends exactly on that—your economic standpoint. For roughly the top third, the 80’s were exceptionally good. The value of financial assets soared. Between 1980 and 1988 America’s upper 20 percent of households increased their share of total income from 41.6 percent to 44.0 percent, while the share going to the bottom fifth fell from 5.1 percent to 4.6 percent. To take an even more stark contrast—this for the years 1977 to 1987—average incomes for the top 1 percent rose from $174,000 to $304,000 (up 74 percent), while average incomes for the bottom tenth dropped from $3,528 to $3,157 (down 10.5 percent). Forbes’s 1990 survey of 800 top chief executives found them making an average of $1.4 million, more than double the $620,000 they made in 1985, even though profits had increased only 40 percent over the same period. Average wages for production workers did not increase in real terms at all between 1980 and 1988. By the end of the decade, according to the economist Frank Levy, the median incomes (in 1989 dollars) of male high-school graduates, ages 25 to 34, had actually fallen from $23,000 to $20,000. One in four children under age six in America is growing up in a household with an income below the poverty line. Lower-income Americans have not just experienced a loss relative to the rich; their real level of living has declined. The homeless are but the most visible sign of that deterioration.
Increasing inequality has diverse sources. Since pre-tax incomes became more unequal during the 80’s, tax policy cannot be the entire explanation. While the global economy offers new opportunities for those with skills and resources, it puts America’s less skilled into direct competition with low-wage workers in poor countries. Economic changes unfriendly to less affluent Americans, however, did not appear for the first time in the 80’s. In other periods, national policy sought successfully to reduce the gap between rich and poor. The difference in the last decade is that national policy has widened the gap.
Poverty is only one aspect of the continuing failures of social policy. The system of public schools that once gave us one of the highest educational levels in the world is now in deep trouble. Even apart from the poor, our children perform nowhere near the level of children in Japan and Western Europe. America’s health-care system is by far the most expensive in the world, both absolutely and relative to our national income—yet some 37 million Americans, the majority of them members of families with a working adult, have no health insurance; and on every major indicator of health, we lag among the advanced societies. By the same comparison, we are beset by higher levels of drug use and violence and have the highest rate of imprisonment, with over a million Americans behind bars—the majority of them young men who ought to be contributing to rather than subtracting from America’s wealth. The “prevailing ideas and policies” of the United States in the 80’s show no sign of remedying these problems and are not models the rest of the world is driven to adopt.
In the great struggle between capitalism and socialism, socialism has lost. But among the capitalist countries, the variations in the design of institutions and policies are considerable, and the Reagan-era, laissez-faire model is by no means triumphing over the diverse alternatives. Highly interventionist governments in East Asia have had the highest rates of economic growth in recent years. The Western Europeans are committed to more comprehensive social policies and higher levels of public expenditure than is the United States; European unification is reinforcing that pattern. Throughout the world, environmental concerns are a reminder of the limits of the market.
In the United States, the two principal elements of the Reagan agenda of 1980—the drive to deregulate markets, privatize services, and cut back government, on the one hand, and to build up America’s military posture on the other—are now played out. In some areas, such as telecommunications and trucking, deregulation reflected a political consensus, gotten under way before 1980, and is now well established. In others, such as the environment, the Reagan initiatives were divisive and quickly cut short. And in still others—finance, cable television, airlines—the results have been unimpressive, in some respects dismal. As a politically popular movement, deregulation is finished: the S & L scandal is the coup de grâce.
So, too, with cutbacks in government spending. In fact, overall government spending did not decline during the 80’s because of higher defense budgets and interest costs. The cutbacks came entirely in discretionary social expenditures (that is, exclusive of Social Security), down from 9.7 to 7.3 percent of GNP (a cut of one-fourth). With the end of the cold war, that change may partially be reversed. California voters’ recent approval of higher taxes for public infrastructure suggests that the acute taxophobia of the 80’s is receding. President Bush’s retreat on taxes may also be a signal.
Did the United States need to shift spending from domestic programs to defense as much as it did? Believers will insist higher defense budgets were essential to bringing about the breakdown of Soviet Communism. But it is a hard case to make, given the severity and systemic origins of the Soviets’ problems. Some things are now clear. First, the CIA estimates of the Soviet economy that backed up our own high defense spending were grossly exaggerated; the Soviets were in much worse shape than conservatives, ironically, could admit. Second, the long-term contracts for weapons procurement made during the 80’s hang over us like the S & L bailout. They are now costly to terminate even when the rationale behind some weapons systems is disappearing.
I do not believe, on the other hand, that there is any particular ill legacy to Reagan’s foreign policy. The world has changed so radically in two years that the question of American intervention abroad takes on an entirely different meaning today. The premises of 1980—that the United States needed to take a harder line against Soviet expansionism, deploying new weapons in Europe and opposing Soviet-supported regimes and movements elsewhere—lose their force in a world where the Europeans can take care of themselves, thank you, and many local and regional conflicts in the Third World are losing their former global significance. Confronted by an unfriendly dictator in some misbegotten Third World country, we ought to evaluate our interests in the cool recognition that it probably will not make much difference to us whether he is a white zebra with black stripes or a black zebra with white stripes. Indeed, the less the United States needs to vie with the Soviets for influence in the Third World, the less opportunity will such regimes have for playing one superpower against the other to extract arms and assistance. This is one of the less apparent “peace dividends” from the end of the cold war, and it will enable us to defend our real national interests in theaters of conflict where America’s moral and political role is genuinely needed.
America’s great opportunity now is to become the country we have held ourselves out to be. What people elsewhere in the world acclaim about the United States is not the policies that produce our budget deficits and trade deficits, our leveraged buyouts and financial bailouts, our failing public schools, costly and inequitable health-insurance system, welfare programs, housing policies, or drug-enforcement techniques. They admire our basic liberties, the openness of our society, the ingenuity of our technology, the freshness and energy of our culture. These deep springs of vitality continue to offer us our best hope for the future. The trick will be to apply that American energy to the common purposes and responsibilities to each other neglected during the last decade.
One of my favorite passages from presidential speeches runs as follows:
Entertaining a due sense of our equal right to the use of our own faculties [and] to the acquisitions of our own industry . . . ; enlightened by a benign religion, professed . . . and practiced in various forms, yet all of them inculcating honesty, truth, temperance, gratitude, and the love of man; acknowledging and adoring an overruling Providence, which by all its dispensations proves that it delights in the happiness of man here and his greater happiness hereafter—with all these blessings, what more is necessary to make us a happy and a prosperous people? Still one more thing, fellow citizens—wise and frugal Government, which shall restrain men from injuring one another, shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned. This is the sum of good government, and this is necessary to close the circle of our felicities.
This passage, of course, is not from Ronald Reagan in 1981 but from Thomas Jefferson in 1801; but the reader will perhaps agree that the spirit is Ronald Reagan’s.
Indeed, Ronald Reagan spoke often of his presidency as “a new beginning,” a fresh start on the original American experiment. In the classic sense, every successful revolution (re-volvere) is a turning back to first principles, and in the American case turning back to principles “conceived in liberty and dedicated to the proposition that all men are created equal.” Reagan preached on democracy in London and Moscow, and made the principles of Jefferson and Lincoln echo far away: Jefferson’s Declaration (“We hold these truths . . .”) was used by Zdenek Janecek, a brewery worker in Prague in November 1989, to explain to his fellow workers the meaning of “the velvet revolution”; just as in Shanghai the preceding June, the students had carried aloft a white plaster replica of the Statue of Liberty. The Reagan revolution, like its predecessor, was a shot heard ’round the world.
Barely three weeks into office in 1981, President Reagan told guests at a White House dinner for Margaret Thatcher that Communism was being swept into the dustbin of history. His “simplistic” confidence made the sophisticated cringe, as did his later assertion that Communism was “the focus of evil” in the world. Described as Manichean and vociferously objected to by progressives in the West, this sentiment was solemnly assented to by those weighed down by that evil in the East and has become a commonplace in Eastern Europe today.
Indeed, the implication of the Reagan era for foreign policy is simple and inescapable. If he was right, the Left was wrong, wildly wrong, on just about every important issue of our time: on Communism, on human rights, on the arms buildup vs. the nuclear freeze, and even on the character of Mikhail Gorbachev.
I have some particular claim to knowledge of Reagan’s strategy on human rights since by his tenth day in office, the President had me flying to the Human Rights Commission (HRC) in Geneva, the first Reagan appointee to speak on any matter of foreign policy at all and “the first Reaganaut in European captivity,” as my ambassadorial colleagues were amusingly to describe me. Before my arrival, these colleagues were said to be very curious, as if a Reaganaut might show up with holster and cowboy boots.
I remember answering one question after another at my first formal dinner, at the home of a Scandinavian ambassador, in terms supplied me by the Reagan administration: that America was founded in the name of human rights, so that on the central propositions our allies could expect unwavering consistency; that my instructions were to condone no violation of human rights anywhere in the world, but to criticize all abuses by a consistent standard; that on most practical questions our voting instructions would most likely (97 times out of 100) be substantially like those of the Carter administration, since a great nation like an aircraft carrier changes course only slowly; but that our allies could expect our speeches and our actions to be more consistent with one another and more steady than the Carter administration’s. In particular, Reagan would insist on holding the USSR, its satellites, Africa, and the Islamic countries to the same standard as Latin America was held: one single standard for all. Again, Reagan intended to be judged by results, not words: that is, by whether, after his term was through, the actual human-rights situation around the world had improved.
By this measure, the revolution in human rights and in the building of democracy around the world since 1981, not only in the Philippines and in Chile and in eleven other new democracies in Latin America but also in the heart of the “Evil Empire” itself, is in line with Reagan’s policy. The world by 1990 is much closer to where Reagan wanted it to be—where Reagan pushed it to be—than anyone but he predicted. Let me dwell a little longer on the underlying strategy.
My colleague in Geneva, Richard Schifter (now Assistant Secretary of State for Human Rights), argued openly that Western military power was the only reliable guarantor of human rights in the world, and that a serious and careful use of words was absolutely essential to the protection of human rights. He and I expressed our loathing for the Orwellian usages of the HRC (particularly its inversion of such words as “racist” and “terrorist” in its relentless annual assault upon Israel). We urged our allies to unite behind a strategy to advance Western concepts of human rights, rather than surrendering defensively one step after another in their erosion. We helped to form the first majority in the (then) thirty-seven-year history of the HRC to discover and to denounce in public even one human-rights abuse behind the Iron Curtain (Poland under martial law), and we began the process of including Cuba under human-rights scrutiny. Ronald Reagan wanted to signal to the world that human rights are not protected by slogans, promises, or words, but by institutions of democracy and civil society. This was a conscious long-term strategy.
Thus, when Mikhail Gorbachev came belatedly upon the scene as General Secretary in 1985, he faced a different ideological momentum in the world and a new correlation of forces. He was not at first any champion of glasnost or perestroika. People may forget how closed, stubborn, and testy Gorbachev was in the beginning about human rights, as evidenced by the glacial resistance of the Soviet delegation in Bern, Switzerland, at the Helsinki follow-up talks of May 1986, whose best offer Reagan turned down as totally inadequate. This was at the time when the disaster at Chernobyl was being hushed up, just before glasnost altered the climate.
To whom goes the credit for the great and sudden change of direction in the USSR? That is the question Time put to Agostino Cardinal Casaroli at the close of 1989. Earlier, the prudent and reserved Vatican Secretary of State had sought to moderate Reagan’s military buildup, especially the Strategic Defense Initiative. Nonetheless, the experienced Vatican Cardinal carefully assigned credit as follows: “Ronald Reagan obligated the Soviet Union to increase its military spending to the limit of insupportability. He made everyone understand that rearmament was a dead-end street.” This is a great tribute to Ronald Reagan.
Further, it is probable that Mikhail Gorbachev could not have agreed to the process leading to the liberation of Eastern Europe in 1989—and perhaps not even to glasnost and perestroika in 1986—in the face of any other American President except Ronald Reagan. First of all, only Reagan could have engineered both the decisive military buildup and the change in ideological warfare. Second, any liberal Democrat, even a liberal Republican, who would have “trusted” Gorbachev in those early days would have been sharply attacked by the very conservatives who were willing to be led by Reagan. For his part, Reagan laid the perfect foundation for Gorbachev’s desperate attempt to save Communism and the USSR. Before he went to Geneva, President Reagan told a small group of friends that if he could have four or five hours alone with the new Soviet leader he was sure he could persuade him that he intended the Soviet Union no harm; and that a man of his age could be concerned only with the children of his children’s generation. About Gorbachev, those present may have thought this hope terribly naive; but about Ronald Reagan it was thoroughly convincing. The record shows that Gorbachev also found it so.
In his earlier profession, Reagan had mastered both an instinctive grasp of the ideological drama within which persons speak their lines and a knack for reading quickly the essence of character, and so he quickly recognized Gorbachev as “a different kind of Communist . . . certainly no Lenin, . . . a new type of Russian,” and played his own presidential lines accordingly. More than anyone else, Ronald Reagan made recent history happen as it did, and this marked his presidency with greatness. In Clare Boothe Luce’s game of epitaphs: “Ronald Reagan toppled Communism.”
Domestically, however, most of the New Class—leaders of culture, journalism, and social activism—have loathed what Ronald Reagan stands for since before he became President, while he was President, and still today. They were accustomed, in the age of television, to getting their way: they had driven out Johnson, destroyed Nixon, ridiculed Ford, humiliated Carter. They were astonished that Reagan could be elected, and it killed them that the public loved him more than them, which they blamed on the public’s gullibility. And so the years 1981-88 were the darkest in their lives. They pretend now that it was a troubling illusion, an unhappy memory, a phase already ended. Having failed to vanquish Reagan in the hearts of the people while he was in office, they hope at least to blacken his memory.
To the decade that saw the greatest outpouring of private philanthropy in history ($115 billion in 1989, 88 percent of it given by individuals) and the largest outpouring of volunteer hours (more than 40 percent of American adults giving at least five hours per week) they have fraudulently tried to attach the name of “greed.” Tens of millions of evangelicals, never politically active before, went from being “the silent majority” to taking an active part in politics. Millions rallied for “Back to Basics” programs in the schools. One of the greatest human-rights programs of all times, the pro-life movement, rose up from the grassroots with virtually no support whatever from the nation’s cultural elites—indeed against the visible contempt of elites—to protest against the excesses of Roe v. Wade in 1973. (The same elites who protested loudly against “police brutality” toward civil-rights marchers in 1964, and against “the pigs” during 1968’s youth rebellion, nodded with satisfaction as the police used unnecessary cruelty against the prayerful participants of Operation Rescue.)
Against much early scorn for Reaganomics, Ronald Reagan argued that cuts in income-tax rates were necessary for two reasons: both to hold level the percentage of national income going to government, and to generate economic growth. By arduous persuasion, brilliant tactics, and exceedingly close votes, Reagan got his tax cuts—and the country got the longest period of peacetime economic growth in American history, to the point that his critics stopped calling this success Reaganomics. When Reagan took office, the prime interest rate was 19 percent, inflation was 13.5 percent, and unemployment was over 7 percent (for a misery index of 40). When he left, the prime rate was 8, inflation less than 4, unemployment 5.2, and no one was talking about a misery index.
These successes both in foreign and in domestic affairs left the Left desperate. Thus, even as socialism was being discredited internationally (not only Communist socialism in Eastern Europe, but also Mitterrand’s socialist turn of 1981), and even as Mitterrand, González, Soares, Kinnock, and other European social democrats turned speedily to market principles, free enterprise, and lower tax rates, the American Left reflexively turned to the old Marxist tactic of class warfare, envy, and resentment.
This line of attack has become the conventional wisdom of those American journalists who are easily intimidated by the Left. Reagan’s tax cuts, they say, helped the rich and hurt the poor. Blacks especially, they say, suffered under Ronald Reagan. In America, inequality is growing, they say, and the U.S. leads all other major industrial countries in the gap dividing the upper fifth of the population from the lowest fifth.
All these assertions are false, as anyone who examines all the numbers sees; they are fashioned by selecting some numbers and neglecting others. Thus, Kevin Phillips in The Politics of Rich and Poor claims that “among major Western nations, the United States has displayed one of the sharpest cleavages between rich and poor.” Phillips supports this with exceptionally rough World Bank data (about which the Bank itself signals caution) from the years 1978-80 (before the Reagan inauguration). He forgets that the U.S. is a continent-sized, multi-ethnic nation, not a small homogeneous nation like Denmark. Measures of inequality in our most Swedish state, Minnesota, come quite close to those of Sweden; those of the U.S. as a whole are better compared with those of Canada, Australia, or Europe as a whole. Besides, unlike social-democratic nations (Kevin Phillips, social democrat?), the U.S. is not and has never been committed to income equality. “The protection of different and unequal faculties of acquiring property,” Madison wrote in Federalist #10, “is the first object of government.” The people of the U.S. define fairness in terms of opportunity, not of outcomes, as the social-democratic writer Jennifer L. Hochschild plaintively reports in What’s Fair? (1981).
Similar tricks are played with the lowest quintile in annual income. During the 80’s, the total percentage of income going to the bottom quintile declined—but the total amount of income received by that quintile rose. This last point is encouraging, because since at least 1960 the characteristics of those in the bottom quintile have been steadily changing. By now, almost two-thirds of the householders in the bottom quintile are single women, and relatively few are married men. About half of these female householders are elderly widows. For this and other reasons most householders in the bottom quintile are nowadays not in the work force. Thus, even the unprecedented economic growth of the 80’s could scarcely increase the cash income of those who are neither employed nor looking for work. And comparing the bottom quintile of 1988 with the bottom quintile of earlier decades is no longer comparing like with like.
The characteristics of householders in the top quintile have also changed. It is not so new that nearly all are married, have had at least four years of college, work full time, and are in their highest-earning years (45-64). What is new is that the highly-educated spouses of nearly all of them are also working full time for comparably high incomes—thus doubling the advantage of the top quintile over lower quintiles. As long as this phenomenon continues, so will “growing inequality.” The Census Bureau figures on income, incidentally, are pre-tax and so entirely unaffected by cuts in tax rates.
Many falsehoods have also been written about how the “Reagan tax cuts” have helped the rich at the expense of the poor. Yet Reagan’s economic growth with low inflation was not nearly so damaging to the poor as was Carter’s stagnation with high inflation. Carter’s inflation raised the poverty level from $5,815 in 1976 to $8,414 in 1980, sweeping some 4.3 million persons below the poverty line simply by eroding the value of incomes. Nothing hurts the poor like inflation; Reagan stopped that.
More important, though, the Reagan tax cuts exempted most of the officially poor from paying any federal income taxes at all. (It was not Reagan, but the Congress, that raised Social Security taxes.) By 1987, those in the upper brackets paid both the highest amounts of federal income tax ever paid and the highest proportion of all income taxes ever paid. For example, those in the top 5 percent paid 37.6 percent of all income taxes in 1979, but 43 percent in 1987. The actual taxes they paid were also significantly higher, both as a gross amount and as an average effective tax rate (22.4 percent). The top 10 percent of returns in 1987 paid 55.4 percent of all federal income taxes. Indeed, the top half of taxpayers in 1987 paid 94.1 percent of all federal income taxes. The entire bottom half paid only the other 5.9 percent.
Moreover, in certain respects, the condition of blacks in the U.S. improved significantly. In 1980, only 9 million blacks were employed; by 1988 this number had jumped to 11.4 million. The total annual income earned by all U.S. blacks rose steeply (in constant 1988 dollars) from $191 billion in 1980 to $259 billion in 1988—a sum larger than the GDP of all but ten nations in the world. The number of black families earning more than $50,000 per year jumped from 392,000 in 1982 to 936,000 in 1988. The median income of black married-couple families in 1988 climbed to $30,424. Of course, the bleak side of this picture is that the number of single-parent black families rose by 15 percent between 1980 and 1988, from 1.9 million to 2.2 million. The vast majority of single female householders were not in the work force, and a majority of black children in 1988 were born out of wedlock. Here median income was far lower; and nearly all indices of suffering and disability were higher. No wonder the measure of inequality (Gini coefficient) among blacks was far higher (.450) than among whites (.382).
Choices of family pattern are not entirely amenable to government intrusion, although some argue that federal welfare policies enable certain self-injuring behaviors. Reagan was able to set welfare thinking on a new course, but not to reverse the damaging trends.
More than these great foreign and domestic achievements, however, Reagan deserves credit for inspiring this large, diverse, and important country to keep faith with its own destiny. For him that destiny was manifest, and manifest he made it to the world—even to the Communist party of the USSR. Ronald Reagan may not have caused a revolution in America, but he did renew the original one and give it universal reach.
The two questions posed by the editors of COMMENTARY actually come down to a single one: the relation of prevailing opinion among academics, intellectuals, and the media to political realities. The answer, unfortunately for the state of social thought, is that these opinions largely represent what people want to believe rather than sustained observation and analysis. Essentially, they reflect the disorder that has invaded much of our thinking about literature, philosophy, and politics. I suspect that this period may become known as the age of confusion.
In the political chaos that has surrounded the last decade—polemically known as the Reagan years or Reaganism—it is not easy to sort out the false from the true characterizations. On the whole, however, it is patently absurd to characterize the last decade as a plunge into darkness. Despite many of its failings, the country is scarcely in a catastrophic condition domestically. As for our foreign policies in the 80’s, our standing in the world is higher than at any time since soon after World War II. And the movement toward democratization and the free market in the Soviet Union and Eastern Europe certainly has vindicated our policies and done nothing to blacken the image of our own brand of democracy.
To be sure, not everything has been idyllic, even if it has not been disastrous. But to deny that it has been disastrous does not mean we should deny the failings. We cannot ignore the wild takeovers, the widespread corruption, and the frequent disregard of the national interest. Deregulation seems to have produced the S & L disaster. More might have been done to reduce the gap between the rich and the poor, which appears to be widening; more could have been done to improve the environment; greater and more effective measures could have been taken to reduce crime and to do something about the abysmal state of education. But the confusion and the distortion arise when these failures are inflated so as to darken the entire picture and to dismiss the 80’s as the black years of Reaganism. (Perhaps to make clear my own perspective, I should say I have never been a Reaganite.) But surely Reagan was not responsible for every failure; some were just as much the responsibility of Congress (controlled by the Democrats). And many of the vulgarities and the unlimited pursuit of self-interest that we abhor are rooted in the laissez-faire system—which, to be sure, Reagan did little to control—and in the emergence of the much-publicized “me” generation and the Yuppies with radical politics. This seems to be the price we pay for our liberties and our prosperity.
As for our foreign policies, they have been more successful than our domestic ones, though they have not always been intelligent or responsible. Nor are they sufficiently long-range. For example, the Iran-contra affair, however well-intentioned it might have been, was stupidly planned and executed. Also, relations with Israel were often strained because of some almost inexplicable blindness toward the machinations of the Arabs and an unwillingness to understand the demagogy and treachery of Arafat. The misleading slogan of evenhandedness masks the specific causes of the Israeli-Arab conflict. On the other hand, the arms buildup and the stiffening attitude toward the Soviet Union under Reagan, which came under so much liberal criticism, turned out to be not only correct but successful—as can be seen in the democratization of Eastern Europe and the easing of relations with Russia. (It must be said in this connection that Bush does not seem to be following Reagan’s leads, particularly in his captious attitude to Israel and in his almost uncritical support of Gorbachev.)
Part of the negative characterization of the 80’s is a cultural matter. For a good deal of the criticism of Reagan by academics and the media was a reaction to his style and his personality. The image of the actor who could read his own lines fluently but was not at ease with the language on his own did not command much respect from the educated classes. Indeed, Reagan did not have any of the aura of an intellectual leader, of a statesman. As a matter of fact, the gap between Reagan’s reputed intellectual laziness and lowbrow tastes on the one hand, and the fact that he was able to pull the country out of its post-Vietnam sloth on the other, remains something of a mystery. Perhaps the explanation is to be found in the political genius of the country, noted early by Tocqueville. Unlike the nations of Europe, which produce statesmen, we create political leaders who, like Reagan, Eisenhower, and Truman, represent the essence of the common man elevated to a leader.
But this is only half of the equation between the political realities and the popular perceptions of them. The other half, I think, lies in the area of ideology, the ideology of liberals and the Left, who have had much to do with our thinking about the events of the last decade. And here the question is not so much whether the ideas in this quarter have been right or wrong as that they have been the source of much muddled thinking. The point therefore is not whether Left-liberals have been wrong about Reagan and his years in office; in some respects they have been right. It is, rather, that the interminable Reagan-bashing with its automatic dismissal of everything connected with the Reagan years has made it difficult to arrive at a clear understanding of the ideas and events of the 80’s. As I have suggested, in some matters, as in the new tax law, which is not what it was claimed to be, the Democrats have been as responsible as the Republicans. In other areas, such as crime, drugs, and economic inequities, the culpability that belongs to both parties and to the condition of our society as a whole has been attributed to the Reagan administration alone. On the other hand, the anti-anti-Communism of many liberals and much of the Left, which recent events have shown to be foolhardy, has generally escaped critical examination. The complete failure of socialism in the Soviet Union, for example, and the admission of terror and the slave-labor camps are themselves commentaries on those who regarded anti-Communist criticism as a species of conservatism.
The confusion scarcely lets up. We are now in the era of Bush-bashing, which is an automatic continuation of Reagan-bashing, even though on many key questions, with the notable exception of the issue of abortion and to some extent that of taxation, Bush is almost indistinguishable from liberal Democrats. On Lithuania and China, for example, there has been an ironic reversal, with the Democrats pressing for a tougher policy, and Bush leaning toward a softer one. On the other hand, extreme right-wingers, such as Patrick Buchanan, have made their own contributions to mindless policies by insisting that nothing basic has changed in the Soviet Union.
In essence, the fact that Left-liberals have made it more difficult to think seriously about political and social questions is at least as important as their mistaken views. This is particularly unfortunate now when we are entering a new, uncharted era, and when we are still fed the ideological platitudes that have served to conceal the real issues in the past. The two most important questions today are the unification of Germany and the extent of the changes in the Soviet Union under Gorbachev. And in both cases a certain amount of confusion comes from the persistence of old liberal views. In the matter of unification, fears of re-Nazification—which happen to coincide with Russian rather than American interests—are relics of liberal ideology (as well as the understandable feelings of many Jews). As for Gorbachev’s Russia, we have difficulty in assessing its power and stability to some extent because of the lingering effects of the tendency to play down Russian economic backwardness, political controls, and military strength. No doubt, glasnost and perestroika represent significant liberating changes, but so far they have been limited in scope: the Communist regime is still intact, and genuine democracy and pluralism are not in sight.
Another example of the will to cling to outworn beliefs is the response of some socialists to the changes in the Soviet Union and Eastern Europe. Many of them insist that socialism has not been repudiated and that there is no movement toward a free market. For example, Samuel Bowles, who describes himself as a leftist, says, in a recent comment in the Chronicle of Higher Education, that the death of socialism in Eastern Europe will revitalize leftism in this country. There is also the continuing obfuscation in the use of the term “socialism.” Usually it is not the abolition of the private ownership of the means of production advocated by Marx and Lenin that is meant by socialism. What is being referred to is some version of social democracy, generally a mild one, often little more than a welfare state.
These intellectual habits, it might be noted, are not limited to the area of practical politics. Much has been written criticizing the latest trends in deconstruction, feminist literary theory, black aesthetics—and the radical assault on the cultural tradition. (Helen Vendler’s recent piece in the New York Review of Books is a devastating criticism of the feminist approach to literature.) But, to my knowledge, it has not been pointed out that aside from the question of their validity, these new cultural theories actually have produced an enormous amount of intellectual confusion. For to deny the primary meaning of texts and the authenticity of the Western tradition has created a state of intellectual anarchy, in which any theory seems to have as much credibility as any other. It is difficult to say which is cause and which is effect. But the confusion in political matters is of a piece with that in cultural ones. One is reminded of Dostoevsky’s remark—through one of his characters—that if God doesn’t exist, then anything is possible. Similarly, if there is no authority informed by tradition, then anything goes.
If, as I have suggested, we are intellectually handicapped by political confusion as well as by the profusion of trendy theories, it would seem to be less profitable to argue the validity of specific positions than to clear the air of ideological commitments—of those habits of mind that are bound to twist our thinking about many political and cultural questions.
The characterization of the 80’s that introduces this symposium is taken seriously only by those who are not themselves serious. It is, not to put too fine a point on the matter, loser’s history. This is history as seen by the adversary culture, dominant among “journalists, historians, and intellectuals,” but quite unrecognizable to the unbeguiled majority. It is history as seen by those who don’t much like America, and most Americans like America just fine, even if they are sometimes bemused by the inability of the country’s cultural elites to appreciate the benefits of this almost chosen nation.
The antagonism of the adversary culture toward America’s bourgeois democratic reality is of course nothing new. Earlier traces of it can be found among losing Federalist elites in the late 18th and early 19th centuries, in the moral fastidiousness of the New England Transcendentalists, in the goo-goo restorationism of the post-Civil War era (of which the supercilious Adams brothers were the perfect expression). It has come fully into its own in the 20th century, first in the 20’s, again in the 50’s, and now in overripe maturity in the 80’s.
Two brief and partial points of concession. First, the current vogue of the adversary-culture perspective is part of the inevitable ebb and flow of political opinion. No political mood lasts forever, and it is in the nature of things that the conservatism of the 80’s should produce something of a left-wing reaction. That reaction is exacerbated by the Bush succession: had Richard Nixon rather than John Kennedy succeeded Eisenhower in 1961, the relatively subdued reaction of the cultural elites of the time against the 50’s would have been as exaggerated as is that of their counterparts against the 80’s today. Second, there are elements of truth in the critical view of the 80’s. There were, given the entrepreneurial enthusiasms of the time, excesses on Wall Street and elsewhere in the business community (though the idea that the Reagan administration somehow invented greed and selfishness is one of those stock moralisms that it takes an American liberal to believe), and there was a widening of the spread in income distribution during the period (though the vast majority of Americans improved their situation in the course of the decade and the desperate condition of the homeless and of the underclass of which they are a part owes far more to cultural and structural economic factors than to the policies of the Reagan administration).
The great reality of the 80’s, ignored almost as much on the Right as on the Left, is the ascendancy of conservatism. (The massive exception here—there are other partial ones—is the advance of feminism; its against-the-grain victories would require a separate essay properly to explore.) We have witnessed in the past decade a transformation of the vocabulary and agenda of the American political economy. Liberals (and Bourbon-restorationist conservatives) note that Americans have not fully repudiated the welfare state; they ignore the far more significant point that no one today calls for the return of the Great Society. Of course the Reagan administration did not restore laissez-faire; neither did the Harding, Coolidge, or Hoover administrations of the 20’s. (For that matter, there was never in the American experience a condition of full laissez-faire to be restored.) What matters is that not only was the once seemingly inexorable march toward an ever-greater statism retarded, the idea behind it was brought fundamentally into question.
The temptations of hyperbole in political discourse duly noted, it is still fair to say that there occurred during the 80’s a Reagan counterrevolution, and the further removed from it we become in time the more obvious in substance it will appear. I will leave to others in this symposium the making of the easy case in international affairs. Suffice it to say here that those who believe that the revolution of 1989 in Eastern Europe had little or nothing to do with the policies of the Reagan administration might as readily believe in the ministrations of the tooth fairy.
Reagan’s achievements in the domestic sphere were, if less dramatic in effect than those in foreign affairs, no less significant in fact. Here a comparison with the presidency of Franklin Roosevelt suggests itself. Thirty years ago the noted American historian Carl Degler referred to FDR’s New Deal as the Third American Revolution (after 1776 and 1861-65). By the term “revolution” he meant a number of things, but more than anything else he meant the idea that the federal government had taken ultimate responsibility for the functioning of the economy, that from that time forward it was to the public sector, not the private, that Americans would look as guarantor of economic health and security. Degler represented the conventional wisdom of the pre-Reagan postwar era, as was confirmed in President Richard Nixon’s observation—then considered quite unexceptionable—that “we are all Keynesians now.” (Today one wonders if there are any Keynesians left.)
The comparison of the Reagan 80’s with the Roosevelt 30’s can be pursued further. FDR’s critics on the Left, much like those of Reagan on the Right, complained that he had not done enough, that he had squandered his chances truly to set things right, truly to seize the moment and rearrange America’s political and economic landscape. It was only in hindsight that the significance of his changes became clear, only from Degler’s perspective a quarter-century later that observers could perceive the outlines, within the context of an extraordinarily long-lived national consensus, of a peaceful American “revolution.” It is in precisely such a context that one can speak soberly of a Reagan “counterrevolution,” of a return, mutatis mutandis, to the classical Lockean ideals of bourgeois republicanism that the New Deal and the realities of economic modernity had presumably relegated to the dustbin of history. Not only did Reagan frustrate the Left’s long-nurtured ambition to transform America’s modest welfare state into a social-democratic redistributive state, he revitalized faith in the traditional (and presumably outmoded) American national virtues: individualism, private enterprise, the work ethic, voluntarism, personal freedom and responsibility.
The magnitude of Reagan’s accomplishment becomes fully apparent only in historical perspective. Anyone who has taught American history is acutely aware of the degree to which modern American political thought has been dominated by what might be called the progressive paradigm. The paradigm is disarmingly simple in outline. It suggests that the growth of big government is the inevitable result of the development of a complex national economy. In that view—prevalent since the emergence of Populism in the 1890’s—modern industrial conditions required the development of a national governing authority equipped to manage the economy, regulate private corporations in the public interest, and provide a welfare system to meet the vagaries of modern life.
Originating in a form of economic determinism, the paradigm carried obvious implications for political ideology, not only in America but everywhere the modernization process operates. From its premises flowed implied justification for ever more fully developed manifestations of strong central government: the logic of historical development moved from laissez-faire to mild regulation to the welfare state to the planned economy to social democracy to one or another form of socialism. Within that perspective, liberal periods in the American past such as the Populist and Progressive eras and the New Deal represented necessary accommodations with the future, while conservative decades like the 20’s, the 50’s, and, by extension, the 80’s represented anachronistic and futile protests against the imperatives of history. The progressive paradigm provided, in other words, the story line for modern American development, the framework within which one could place and make sense of particular events in the nation’s history. And in that story line liberals were always right and conservatives always wrong.
The hegemony of the paradigm followed from its apparent unanswerability. The most dogmatic of libertarians, after all, would be hard-pressed to deny the paradigm altogether. It is the case that big government did, to some extent, flow inexorably from the processes of modern industrial development. Bigness begat bigness. The flaw in the paradigm issues from a subtle and insidious fallacy: the assumption that if, given modern conditions, some increase in government is good, even necessary, then more of that necessary good thing must be even better. But the conclusion does not follow—not in logic, and far less in history.
Americans learned that lesson in the episode of the Great Society. The Great Society represented a logical extension of the spirit of the Depression-era New Deal—the latter’s innovations having been consolidated in modified terms during the Eisenhower 50’s—into the prosperous 60’s. But the Great Society was, by any standards, a great fiasco. It did not solve the social problems at which it threw such huge amounts of money (it made many of them worse); it was, in macroeconomic terms, fiscally debilitating; it created horrendous problems of social dependency; and it stimulated social divisions and social unrest from which the nation is still suffering. The lesson of the Great Society is inescapable: we would have been better off without it.
American liberalism has in fact never recovered from the Great Society. The Watergate debacle diverted America’s attention for a considerable period, and the diversion continued during the odd and inept Carter interlude, but the larger pattern remained clear: the progressive paradigm had been broken and the “L-word” with which it was associated had become an ideological onus from which prudent politicians carefully distanced themselves. Activist government was no longer in the public mind an unambiguous good thing; it had become at best uncertain and problematic, at worst the unwitting source of our deepest problems.
It was Ronald Reagan’s achievement to raise all this from the level of inchoate intuition to conscious perception. Perhaps, Americans came to understand, the 80’s were not an aberration at all, any more than had been the 20’s or 50’s before them. Perhaps it was the decade of the 30’s, with its unprecedented experience of economic catastrophe, that was the aberration. Perhaps a dominant free-enterprise economy, undergirded by a safety net for the small minority who could not make it on their own, made more sense of the American experience than did visions of collectivist social democracy. Perhaps—heretical thought—Reagan told the American story better than Roosevelt had. A counterrevolution, indeed—and one given ideological confirmation by the collapse of socialism and the socialist ideal everywhere except in circles of the willfully self-deceived.
Not that Reaganism has fully carried the day. It has had to deal with a problem of cultural lag. Even while the politics of the progressive paradigm came under radical reconsideration, its moral correlative remained more or less in force. Commonplace political rhetoric—rhetoric dominated, of course, by the “journalists, historians, and intellectuals” responsible for the tendentious characterization of the 80’s that introduces this symposium—continued to equate social morality with government activism on behalf of those deemed to be the least-advantaged members of society. Concepts of “compassion” and “caring” translated inescapably in that view into new laws passed, new monies spent, new bureaucracies set in place. Despite the by-now innumerable cautionary experiences of the law of (doleful) unintended consequences of social policy, it remained the case into the Bush era that an administration skeptical of activist government placed itself by definition in the moral wrong. (There is also the troubled world of social issues—abortion, quotas, gay rights, feminism, crime and punishment, the family, moral and religious values—in which conservatives have made, at best, mixed advances and which will for the indefinite future remain an arena of Kulturkampf.)
But if conservatism is not unambiguously victorious, liberalism has, for the moment and into the foreseeable future, quite definitively lost. The liberals are writing coterie history to the contrary—history that tries to make its case by changing the subject—but there is no good reason why the rest of us should indulge their fantasies.
I do not think the 80’s have by any means been a disastrous decade for America. At the beginning of that decade, the Soviet Union was a totalitarian state with a vast gulag of imprisoned souls, not to mention the oppressed people of Eastern Europe under its heel—all a far cry from what now obtains. To a significant extent the collapse of Communism had its own rhythms and causes—yet surely the military strength of the West and its relative unity under NATO account for some of that collapse. Who knows what Brezhnev and his heirs (not to mention the dreary and dreadful bureaucrats who ran the Eastern-bloc countries) might have done were the West a pushover for them politically and militarily! I think, all along, the implacable critique of Leninism and Stalinism has proven to be one of the glories of conservatism. Of course, not only naive, gullible, or wrong-headed leftists or liberals have made mistakes with respect to their judgment of Stalinist totalitarianism in its various forms. I would like to see some of our political scientists look carefully at what Jeane J. Kirkpatrick said about the possibilities for change in Communist dictatorships, and what Hannah Arendt handed down as virtually a doctrine in The Origins of Totalitarianism—the inviolability of such a political system. What would Arendt think and say were she alive about what has happened during the last few years of the 80’s?
Moreover, who is checking up on the remarks and predictions made by many leaders of the so-called nuclear-freeze movement upon the election of Ronald Reagan? I listened to Dr. Helen Caldicott speak during that time, and I heard her warning us that a nuclear war was right around the corner (and I am here toning down severely her various statements).
A factory worker in the General Motors Chevrolet plant in Framingham, Massachusetts, told me he had heard Dr. Caldicott speak on his car radio, and thought this:
She’s half-crazy, and so are some who hold her up as the wisest one around. You can hear it in her voice, not just her words—all that scary talk, all that screeching, it is! And if you disagree with them, they point fingers and shout and try to tell you they’re smart and you’re dumb, and they’re right and you’re—well, you’re not just wrong, you’re sick!
A strange silence from such people now—and no eagerness to look critically at their own past judgments and, too, the accusations they leveled at others. For example, in the early 80’s a few of us suggested to some leading nuclear-freeze activists that their rhetoric was terribly exaggerated—and that we really did not think it fair that they should go to schools, as they were doing, with a message that Reagan’s election meant that a nuclear war was “inevitable” during his presidency. The response, I fear, showed that hysteria and meanness are by no means absent in people who regard themselves as sensitive and thoughtful and well educated. We were told we had this or that “problem,” we were “denying reality,” we were “rationalizing”—yet another reminder of a peculiar kind of cussing that is to be found among some of us in the psychoanalyzed segment of the liberal intelligentsia: if you don’t agree with someone, quickly slap some psychological label on him or her, cast doubt about the person’s motives, and in general use psychiatry as a discrediting device. In any event, all that seemed a strangely distant and irrelevant controversy by the end of the 80’s.
As for our domestic scene, I applauded the emphasis in the 80’s upon “family values”—a solid home life as the mainstay of society. I also was horrified and disgusted by an increasing indifference to the needs of millions of our most vulnerable citizens who were in jeopardy. I was, too, appalled by the various scandals of this past decade, the parade of crooks and phonies and liars who have walked before us on television and in our newspapers and magazines: the Boeskys and Milkens, the Swaggarts and Bakkers, and, of course, the high federal officials who have been shown to be part of all that—nothing new in our history, granted, but a reminder that those who denounce government excess and proclaim a new morality, all too readily, on the assumption of power, can exemplify in abundance what they have condemned.
Sometimes, as I look at this country’s recent political life, I think we are shaping a system that casts doubt on any federal assistance to the ordinary working people of this country, while granting, in the form of tax advantages and credits and privileges, tariff laws, subsidies, all sorts of encouragement and support to the well-to-do and the wealthy—a sort of free enterprise for the working class and socialism for the rich. I grant that some of our welfare laws have not worked—as I well know from my work with teenage mothers, who so often need most the moral and spiritual life they badly lack. Without such a life they do, indeed, become the apathetic, depressed, “welfare-dependent” single parents their harshest critics accuse them of being. Still, we have not in the 80’s reached out strenuously and adequately to such people—tried to figure out how we can engage with them so that they can, in turn, engage with our social and economic system. There is, too, plenty of old-fashioned, out-and-out racial suspicion and hate in this country—and, at times, I worry that such a side of our continuing history has been more than heeded (even catered to) by those who have succeeded politically in this past decade.
Personally, I have felt utterly out of keeping with much that has happened politically and culturally during the 80’s. I have strongly endorsed Christopher Lasch’s brilliant critique of the “culture of narcissism,” including the role played in that “culture” by my ilk, the ubiquitous shrinks. (We’ve removed the Bible from our classrooms, and all too commonly downplayed the significance of the flag, but the school psychologists are a mainstay, it seems, and the ideology they propound—the moral and philosophical assumptions they often unwittingly affirm—seems in no danger from either our Supreme Court or our liberal critics.) I associate myself with Lasch and with Jean Bethke Elshtain, and with much that Daniel Bell has written—conservative on so-called social and cultural issues, and populist or egalitarian on economic issues: anxious to give working families and the poor the kind of economic boost our Congress gladly gives to already rich farmers and industrialists and bankers (the S & L scandal!) and defense contractors through various kinds of “special legislation”—again, tax breaks, credits, price supports, and on and on: a kind of “welfare dependency” we hear less discussed than is the case with that to be found in our urban ghettos. For some of us, then, the 80’s may well have been a decade of political loneliness—unable to stomach the cultural side of American liberalism as well as the class bias of American conservatism, and tempted, as I surely was in 1988, not even to vote in a presidential election.
The characterizations are wide of the mark. How, then, to explain why “so many people in America” accept the indictment? Well, let’s see now . . . I think it goes something like this:
Throughout the 70’s, the combination of inflation and an unindexed, “progressive” tax code moved all middle-income earners into tax brackets designed for the wealthy. This created great uneasiness across the land and led to the election of Ronald Reagan. Supply-side economics came in at about the same time. Stripped of all jargon, the new theory turned out to be timeless. Its central claim was that economic policy must be compatible with human nature. “Humans are rational,” the supply-siders said, in effect, “and they will not work very hard if unjust and excessive taxation strips them of the fruits of their labor.”
This insistence that human rationality must be acknowledged and deferred to by economic theorists may now seem unsurprising, but ten years ago the economics profession had strayed so far from all such “psychological” considerations that it seemed revolutionary. There ensued a tremendous outburst of indignation—echoes of which may still occasionally be heard on the editorial pages of the New York Times, the Nation, etc., and almost nightly on the CBS Evening News.
Economic journalists in particular led the charge against the restoration of human nature to economics. According to the cherished world view of the economics profession, there was something called “the economy” which worked hydraulically. There were income streams and savings sumps and liquidity traps—and worrisome imbalances if the levers of trade or fiscal or monetary policy weren’t properly handled. (The pressure might drop alarmingly throughout the whole system and we’d all be in a pickle.) Tax cuts? They would be inflationary and the math was there to prove it. But as long as wisely trained, well-intentioned people were at the control panels in Washington, “the economy” would perform satisfactorily enough. Human nature had nothing to do with it. It was absurd and dangerous even to suggest such a thing. (Would you amateurs mind staying out of this serious business, best left to professionals?)
The insistence that people wouldn’t be productive if deprived of the fruits of their labor was treated as though it were the kind of observation that a century earlier had been discarded as irrelevant by Alfred Marshall of Cambridge, the teacher of John Maynard Keynes. Marshall’s Principles of Economics, published in 1890, had transformed “political economy” into “economics,” and in the process philosophizing had been displaced by science, and much reactionary baggage discarded. In particular, the idea that the economic performance of a country depended on the self-interested behavior of its citizens was called into question.
Marshall believed, quaintly as it now seems, that human nature was changing rapidly, especially in the fifty years from 1840 to 1890 (corresponding to Charles Darwin’s ascendancy), and this in turn, Marshall thought, had brought about a “change in the point of view of economics,” which was beginning to pay “every year a greater attention to the pliability of human nature, and to the way in which the character of man affects and is affected by the prevalent methods of the production, distribution, and consumption of wealth.” The need for private property, then, earlier regarded as axiomatic by economists, “doubtless reaches no deeper than the qualities of human nature,” Marshall wrote. (Describing the intellectual climate of the late Victorian period, Bertrand Russell dryly noted that “everything was supposed to be evolving.”)
But now, in 1980, there came this retrograde development. Supply-side amateurs were insisting that it was back to square one for human nature! The reintroduction of the idea of incentives into economics threatened to overturn whole libraries and faculties, institutions and schools of thought—a carefully nurtured way of looking at the economic universe which excluded the individual almost completely (except as a consumer of goods). So Arthur Laffer was laughed at and George Gilder ridiculed, but that damnable Ronald Reagan seemed to be putting their ideas into practice, more or less; and in England the pushy Margaret Thatcher, the grocer’s daughter who became Prime Minister, was promoting the same silly ideas. How lacking in idealism they were, these parvenu advocates of the nouveaux riches! Was there to be no new society built? (No transformation of human nature after all?) No room for social justice? Were we to be thrown back on the tired old nostrums of private property, selfishness, and “greed”? Quite a setback for progress and civilization.
We all know what happened next. The detested ideas were put into effect (in 1982) and the U.S. economy grew rapidly; it has not stopped growing since. There was likewise a great improvement of the British economy in the 80’s. Furthermore, by the late 80’s it was no longer plausible to pretend (as the CIA had been pretending) that the socialist economies of Eastern Europe were chugging along quite nicely in their own quirky way. (The 1989 edition of The Statistical Abstract of the United States claims that the GNP per capita of East Germany is higher than that of West Germany!) Then, in the fall of 1989, the Communist governments of these Eastern-bloc countries gave up without a struggle. Today the only question is: how do you build capitalism on socialist ruins?
My own belief is that the final failure of socialist economics has been quite distressing for many of the same people whose daily avocation has been accusing Americans of greed and selfishness. In 1982, the left-wing economist Robert Heilbroner was candid enough to admit that
the collapse of the vision of socialism is one of the great intellectual traumas of the West. . . . As inefficiencies and indecencies have become evident in the Soviet Union, Cuba, China, East Germany, Poland, not to mention Yugoslavia itself, the once hallowed term “socialism” has become emptied of content. Moreover, as we look at the ideas of socialism apart from the forms it has taken in specific countries—ideas of central planning, nationalization, the “dictatorship of the proletariat”—we find the same sickening sense of vanishing ideals, empty slogans, terrible mistakes.
I’m sure that this “sickening sense” has been experienced by many intellectuals much more recently than 1982, but we hear few confessions from these people. (You only have to look at the worshipful reaction to Nelson Mandela to realize that the hunger for a socialist order is still very much alive.)
Don’t forget, the Soviet Union was supposed to create “New Soviet Man.” And that failure has been another big disappointment. The idea that human nature could be transformed came from Western Europe, and it is useful to think of the Soviet Union as having been for seventy years a passive laboratory for an evil experiment on man, suggested by Western intellectuals and carried out on unwilling subjects. As long as there was still some hope that this experiment might produce positive results, the Soviet Union remained effectively immune from Western criticism. And don’t forget that these hopes were still alive as late as the 60’s. But the experiment finally failed in the Gulag Archipelago, and today it has been effectively abandoned. The Soviet Union itself is unlikely to survive the experiment. It was a very costly and dangerous failure in applied sociology, and again, quietly traumatic for many in the West.
Only recently, it had seemed, we were well on the way to building new societies all over the world. They would be run by incorruptible, highly qualified leaders of men, beloved by their peoples, pragmatic in outlook, but driven by moral rather than merely material ambition. Julius Nyerere of Tanzania was the ideal. We would provide the cash and the condoms and the know-how where necessary (Robert McNamara and his World Bank would see to that); but when it came to soul and authenticity, by golly they would have something to teach us for a change.
Now it is becoming clear that if other countries are to attain our standard of living there is only one way they will be able to do so: they will have to forget about tribalism and adopt Western institutions. I am talking about the rule of law, secure private-property rights, the freedom of contract, government preferably limited constitutionally, and so on. (We are ourselves in danger of forgetting about, or even actively undermining, these institutions. Our leaders don’t talk about them enough, or even really understand them. We talk too exclusively about democracy—an insufficient prescription.)
Those who take pleasure in condemning America find all this very galling. They may pay lip-service to the free market but in practice they despise it and will continue to do so. It creates wealth, yes, but it also denies power to intellectuals. It reduces lawyers to service-providers—working on wills and estates. I’ll never forget Barry Bosworth, then as now with the Brookings Institution (he also worked for the Carter administration), telling me in 1976 that the trouble with free-market economists was that they were always talking themselves out of a job. True.
In the 1860’s, an English legal historian named Sir Henry Maine noted that “the movement of progressive societies has hitherto been a movement from status to contract.” Those who were formerly born to greatness were more and more being displaced by those who had succeeded through their own efforts. But even as Maine wrote, the socialist movement was gathering steam. There followed Sidney and Beatrice Webb, the Fabians (to whom Alfred Marshall was always deferential), the revolution of 1905, the Bolshevik Revolution, “New Soviet Man,” Mao Zedong, Khrushchev, Sputnik, Fidel Castro, the 60’s, “people’s democracies,” the Communist triumph in Vietnam . . . followed by Thatcher and Reagan and the 1980’s. In retrospect, what we have been through is a century-long attempt to reverse Henry Maine’s dictum: in other words, to replace contract with another kind of status—one deriving from credentials rather than birth. The 80’s saw the end of this counterrevolution of expertise, much to the annoyance of the experts.
It is rarely wise, professional historians will warn you, to judge a decade—or a government—immediately upon its passing. Not only are we too close to the events and the personalities to be objective, we simply lack the longer-term perspective to provide us with the proper comparative measure of historical “success” or “failure.” A monarch or a president regarded as mediocre by his contemporaries looks a lot better in posterity’s eyes if it turns out that he is followed by several totally incompetent successors; a decade characterized by economic recovery and increased business confidence (the 20’s, say) appears much more suspect retrospectively if it leads toward financial collapse and industrial decay (the 30’s). If COMMENTARY is still flourishing in twenty-five years’ time, it will be instructive to return to this issue, and check how valid our early rush to judgment upon the 80’s appears by then.
Moreover, in this particular case the problems of premature historical assessment are compounded by the wildly conflicting images of—and opinions about—the Reagan presidency. While some adore Reagan, others denounce him vehemently. Already one has the sense that the more unbalanced critics have forgotten just how popular and appealing the previous President was in the eyes of many of his countrymen. His sense of humor, his gallantry, his natural charm, his preference for riding on the range rather than chairing committees, all in their way complemented his determination to “stand up” to Communism, to make the United States stronger militarily, to assert basic Western values. The Soviet Union had to be dealt with from a position of strength; international nuisances like Libya had to be taught a lesson; if military measures were called for, they would be carried out, be it in Grenada or the Persian Gulf. To an American public, angry and humiliated at various setbacks from Vietnam to Iran, this was a welcome relief. It was as if a frontier town, previously terrorized by outlaws, had at last received a new sheriff. The B-movie actor of the 40’s cowboy films came into his own in the 80’s White House. And the palpable weakening of resolve of the Soviet outlaw (or “Evil Empire”) by the close of that decade, with Gorbachev virtually begging for a compromise, was ample vindication of this Reaganite policy of peace through strength. Clearly, in the view of many Americans, this had been a “good” presidency.
Yet while the Reagan presidency was restoring the American position in world affairs in the short term, it was also weakening it over the longer term. This is not to blame the post-50’s relative decline of the United States solely upon Republican mismanagement, as the Dukakis electoral campaign of 1988 tended to do. The shrinking American share of global GNP and world-manufacturing output, the erosion of its lead in high-tech industries, the failures in its public educational system, the dreadful poverty of its inner cities, the aging of its infrastructure, were long-term developments which neither Democratic nor Republican administrations had succeeded in reversing.
But Reagan’s presidency was “bad,” not simply because it tended to ignore such issues, but also—and more seriously—because it exacerbated them. Its reckless fiscal policies swiftly turned the United States from being the world’s greatest creditor-nation to being its greatest debtor-nation. Its propensity to live beyond its means increased its deficits, and worsened its international indebtedness (and its overall current accounts); by 1988, it was no longer a truly independent nation financially. Its military buildup, based upon exaggerated estimates of Soviet “power,” had siphoned off engineers, scientists, and skilled craftsmen from export-oriented industry, and given a further advantage to Japanese and European competitors. Its encouragement of a totally laissez-faire mentality had prevented any long-term industrial planning on the Japanese model. Its fondness for consumption had hurt savings rates and capital investment. Its concentration upon such symbolic issues as the pledge of allegiance in schools was accompanied by a neglect of any fundamental reforms of the educational system as a whole.
Above all, the sunny, upbeat tone of the Reagan presidency and the positive response of the American public to his “good” years in office—aspects of which were repeatedly emphasized in the Bush campaign of 1988—naturally left the country less ready and prepared to accept the “bad” years of confronting the harsher reality that lay ahead. Increasing taxes, reducing consumption, cutting entitlements, investing in science and infrastructure rather than automobiles and household goods, were going to be that much harder to achieve in future years because Reagan had succeeded in convincing a majority of the American people that all was well and that there was no need to change.
Ironically, one suspects that by early next century the greatest critics of American policies in the 80’s are not going to be the liberals (although they no doubt will still have much to complain about), but the true conservatives: that is, those who prefer fiscal rectitude to profligacy, believe that the United States should possess and protect key strategic industries, distrust laissez-faire economics, and instinctively feel that the country’s long-term prospects as a Great Power do rest upon a manufacturing and financial base that was badly eroded during the period of Reagan’s carefree, good-humored, but essentially feckless presidency. It is from that quarter that there will be the greatest resentment of this 80’s legacy of short-term charm and long-term harm.
Richard John Neuhaus:
Since I do not accept the characterization offered of the policies and ideas of the 80’s, I have only to address the second question. To ask why “so many people” do accept the characterization offered is to ask why so many people identify with the general drift of the Left. Answering that would require an extended excursus on the intellectual history of the West. The more manageable question in this context is: if there has been something like a “worldwide triumph” of American ideas and policies in the 80’s, why does the Left persist in denying the lessons to be drawn from that?
In trying to understand attitudes, whether on the Right or the Left, one should not underestimate the stupidity factor. Also, there is the fact that people value continuity. Their sense of self is at stake. They do not want to let down the side or to be accused of betraying the cause. Even the most putatively radical are traditionalists, as evident in the exhortation to “keep the faith.” In addition, the revolution of 1989 took almost everyone by surprise. People were not prepared for the dramatic demise of Communism and the vindication of democracy and market economics. These events are still very new and have not been intellectually assimilated, never mind formulated into a usable partisan line. When in doubt, people who are paid to explain the world go on saying what they said before. Since recent turns of events falsify more propositions of the Left than of the Right, the Left is the more embarrassed and ends up offering more manifestly silly explanations.
We are currently in a period of post-cold-war Newspeak that, unfortunately, may not be short-lived. Thus people in comatose socialist regimes who still subscribe to the conventional doctrines of the Left are now called conservatives. Thus we are told that the Soviets’ loss of the cold war proves that there never was a cold war, except in the fevered American imagination. Thus it is explained that the revolution of 1989 demonstrates the superiority of Communism, since they are having their revolution while there is no sign of revolution in our society. There is at present a deep, and frequently amusing, incoherence in the explanations being proffered by the Left.
That will not change any time soon. A turn toward coherence would require acknowledging that political arch-enemies (e.g., Reagan and his capitalist gang) were, at least in large part, right, and the Left was wrong. Self-examination, contrition, repentance, and amendment of life do not come easily. That is true on the Left and on the Right, but it is more true on the Left. It is more true on the Left because the secularized Left has made a deeper investment, even a religious investment, in political construals of reality.
The Right is more inclined to relativize the imperiousness of the political, and to take alarm at the slightest whiff of utopianism. The Left is more attractive to political heavy breathers who seek a perpetual high from visions of peace, justice, equality, and cosmic harmony. They seek, in sum, a world very much like that promised in the messianic age. Everyone, knowingly or not, shares that yearning, but the Left is different in that it thought it had the political and economic formulas for the realization of that hope in history. In the aftermath of 1989, a measure of sympathy is in order for people whose entire construction of reality has been rudely destroyed by the very history in which they had placed their faith. In the absence of an alternative belief system, they will, at least for a time, continue to sing the tunes they know, even if they are the tunes of the gods that failed.
Others on the Left are less ideologically driven but are extremely nervous about any talk of the “triumph” of American ideas and policies. They might, sotto voce, admit that America and the West come out of the 80’s looking pretty good. They might even admit that their domestic political opponents turned out to be right on some important scores. But any talk of triumph, they fear, distracts attention from the many problems that need to be addressed in our society. So they tell us that, while it is clear that Communism and even socialism have failed, it is by no means evident that democracy and capitalism have triumphed. In this telling, the failure of the former seems almost causeless. Or, if there are causes, they are to be explained in terms of the internal contradictions, so to speak, of Communism.
Underlying the strategic concern to keep attention focused on America’s faults and failures, there is a moral anxiety about any talk of “triumph.” Many in our elite cultures have learned well Reinhold Niebuhr’s cautions against hubris, while ignoring what he had to say about historical responsibility. Patriotism, in their view, necessarily bespeaks arrogance. Patriotism and nationalism go together, and both offend against the vaulting universalism of the liberal vision. In the Niebuhrian view, patriotism may be grounded not in arrogance but in gratitude joined to a sense of responsibility. The “so many people” mentioned by the editors have great difficulties with that. The National Council of Churches, for example, this year issued a jeremiad declaring that 1992, the 500th anniversary of the landing by Columbus, should be marked not by celebration but by repentance for the exploitation, racism, colonialism, and genocide perpetrated by Europeans in the Americas.
The jeremiad is a venerable genre on both the Right and the Left. Jeremiads from the Right tend to deplore our decline from the high standards and achievements of the culture that defines who we are. Jeremiads from the Left tend to deplore the conceit that we are anything special to begin with. It is therefore much more embarrassing for the Left when Americans find themselves admired and emulated by almost the entire world. It was all right for Lincoln to talk about this experiment as the “last best hope of earth” in the context of our bleeding for our sin of slavery, but it is intolerable that we should derive any satisfaction from the world’s agreeing with Lincoln in the present situation. In the view of the Left, invoking American ideals against America is acceptable. Thus, during the Vietnam war, we were incessantly reminded that the constitution promulgated by the gangsters in Hanoi was redolent with rhetoric lifted from Jefferson and friends. In short, in both its religious and more secularized forms, the Left typically confuses the Christian virtue of humility with self-hatred. More precisely, what is flaunted as humility is hatred of those who dissent from the Left’s construal of America’s crimes and failures.
The question of why “so many people in America itself” condemn the ideas and policies that seem now to be carrying the day has many parts. The aforementioned time factor accounts, partly, for the difficulty the Left is having in coming to terms with the fact that, in the last twenty years or so, the “stupid party” got smart. The Center and Right-of-Center is where the ideas are. That is hard to accept for a Left that has historically understood itself to be the party of innovation, imagination, and creativity.
Examples abound. Only this year did an institution of the moderate Left, Brookings, get around to recognizing the imperative of parental choice in educational reform. The explanation merchants of the Left are tying themselves in knots to make sense of “conservative bleeding hearts” such as Jack Kemp who advance new ideas about what might be good for the poor. Others find themselves having second thoughts about, horribile dictu, censorship as they simultaneously advocate government support for pornography and the prohibition of politically incorrect speech on university campuses. And yet others are, reluctantly, beginning to acknowledge that the interests of black America are not entirely congruent with continued support for a superannuated civil-rights leadership. A growing impatience is expressed not only with the Al Sharptons but with the upmarket Al Sharptons such as Jesse Jackson. And so it is that—on education, poverty, censorship, race, and much else—the moderate Left, slowly and painfully, makes “respectable” the ideas that have for years been current on the Right-of-Center. To be sure, even this sluggish process of change has not touched the “so many people” that the editors have in mind.
The really big new thing in the aftermath of the revolution of 1989 is the now indisputable centrality of the cultural questions in public life. Public debate, in any society, is basically about three clusters of questions: the political, the economic, and the cultural. In the modern era, the great debates have concentrated on the political and economic. Indeed, for a very long time there has been a self-conscious effort to steer away from the cultural. The chief reason for this is that at the heart of the cultural are the most powerful beliefs and passions that, as in the wars of religion in the 16th and 17th centuries, can destroy civil discourse altogether. But now the arguments over the political and economic have come to an end, at least for a time. While there are marginal disagreements, it is now evident to all rational parties that representative democracy is the way to go politically, and the mainly free market is the way to go economically. For purposes of significant public debate, that leaves the cultural questions.
Both the Right and the Left will have a hard time getting accustomed to this new situation. But the Left will have the harder time, for the Right generally eschewed the notions of economic determinism and political utopianism. The Right, in most of its expressions, is more comfortable with the “social and moral questions,” which is to say the cultural questions. The emergence of the cultural questions at front stage center returns us, interestingly enough, to what Aristotle understood as authentic politics. Ethics and politics, according to Aristotle, are the same inquiry, both asking, “How ought we to order our life together?” Public debate in our century was distracted from that question by the relatively brief political madness of Hitler, and by the prolonged political and economic madness of Marxism-Leninism. The energies of the best, brightest, and most sane of our thinkers were expended in combating those bloody absurdities.
Now, at long last, we are returned to the real business of politics. The new situation cannot but be deeply distressing to the “so many people.” The Right has generally understood itself to be the guardian of culture, of the historically transmitted, of the “givens” of how we order our life together. In literature, the arts, and social arrangements so basic as the family, the Left understands itself to be countercultural. The self-understanding of the Left assumes that others will attend to preserving the culture that it is its business to change. Now, more than at any time since 1914, it becomes evident that the continuing argument is between Edmund Burke and John Stuart Mill, between traditional responsibility and the unlimited play of critical consciousness.
If I am right about the recentering of the cultural, we should prepare ourselves for ever more concentrated debate on what are called the social and moral issues. On this view, for example, the abortion debate is not a distraction from politics but raises one of the most urgent political questions of our time: who belongs to the community for which we accept common responsibility? A reporter for the New York Times who covered both the big pro-choice march of 1989 and the pro-life march of earlier this year remarked on the phenomenon of “two Americas” and “two cultures,” as indicated by the different vocabularies employed. The dominant language of the pro-choicers, she said, was about “rights and laws,” while the dominant language of the pro-lifers was about “rights and wrongs.”
In the era that we have now entered, those who can, with public persuasiveness, speak of rights and wrongs have the advantage. The “triumph” of American ideas and policies is the continuing triumph of the moral claims of the West after the two great barbarous aberrations of this century. This is not occasion for triumphalism, but for gratitude, true humility, and renewed seriousness about transmitting the civilizational story to the next generation. For many reasons, some of them mentioned above, this way of viewing the matter is abhorrent to the “so many people” who worry the editors, and who should worry all of us.
Eugene D. Genovese:
If the dreary remains of the left-wing press may be taken as a guide, the doughty survivors of the radical Left and of Left-liberalism—to the extent that the two can any longer be distinguished—are once again determined to make fools of themselves. Confronted by a victorious worldwide counterrevolution against everything they have stood for, they happily dwell on the evil legacy of Ronald Reagan. Ever ready to display a sense of humor, they claim credit for the wonderful doings in Europe, which, of course, reflect their own highest aspirations and the realization of what they themselves have wanted all along. Simultaneously, they condemn the ideology and policies of the American Right, which the leaders of those wonderful doings, not to mention the voters, take as their model.
The Left, viewing the disgraceful rout of its troops on all fronts, proudly claims victory abroad, while it undertakes the small task of convincing the American people that only the implementation of a thoroughly discredited agenda at home could save us from the horrors it claims as democratic triumphs everywhere else. Lacking the satirical genius of a Jonathan Swift, I respectfully ask to be allowed to pass over the spectacle in silence.
The pros and cons of Reaganism and the depth of the changes it has introduced will take a long time to sort out, and nothing is to be gained by continuing the present exchanges of encomiums and laments. But surely, the 80’s will be remembered as the decade in which socialism met its Waterloo. No amount of blather about the collapse of Communism’s having opened the way to “real” and “democratic” socialism will serve. The Communists, for better or worse, introduced the only socialism we have had, whereas the social democrats have everywhere settled for one or another form of state-regulated capitalism. Many things went into the making of the collapse of the Communist regimes, but, as every honest Communist from Gorbachev on admits, the immediate cause has been exposed as state ownership of the means of production—and for reasons that, alas, Ludwig von Mises, among other right-wingers, long ago identified.
That the collapse of socialism proves the virtue of a (nonexistent) free-market capitalism is another matter. The left-wing critics of Reaganism score heavily in their condemnation of the social and cultural barbarism that now reigns triumphant, even if they fail to notice that much of their critique was prefigured by traditionalist conservatives, who never have been enamored of the market and its businessman’s culture. The crisis that has wracked the socialist world has obscured the crisis that is wracking the capitalist world in general and the United States in particular. And the many-sided crisis of both demonstrates the truth of the witticism, uttered more than a decade ago by John Lukacs, that the “isms” have become “wasms.”
Socialism failed to do the one thing that might have saved it: generate the economic prosperity that could have provided the time for it to disassemble its Byzantine political systems and to establish the moral legitimacy without which no sense of political obligation is possible. In contrast, capitalism has once again proven superior to all alternatives in generating economic growth and technological development. It is easy to forget that the entire project of the socialist reconstruction of society proceeded on the bold Marxist assertion of the superiority of socialism as an economic system. Nothing could be more naive than the radical-Left daydream, which is today stronger than ever, that the economy would fare better in the wake of the destruction of all hierarchy and stratification and the transfer of economic power to the workplace. Everywhere and always, when such nostrums have been tried, the result has been a catastrophic erosion of the work ethic and a retreat into modern equivalents of peasant self-sufficiency and attendant economic devolution.
Yet the beat goes on. Almost nowhere on the American Left do we find an inclination to subject its time-honored premises to the “radical critique” called for with respect to everything else. To begin with, the classic premise of the Left has been the inherent goodness or quasi-infinite malleability of human nature—a premise shared by much of the free-market Right, occasional pretenses and qualifications aside. This classic premise has especially triumphed in the mainline churches, which have traded the hard wisdom of Christian theology for a neo-transcendentalist reduction of sin to the passing embarrassment of a lapse from the good, and which have embraced a neo-universalist doctrine of the salvation of everyone. (Personally, I have always been thrilled by the prospect of meeting Adolf Hitler in heaven.) To this day it has not occurred to radical leftists and Left-liberals that the central contradiction in the socialist countries has been the vain attempt to combine an unrealizable goal of personal liberation with a form of social organization that, above all, requires maximum social discipline. Meanwhile, the Right, having sealed off its traditionalists, embraces the illusion of personal freedom and, with sound logic, hails the market as the one great social force for its realization.
Personal liberation we are getting with a vengeance. Unfortunately, the great theologians were right and the radical leftists and free-market right-wingers—for that is what these “conservatives” really are—are wrong. With high if unintended humor, the Left demands an “involved,” “concerned,” “compassionate,” and “activist” central government to save the afflicted, and it does so, at least if we are to take seriously the rhetoric on abortion and gay liberation, by campaigning for the individual’s absolute right to his—I trust not only her—own body. With even higher, if also unintended humor, the Right weeps and wails over the collapse of the family, of morality, of respect for religion, of education and higher learning, of our “social fabric,” and it does so while campaigning for the extension of the very marketplace of consumer choice that has stamped capitalism as the greatest revolutionary solvent of traditional values in world history. Apparently, it never occurs to these defenders of God, family, and social order that “consumers,” left to choose, should be expected, more often than not, to choose self-indulgence, corruption, and scintillating filth over more mundane commodities. What a pity we cannot heed William J. Bennett and Allan Bloom and restore a genuine core curriculum to all our schools. For then we might be able to require a four-semester sequence in Christian theology or at least in common decency and elementary good sense.
As is, the rot deepens: we are being overwhelmed by drugs; mass homelessness; the poisoning of our children with pornography, perversion, and impossible aspirations; the transformation of our cities into Third World metropolises for the ostentatiously rich and the miserably poor; and the steady decline of the national productivity, work ethic, and production of basic goods that undergird the whole. I very much doubt that sane, decent, and honorable right-wingers are sleeping any better than their left-wing counterparts.
The free market has always been a myth, never more so than in this age of international conglomerates and such evidence of massive corruption, greed, favoritism, and mismanagement as the S & L scandal. Irving Kristol may be right that much of the “peace dividend” will prove illusory, but, surely, the end of the cold war ought to offer fresh opportunities for some expansion and redirection of investment into socially more constructive efforts. The collapse of socialism should reinforce general acceptance of private property and a strong market sector, but it need not invite further irresponsibility by uncontrolled or inadequately controlled private interests.
We are living through the early stage of a massive worldwide economic and social transformation, the outcome of which is neither fated nor readily predictable. Private property and markets are compatible with a wide variety of social, political, and economic systems. If socialism (state ownership of the means of production) has been economically and politically discredited, capitalism (private ownership of the means of production under minimal control) is daily being morally and socially discredited. Probably, if the best elements of the Left and Right were speaking to each other, they could agree on that much and on the need for a new departure. But no part of the political spectrum seems able to lay its own ghosts.
The Left has carried Marx’s utopian view of human nature to its logical—and, ironically, ultra-individualist—conclusion and embraced every mutually exclusive call for personal as well as group liberation. The left wing of liberalism is following a similar trajectory, as the comical course of the national Democratic party attests. The right wing of liberalism stands for nothing discernible, at least at this moment. The Right, celebrating its electoral victories, finds itself in ideological disarray and on the verge of a political split. Morally responsible free-marketeers, including such libertarians as Murray Rothbard, stand aghast at the cultural unraveling that is accompanying the victories of their political and economic program. Unwilling to acknowledge the unraveling—the libertinism against which Rothbard admirably protests—to be the predictable outcome of a market mentality that reduces everything to the status of commodities, they content themselves with exhortations to consumers to make the right moral choices.
The free-marketeers, absurdly called “conservatives,” either find our cultural situation perfectly acceptable or at least bearable, or blame the atrocities on continued statism, bureaucracy, and welfare paternalism. They may have a point about those diseases, but their projected cure of heavier doses of individualism promises nothing except exacerbation. The sickening racial crisis alone should be enough to make clear that no solution short of the ghastly would be possible without measures that bring communities and groups to center stage and reaffirm the traditionalist and once-Marxist principle that individual freedom must be understood as a product of social organization. The free society most of us want can be free only to the extent that social safety and the requisites of social discipline permit. A society deserves to be called free to the extent that it places the burden of proof on the state when limitations on individual freedom are called for. No one has an absolute right to his own body, but no state should be tolerated if, without compelling reason, it restricts the individual’s claims to privacy and to rights in his body.
None of us has an immediate solution to the drug problem, which has become a life-threatening cancer on our body politic. But who could believe that a country with our level of political culture and material resources could be overawed by a gangster empire, if it had the will to prevail? Undoubtedly, action would require curtailment or redefinition of civil liberties, not to mention the high cost of incarceration and some judiciously selected executions. The requisite action carries with it potential evils that ought not to be taken lightly. But if the drug problem is indeed a cancer, if we are indeed in a “war,” if we mean what we say, then the only matter left to discuss ought to be how to provide maximum guarantees against excesses of a police-state nature.
And the same principle holds for racism, for poverty, for crime, for homelessness, for pornography, for all other conditions and practices incompatible with civilized life. I do not believe that the necessary measures could be carried out without the restoration of a productive national economy. And even then, I do not believe that such measures could be carried out safely—with respect for the genuine claims of freedom—without the restructuring of our socioeconomic relations in the context of a corporatism that respects private property but makes it subject to social control and the guidelines of a national moral consensus. Traditionalists, especially the Southern traditionalists to whom the Reagan and Bush administrations have given short shrift, have been saying much of this for a long time. Those on the Left who did not get drunk on the anarchism of the 60’s have too. And unless I badly misread the signs of the times, large numbers of people across the political spectrum are ready to say Amen! Whether we remain able to hammer out the specifics of an appropriate vision and find leaders worthy of the challenge remains to be seen.
Of the very few things that we can believe with absolute certainty about the 80’s—a decade more difficult to assess than any I have lived through—one is that the cold-war anti-Communists were proved right not only in their moral condemnation of Communism but in the policies they devised for its destruction. Another is that these same cold-war anti-Communists are never going to be forgiven by the Left—which nowadays includes almost (if not quite) everybody in the media, the academy, the arts, the literary and publishing worlds, the entertainment industry, and the Democratic Party—for being right about the most important political question of this century. The Left will fight to the last ditch—or the last op-ed article, anyway—to uphold the notion, which no sane person east of the now-dismantled Berlin Wall believes for one moment, that the steadfast cold-war policies of the West had nothing whatever to do with the collapse of Communism in Eastern Europe and its accelerating disintegration in the Soviet Union.
Indeed, the claim is now made—not only in places like the Nation but even in Time and other mainstream publications—that the United States actually impeded the collapse of Communism by resisting its triumph. (I know this sounds crazy, but that is the kind of craziness with which we are now obliged to contend.) It was left to Arthur M. Schlesinger, Jr., always ready to bend history to the purposes of liberal mythology, to cap such claims by arguing that the events of 1989 represent a vindication of FDR’s concessions to Stalin at Yalta! “In the short run,” Schlesinger wrote recently in the Wall Street Journal, “Stalin gained. But statesmanship is tested by the long run. Now, forty-five years later, the Soviet Union is at last honoring the Yalta agreements.” In other words, it was Franklin Roosevelt, not Ronald Reagan, who won the cold war. Or are we being asked to believe that there wasn’t any cold war to win? In this kind of Alice-in-Wonderland historiography, reality—even if it has cost millions of lives and caused suffering and loss beyond measure—is easily dispensed with.
It would be nice to be able to attribute such absurdities to sheer cynicism or fatuousness—factors that are never to be discounted in the thinking of the liberal old guard. But in fact something far more significant and sinister is involved, and in this development old-guard liberals like Professor Schlesinger, still harboring their daydream of an eternal return to the New Deal or the New Frontier, are mere supernumeraries. They can still be wheeled in on ceremonial occasions to lend intellectual respectability to certain ideas—the idea, for example, that the policies of the Reagan administration had nothing whatever to do with the collapse of Communism in Eastern Europe—but they no longer play much of a role in setting the Left’s agenda, which in fact is a good deal more radical than anything dreamt of in Professor Schlesinger’s philosophy.
That agenda, it is worth recalling, had its origin in the radical protest movements and counterculture of the 60’s—above all, in the idea that America, or rather “Amerika,” was at once an intolerably repressive society at home and the principal source of political evil abroad. The rapidity with which this bizarre and fundamentally mendacious political fantasy became so deeply rooted in our cultural life and the power it has continued to exert over the course of the last two decades—power that has now come to dominate larger and larger areas of American life—have never, to my satisfaction, been adequately explained. Perhaps there is such a thing as a collective death-wish that comes to afflict certain societies when so many of their material needs are met and so many of their spiritual hungers are left unmet, thus opening them to the appeal of strange gods. I would like to think otherwise, but the evidence of our own society just now makes one cautious about rejecting such distasteful theories. Whatever our explanation for the state of affairs in which we find ourselves, however, the fact remains that this thoroughly bogus and pernicious idea of American civilization is the one that currently prevails at almost every level of cultural life. It is as much in evidence in our law schools as it is in our wretched pop music. It governs the ideological outlook of the commercial television networks quite as much, though not quite as vehemently or systematically, as it dominates PBS programming and National Public Radio. And it is now a permanent part of the academic curriculum—the only part that meets with no protests about “canonization”—in our elite schools and universities. It is the reason why there is no longer anything in American life that can reasonably be described as a political center. 
The polarization that the radical movement of the 60’s set out to achieve—the division of American society into “them” and “us”—is now a fact of cultural life, and looks to be an enduring one. We have become, in effect, two nations—or at least two societies—that are so deeply divided about the fundamental issues of American life that public questions as diverse as Supreme Court appointments, the grant-making procedures of the National Endowment for the Arts, and the problems entailed in dealing with the AIDS crisis are inevitably escalated into battles resembling a form of cultural civil war.
Given these divisions, there was never any chance that the real achievements of America in the 80’s—a prosperous economy at home and a triumphant victory in the cold war abroad—would be treated by the culture as anything but a further index of American failure. Toward the Reagan administration in particular and American society in general our cultural establishment has long operated on the model of the Soviet justice system: the first thing to be determined is a verdict—which is always the same verdict: guilty!—and only then is it necessary to cite an appropriate crime. Is it any wonder that the media are so universally despised or that so many newspapers and magazines are suffering a historic loss of readership? The other America—the America that is so benighted that it cannot understand why its grade-school kids, while barely able to read, must be instructed in the use of condoms, not to mention the niceties of anal intercourse, or why its tax money should be devoted to exhibiting pictures of young men inflicting sexual torture upon each other—this America knows when it is being lied to about issues of great moral import. This other America—which, I daresay, encompasses a larger portion of the “enlightened” middle class than can nowadays ever admit to doubts on these matters, lest it be branded yahoos, reactionaries, or (worst of all) Reaganites—feels itself morally disenfranchised and more than a little terrified. Wherever it turns for moral support—the classroom, the pulpit, the legal system, or even its representatives in government—it meets with the same response: elaborate, “caring,” sophistical explanations as to why yesterday’s road to perdition must now be regarded as tomorrow’s path to salvation. No wonder there is such a widespread feeling of moral panic in this country.
As for all those pronouncements about greed and selfishness and the disorderly life of our cities and schools, I am not myself inclined to look for lessons in ethics and morals—not to mention social policy—from the folks whose lethal programs and ideas set us on the course that has led to our present horrors. We know from what quarter the concentrated assault on the family as an institution was initiated, and with what results. We know what elements in our political life organized the destruction of the New York City school system—until then, one of the best in the world—back in the 60’s. We know in what part of our culture the legitimation—nay, the celebration—of drugs as a way of life originated. (Allen Ginsberg, please take a bow here.) We know who launched the “black-power” movement, thereby destroying the civil-rights movement of the early 60’s, a blow that has cost blacks even more than it has cost the rest of us in this society. In the politics of the social disintegration that now besets us, as in the decadent culture that is now swamping us, we are witnessing the denouement of every rotten idea the 60’s bequeathed us a generation or more ago. If ours was a society with a highly developed sense of shame, the folks who championed these ideas would be reduced to a repentant silence. Instead they are more loudly than ever urging more of the same for the future.
It was the New Republic—not exactly a Reaganite journal—that spoke in its 70th-anniversary editorial, in 1984, of “the terrible social pathologies emerging from the welfare state,” and went on to point out that “liberals are in crisis because they have hardly begun to figure out a coherent response, first, to the unhappy social facts; and second, to the vast defection of voters from the liberal dispensation in public policy.” The only thing that has changed since those words were written is that six years later the same “terrible social pathologies” are even more advanced than hitherto and are now invoked on the Left as if they were simply a product of the Reagan administration. Whether this is a case of historical amnesia or political cynicism—or both—I shall leave for others to determine.
Something like this same combination of historical amnesia and political cynicism, together with a large element of moral insensibility, now beclouds the discussion of the momentous events taking place abroad. The ease with which the media and the academy in this country have been allowed to claim the collapse of Communism as, of all things, a vindication of the idea of revolution is yet another sign of the bad faith that permeates virtually all public discourse about politics on the Left today. What has been happening in Eastern Europe and the Soviet Union is, in actual fact, the most sweeping example of a counterrevolution known to modern history. Yet our press, our pundits, and—most unforgivable of all—our Republican President prattle on, without challenge, about the “revolutionary” developments that are attempting, not always very successfully, to bring democracy and capitalism to societies devastated by the destructive consequences of revolution. This is something more than a quarrel about words. It is a battle of ideas—ideas about how to live our lives—and in that battle we seem unable to articulate the moral imperatives of the counterrevolutionary movement we have done so much to set in motion. This, too, is a legacy of the 60’s.
The American 80’s: Disaster or Triumph?
A monstrous regime's rational statecraft
One of the more improbable geostrategic surprises of recent years has been the revival of the North Korean economy under the direction of Kim Jong Un. Just to be clear, that economy remains pitiably decrepit, horribly distorted, and desperately dependent on outside support. Recent estimates suggest that its annual merchandise exports do not reach even 1 percent of the level generated by its nemesis, South Korea. Even so, the economic comeback on Kim Jong Un’s watch has been sufficiently strong to permit a dramatic ramp-up in the tempo of his nation’s race to amass a credible nuclear arsenal and develop functional intercontinental ballistic missiles capable of striking the U.S. mainland. That is, of course, the express and stated objective of the program. Pyongyang today appears to be perilously close to achieving its aim—much closer now, indeed, than complacent Western intelligence assessments had presumed would be possible by this juncture. But then, North Korea is full of surprises for foreign observers.
The difficulty with analyzing the country’s weaknesses and strengths comes from the fact that the North Korean system—which is made up of the Kim dynasty, the North Korean state, and the economy constructed to maintain them both—is unlike any other on earth. By now, its brand of totalitarianism (“our own style of Socialism,” as Pyongyang calls it) is sufficiently distinctive that children of the Soviet or Maoist tradition also commonly find themselves at a loss to apprehend its logic and rhythms.
North Korea is no longer even a Communist state, if that term is to have any meaning. The once-prominent statues of Marx and Lenin in Kim Il Sung Square were removed some years ago. Mention of Marxism-Leninism has reportedly been excised from the updated but still currently unpublished Charter of the Korean Workers’ Party. The 2016 version of its constitution excises all references to Communism, extolling instead only the goal of “socialism”—and its two “geniuses of ideology and theory,” Kim Il Sung and Kim Jong Il (the grandfather and father of the current dictator). Small wonder that the world routinely misjudges—and very often, underestimates—the North Korean state and its capabilities.1
Despite its suffocating ideology, for example, North Korea is capable of highly pragmatic adaptation and economic innovation. Notwithstanding its proclaimed “self-reliance” and its seeming isolation, it is constantly finding new sources of foreign cash through ingenious and often remarkably entrepreneurial schemes overseas. And despite all the international sanctions, Kim Jong Un really has overseen a North Korean economic upswing of sorts since assuming power in 2011, the signal fact that best helps explain the acceleration in Pyongyang’s push for a credible nuclear and ballistic arsenal. Thanks to these and other apparent paradoxes, an economy seemingly always on the knife edge of disaster somehow manages to stay on course, methodically amassing the military might for what it promises will be an eventual nuclear face-off with the world’s sole superpower.
Though the hour is late, given all the progress that North Korea has been permitted over the past generation, it nevertheless looks as if there may still be time left to prevent Pyongyang from completing and perfecting its nuke and missile projects through “non-kinetic means”—that is to say, through international economic pressure as opposed to military action. For such an approach to work, however, we will need an informed and robust strategy—not the feckless, episodic, and intellectually shoddy interventions we have mainly witnessed up to now.
Indispensable to such a strategy must be an understanding of the North Korean economy—the instrument that makes the North Korean threat possible. In particular, we need to understand 1) how that economy functions, and to what ends; 2) how the “Dear Respected Comrade” Kim Jong Un brought to it a limited but critical measure of economic revival; and 3) how America and others might use the considerable financial and commercial options at their disposal to impair the North Korean regime’s designs, before Pyongyang wins what is now a race against time.
Despite the information blackout that North Korean leadership has striven to enforce for generations, we already know much more about all these things today than the Kim family regime could possibly want—more than enough to begin purposely defanging the North Korean menace.
One: The Economy of Command
Given its longstanding reputation as a basket case, it may startle readers to learn that there was actually a time when North Korea was regarded as a dynamic and rapidly advancing economy. Back in 1965, the eminent British economist Joan Robinson wrote that North Korea’s achievements put “all the economic miracles of postwar development…in the shade.”2
In those days, if Western intellectuals happened to talk about the “Korean miracle,” they weren’t discussing anything going on in the South. And it wasn’t just dreamy academics and well-hosted foreign visitors who seemed to hold North Korea’s economic prospects in high esteem. Between the late 1950s and the early 1960s, Japan witnessed an exodus of ethnic Korean residents—in all, roughly 80,000 people—who packed up and steamed off of their own free will to the North, voting with their feet to join the Korean state they deemed to offer the greater promise.
Despite the devastation North Korea suffered during the war it launched against the South in 1950, and despite the blazing economic takeoff in South Korea that commenced in the early 1960s under the Park Chung-Hee junta, North Korea may have been ahead of South Korea in per capita output for two full decades after the 1953 armistice. A CIA study in the late 1970s, for example, concluded that South Korea did not catch up with North Korea until 1975.3 Contemporaries at South Korea’s Korean Central Intelligence Agency (KCIA) concurred that the North was well ahead of the South on a per capita basis throughout the 1950s and 1960s, though they argued that the South caught up with the North a few years earlier than the CIA believed.
In retrospect, the wonder is that North Korea’s economy worked as well as it did for as long as it did. For from its 1948 founding onward, North Korea was not just another Cold War Soviet-type economy: It was a Stalin-style war economy on steroids.
As fate had it, the Japanese colonial overlords who controlled Korea from 1910 until 1945 constructed a heavy industrial base in its northern half—a forward supply zone to support their own greater Asian war efforts. Unlike the South, the North had major deposits of coal, iron, and other minerals, along with plenty of natural hydropower. “Great Leader” Kim Il Sung—the onetime guerrilla fighter and later Red Army officer who started North Korea’s Kim family dynasty—inherited this infrastructure when he took over the northern part of the divided peninsula in 1945 and used it as a base camp from which he directed an upward climb toward the summit to which he aspired: an economy set on permanent total-war footing.
Kim Il Sung came perilously close to consummating his vision. By the mid-1970s, the Great Leader would observe that “of all the Socialist countries, ours bears the heaviest military burden.”4 Even by comparison with places like the Soviet Union and East Germany, his North Korea was a garrison state. By the late 1980s, this country of barely 20 million was fielding an army of more than 1.2 million—a ratio comparable to America’s in the middle of World War II. Those military-manpower estimates, by the way, are derived not from U.S. or South Korean intelligence, but rather from unpublished population figures Pyongyang transmitted to the UN in 1989 (data that inadvertently revealed the size of the country’s non-civilian male population).5
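The manpower ratio implied by those figures can be checked with simple arithmetic. In the sketch below, the North Korean numbers come from the text; the U.S. World War II figures are rough outside assumptions added purely for comparison, not drawn from the article:

```python
# Sanity-checking the military-manpower ratio cited above.
# North Korean figures are from the text; U.S. WWII figures are
# approximate assumptions supplied for the comparison.
nk_troops = 1.2e6        # late-1980s army strength (from the text)
nk_population = 20e6     # "barely 20 million" (from the text)

us_troops = 12e6         # approx. U.S. forces at WWII peak (assumption)
us_population = 135e6    # approx. U.S. population, mid-1940s (assumption)

nk_ratio = nk_troops / nk_population   # about 6% of the population
us_ratio = us_troops / us_population   # about 9% of the population

print(f"North Korea, late 1980s: {nk_ratio:.1%}")
print(f"United States, WWII peak: {us_ratio:.1%}")
```

On these rough numbers the two ratios are indeed of the same order, which is all the text's "comparable" claim requires.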
Today, two Kims later, the International Institute for Strategic Studies reports that North Korea currently maintains the world’s fourth-largest standing army in terms of sheer manpower—ahead of Russia and behind only the globe’s demographic giants (China, India, the United States). For more than half a century—since 1962, the year Kim Il Sung decreed the “all-fortress-ization” of the nation—North Korea has been the most exceptionally and unwaveringly militarized country on the face of the planet.
But why? What possessed North Korean leadership to commit their country, decade after decade, to such an extraordinarily expensive and irrational economic posture? There was a method to this seeming madness. Kim Il Sung’s grand design for unending super-mobilization served many logical purposes, given the first premises of his North Korean state.
Enforcing permanent war-economy discipline comported nicely with perfecting the domestic totalitarian order the Great Leader desired. Further, given the unhappy realities of geography and 20th-century Korean history, having the might to stand up to any and all foreign powers—including his nominal Communist allies in Moscow and Beijing—may also have seemed an imperative. But above all else, North Korea’s immense military economy reflected Kim’s overarching obsession with unifying the divided Korea, and doing so unconditionally—that is to say, to finishing up that Korean War he had started in 1950, and finishing it up on his own terms this time.
In the eyes of North Korea’s rulers, the South Korean state was (and still is) a corrupt, illegitimate, and inherently unstable monstrosity, surviving only because of the American bayonets propping it up. The Great Leader wanted to be able (when the right opening presented itself) to strike a knockout punch against the regime in Seoul and wipe it off the face of the earth—“independent reunification,” in North Korean code language. This he could not do without overwhelming military force—and without an economic system straining constantly to provide that muscle.
As early as 1970, the Great Leader was warning that “the increase in our national defense capability has been obtained at a very great price.”6 And by the late 1980s, Kim Il Sung’s “economic miracle” was all but dead in the water. Decades of crushing military burden and systemic suppression of consumer demand had taken their predictable toll. And North Korean planners had compounded these difficulties with additional unforced errors of their own.
Their idiosyncratic application of the Great Leader’s Juche (“self-reliance”) ideology, for example, included a general injunction against importing new foreign machinery and equipment. This ensured that the country would have to maintain a high-cost, low-productivity industrial infrastructure. Juche also apparently meant never having to pay your foreign debts, whether to fraternal socialist states or to “imperialist” creditors in Western countries foolish enough to lend money to Pyongyang. By the 1980s, global financial markets had caught on to the game, and North Korea found itself almost completely cut off from international capital. And the longstanding “statistical blackout” North Korean leadership enforced to facilitate international strategic deception also inadvertently impaired economic performance by blinding domestic decisionmakers and requiring them to “plan without facts.”
But it was the ending of the Cold War that pushed the North Korean economy out of stagnation, and into disaster. Juche ideology notwithstanding, North Korea had never been self-reliant; sustaining its severely deformed economy required constant inflows of concessionary resources from abroad. Pyongyang was (and remains) consummately imaginative in devising schemes for extracting aid and tribute from overseas. In the 1960s, 1970s, and 1980s, Kim Il Sung procured the equivalent of tens of billions of dollars in support from Beijing, Moscow, and the Kremlin’s Warsaw Pact satellites, expertly playing the Kremlin off against China, gaming aid out of each while aligning with neither.
In 1984, Kim Il Sung made a fateful error: He leaned decisively toward Moscow, a tilt signaled by his unprecedented six-week state visit to the USSR and Eastern Europe that same year. The gamble paid off initially: Between 1985 and 1989, the Kremlin transferred around $7 billion to Pyongyang, twice as much as the amount transferred over the entire previous 25 years, much of it in military matériel. In 1988, North Korea relied on the Soviet bloc not only for almost all its net concessionary foreign-resources transfers, but also for roughly two-thirds of its international trade, most of it arranged on political, highly subsidized, terms.
Then came the Soviet bloc’s collapse. By 1992—the year after the dissolution of the USSR—both trade and aid from the erstwhile Soviet bloc had plummeted by nearly 90 percent. North Korea’s overall supplies of merchandise from all foreign sources consequently plunged by more than half over those same years.
These sudden devastating shocks sent North Korea’s economy into a catastrophic free fall from which it would not manage to recover for decades. The socialist planning system essentially collapsed. Famine was just around the corner.
Two: A Man-Made Horror and Its Surprising Aftermath
The North Korean famine of the 1990s was a catastrophe of historic proportions. No one outside North Korea’s leadership knows just how many people died in that completely avoidable man-made tragedy, but the toll was certainly in the hundreds of thousands and could possibly have exceeded a million.7 It arguably qualifies as the single worst economic failure of the 20th century. It was the only time in history that people have starved en masse in an urbanized, literate society during peacetime.
It is noteworthy that the famine—usually dated from 1995 to 1998—did not commence until after the death of the Great Leader and the ascension of his son and heir, “Dear Leader” Kim Jong Il. This was no coincidence. Economic failure was the Dear Leader’s stock-in-trade. His political rise almost perfectly corresponds to the decline and fall of the North Korean economy. It happens that the Dear Leader did succeed in what was arguably his primary political objective: to die of natural causes, still safely and securely in power. But economic progress worthy of the name would not be possible in North Korea so long as he was its supreme ruler.
Though both father and son were totalitarian tyrants enamored of their hereditary total-war machine, the differences in their economic inclinations and impulses were nonetheless striking. Dogmatic as he was, the Great Leader still possessed a peasant’s sense of practicality. Proof of his pragmatism is the singular fact that North Korea, alone among all Asian Communist states (a roster that also includes Russia), avoided famine during its 1955–57 collectivization of agriculture.
On the other hand, the Dear Leader, from his sheltered Red Palace upbringing onward, was every bit the paranoid, secluded ideologue. He not only disapproved of any concessions to economic pragmatism but feared these as positively counterrevolutionary and potentially lethal to his rule. He likewise regarded ordinary commercial interactions with the world economy as “honey-coated poison” for the North Korean system. At home, he wanted total mobilization but without any material incentives; from abroad, he sought a steady inflow of funds unconstrained by any reciprocal obligations. Kim Jong Il’s preferred economic model, in short, was to enforce Stakhanovite fervor at home through propaganda and terror while financing his war-economy state through military extortion abroad. He called this approach “military-first politics.”
Because Kim Jong Il was unwilling to address the country’s newly dire economic circumstances with reforms—in his view, there was nothing to reform—North Korea was trapped in deepening depression for most of the 1990s. We will know how close the place came to total economic collapse—to the sort of breakdown of the national division of labor that Germany and Japan suffered at the very end of World War II—only when the archives in Pyongyang are finally opened. Throughout the 1990s, in any case, heavy industry was largely shut down, with inescapable consequences for conventional military forces. The death spiral for the war-making sector redoubled the importance to the regime of the nuke and missile programs, both as an insurance policy for regime survival and as the last viable military instruments for forcing the South into capitulation in some future unconditional unification.
In retrospect, it is clear that Pyongyang had no intention of desisting from its quest for nuclear weapons and ballistic missiles, even as it played Washington and her allies for aid for years by pretending its nuclear program might be negotiable. Yet also in retrospect, the slow tempo of nuke and missile development under Kim Jong Il’s rule has to be considered a surprise. Any serious weapons program requires testing to advance—yet Pyongyang managed just one long-range missile launch in the 1990s and only three during his 17-year reign. The Dear Leader also oversaw two nuclear tests before his death in 2011—but only toward the end of his tenure, in the years 2006 and 2009.
Why this hesitant tempo if nukes and missiles were a central priority for the North Korean war economy? Although other possible explanations come to mind, the obvious one has to do with financial and economic constraints. Ironically, despite his vaunted “military-first politics,” North Korea’s nuke and missile programs may also have been inadvertent casualties of Kim Jong Il’s gift for stupendous economic mismanagement. (True, North Korea could undertake expensive nuclear projects internationally, such as the undeclared plutonium reactor in Syria that was nearing completion when the Israelis leveled it in 2007—but that was apparently a cash-and-carry operation, bankrolled by the Dear Leader’s friendly customers in Iran.)
There is considerable evidence that the North Korean economy hit bottom around 1997 or 1998. That bottom was very low indeed: Rough estimates suggest that, by 1998, North Korea’s real per capita commercial merchandise exports were barely a third of their level just a decade earlier, while real per capita imports, including supplies indispensable to the performance of key sectors of the domestic economy, were down by about 75 percent.
North Korea appears to have turned the economic corner not on the strength of new or better domestic economic policies, but rather on breakthroughs in international aid procurement. Pyongyang figured out how to work the West’s international food-aid system: Between 1997 and 2005, the year before its first nuclear test, it was bringing in an average of over a million tons of free foreign cereal each year, ending the food crisis. It is tempting to regard this as “military-first politics” in action, for military menace played an important role in the international community’s solicitude. It is impossible to imagine a helpless and stricken sub-Saharan population obtaining “temporary emergency humanitarian aid” on such a scale, for such an extended duration and with so very few conditions attached.
Central to this upswing in food aid and other freebies from abroad was the fact that North Korea got lucky with the alignment of governments in Seoul, Washington, and Tokyo. For a while, the leaders of this consortium of states were commonly willing to underwrite an exploratory policy of “sunshine” or “engagement” with the Dear Leader by offering him subventions and financial transfers. To secure his June 2000 Pyongyang Summit with the Dear Leader, for example, South Korea’s then-president had hundreds of millions of dollars secretly wired to special North Korean accounts—thereby committing crimes under South Korean law (for which he later issued pardons).
In the event, the “sunshine”-aid influx that may have rescued North Korea at its darkest moment would wane after its clandestine uranium-processing project surfaced in 2002—but the nuclear crisis that revelation triggered also made possible the next big round of North Korean international aid-harvesting.
After the 2003 U.S. invasion of Iraq, Beijing—alarmed by the possibility that the U.S. might also engage in a similar military confrontation with neighboring North Korea—organized and convened a “six-party talks” diplomatic process, ostensibly for deliberations over North Korean denuclearization, to cool things down. While the subsequent years of talking quite predictably led nowhere, North Korea’s price of attendance was apparently a steep increase in economic support from China. Between 2002 and 2008, China’s annual net balance of shipments of goods to North Korea—its exports to Pyongyang minus corresponding imports—more than quintupled, rocketing upward from less than $300 million to more than $1.5 billion. By then, North Korea had become just as economically dependent on Chinese largesse as Pyongyang had been on Soviet-bloc blandishments two decades earlier—but these inflows, and the politically subsidized trade they came with, were evidently sufficient to help at least partially revive the Dear Leader’s broken economy. From Chinese trade statistics, for example, we can infer that Chinese investments were instrumental in a resuscitation of North Korea’s mining and metallurgy sectors in the last years of Kim Jong Il’s life. (We must rely on inference here since Beijing to this day remains almost totally opaque about its economic relations with Pyongyang.)
All in all, Kim Jong Il’s North Korea took in more than $1 billion from its enemies in Washington, and nearly $4 billion from the “puppet regime” in Seoul (not including the South’s additional expenditures on “off-the-books” transfers and special economic or tourist zones in the North). And from China, North Korea scored more than $12 billion of net merchandise inflows under the Dear Leader—a sum that would look even greater if valued in today’s dollars. All the while, North Korea was also earning invisible revenues from a whole network of highly enterprising if generally illicit overseas endeavors: its “nuke-and-missile homework club” with Iran; à la carte weapons sales and military services provided to a host of dictatorships and terror groups; counterfeiting of U.S. currency; drug racketeering; insurance frauds perpetrated against firms in London’s City; and more. The Dear Leader was extensively involved in the world economy, after all—just in a Bizarro World, Legion of Super-Villains sort of way.
Thanks to highly skilled aid-wheedling, international shakedowns, and financial gangsterism, Kim Jong Il’s North Korea clawed its way back from famine to a low but acceptable new economic normal—all the while forswearing domestic economic reforms or genuinely commercial contacts with the outside world. North Korea did not completely avoid potentially fraught economic changes under Kim Jong Il, of course—that was beyond the powers even of the Dear Leader. Domestic cellphone use began during the Dear Leader’s reign, for example, as did a tentative marketization of private consumption (about which more in a moment). But these and other analogous economic changes during the Kim Jong Il era are best understood as “transition without reform,” to borrow an apt term from North Korea watcher Justin Hastings.8
The economy’s “new normal” in the Dear Leader’s final days was still at a miserable level. Although North Korean scientists could launch long-range missiles and test atomic weapons, and although North Korea’s population had reportedly achieved a fairly high level of educational attainment (higher than China’s, if North Korean figures are believed), the country’s international economic profile was Fourth World. According to the World Trade Organization, North Korea’s per capita merchandise trade levels in 2010 approximated Mali’s. Its share of world merchandise trade that same year was roughly the same as that of Zimbabwe, a country with half of North Korea’s population—and despite its measure of recovery after 1998, North Korea’s global trade share fell by more than two-thirds between 1990 and 2010, even more than Zimbabwe’s under Mugabe’s misrule in that same period.
The world is a moving target and, generally, an improving one—so national stagnation also means continuing relative decline. Although the Dear Leader bequeathed his son Kim Jong Un a system that had avoided total collapse, there was little else that could be said to commend his economic legacy.
Three: The Economic Upturn
Dear Respected Comrade Kim Jong Un faced formidable odds when he took over in late 2011. The twentysomething was a novice manager at the time of his father’s demise. Unlike the Great Leader, who had groomed his son to rule from an early age, Kim Jong Il himself put off the whole business of naming a successor for as long as he possibly could, designating the child of one of his mistresses as the next Supreme Leader only after an incapacitating stroke made the naming of an heir an unavoidable matter of state.
As Kim Jong Un took office, the planned economy was no longer functioning, and to make matters worse, North Korea’s limited market sector was beset by galloping and seemingly unstoppable inflation. His father had experimented with a limited monetization of North Korea’s tiny consumer sector in 2002 but botched it—and only made matters worse with a surprise 2009 “currency reform” that effectively confiscated private holdings above $100, drastically degrading the already low credibility of the won.
From this unpromising beginning, Kim Jong Un has proved a relative success in delivering economic results in North Korea. There is evidence that the North Korean economy has enjoyed some measure of growth, macroeconomic stabilization, and even development under his aegis.
Pyongyang, “the shrine of Juche,” may be a Potemkin showpiece—but is showpiece-ier today than in the past. Construction cranes are whirring, and whole new sections of the city have risen up. Traffic jams now sometimes clog “Pyonghattan’s” vast, previously empty boulevards. Expensive restaurants and shops purveying luxury goods increasingly dot the capital, and their customers are mainly locals, not foreigners. The upsurge in prosperity and living standards evident in Pyongyang is reportedly reflected, albeit to a more modest degree, in other urban centers as well.
Furthermore, in sharp contrast to previous North Korean trends, or other earlier Soviet-type economies, the country today not only displays considerable marketization but also market stability. This much is demonstrated by cereal prices and foreign-exchange rates in informal markets across North Korea. Over the decade between mid-2002 and mid-2012, North Korea’s won depreciated against the U.S. dollar in such markets by a factor of more than 5,000 (no, that is not a typo). But that depreciation abruptly stopped a little over five years ago, and since then the won has traded around 8,000 to the dollar (fluctuating within a band around that average). In other words, North Korea now has a stable currency that is convertible into hard currencies. Likewise, the domestic price of rice in North Korean markets suddenly stopped soaring five years ago and has been in the vicinity of 5,000 won per kilogram ever since. Whatever else one may say of these new domestic price signals from Kim Jong Un’s North Korea, they are not what one would expect to see from an economy in mounting crisis and disarray.
Finally and by no means least important: In the military realm, nuke and missile testing has accelerated. In the 13 years between Kim Jong Il’s first Taepo Dong test and his death, North Korea launched three long-range rockets and detonated two atomic devices. Kim Jong Un has been in power just over six years; his regime has already set off four nuclear tests and shot off more than a dozen long-range missiles. Some of the speed-up could reflect long-term strategic choices and might in part be affected by improvements in efficiency (cost reduction) within the WMD industrial sector. All other things being equal, though, this sharp acceleration would seem to betoken a major new infusion of resources into programs already long accorded a top priority by the North Korean state. Without a bigger economic pie and substantially greater funding sources, it is hard to see how Pyongyang could have pulled this off.
All this said, North Korea is still shockingly unproductive, still punching far below its weight, still nowhere near self-sustaining growth. Kim Jong Un’s boundless self-indulgence is manifest in costly vanity projects like a spanking-new “ski lift to nowhere” resort, Masikryong, a venture otherwise inexplicable save perhaps for the memories of childhood days in Switzerland that it might elicit.
But by distancing himself from his father’s most economically destructive policies and practices, and navigating into previously uncharted waters of economic pragmatism, Kim Jong Un has opened up heretofore ungraspable opportunities for raising living standards and building military power at one and the same time. Thus the name of his signature policy: byungjin, or “simultaneous pursuit.”
In short order after his ascension, Kim Jong Un demoted—or killed—most of the Dear Leader’s closest cadres and confidants. And less than five months after assuming power—at a ceremony commemorating his grandfather’s 100th birthday in April 2012—he made an astounding declaration, coming as it did from North Korea’s supreme ruler: “It is our party’s resolute determination to let our people…not tighten their belts again.” Translation: This is no longer your father’s dictatorship; aspiration for personal betterment is no longer a counterrevolutionary act of treason.
Dear Respected has deliberately and steadily reshaped the economy under his command. The fundamental strategic difference between Kim 2 and Kim 3 was this: Whereas the Dear Leader saw “reform” and “opening” as deadly “ideological and cultural poison” pure and simple, Dear Respected believes that North Korea could withstand a bit of that poison—actually, quite a bit—and even end up stronger for taking it.
Pyongyang’s new policy directives have been informed by this insight. In agriculture, Kim Jong Un promulgated the “June 28 Instructions” (2012), which permitted family-level work units and allowed farmers to keep 30 percent of their surplus—a bonanza compared with all previous official rules. For enterprises and industry, there were the “May 30 Measures” (2014), which allowed managers to hire and fire workers, pay them according to their productivity, and keep a portion of any profits they earned. People were, increasingly, paid with money for their work—and it was real money, as in, money that could buy things people wanted. The gradual marketization and monetization of North Korea’s civilian economy over the past two decades is a major transformation, and one critical to understanding the country today.9
By the late 1980s, North Korean leadership had fashioned a consumer sector that would have turned Stalin green with envy. No country on the planet had so tiny a share of total national output flowing to personal consumption as late Cold War North Korea—and no country had so low a fraction of its personal consumption accruing to citizens on the basis of their own market choices. By the late 1980s, North Korean planners had come closer to completely demonetizing their economy than any modern polity this side of the Khmer Rouge. Most goods, services, and supplies that North Korean families consumed were provisioned to them directly by the state, with no “interference” by actual consumer preferences. North Korean planners wished to cede as little control over their command economy as humanly possible.
Pyongyang’s near-total control of the consumption basket, however, presupposed that the state would be supplying its subjects with their daily necessities in the first place. That collapsed in the mid-1990s when the Public Distribution System simply stopped providing the full promised daily food rations to most of the population—and stopped supplying any food at all to some of the population. A terrible number of those who trusted the government to take care of them ended up perishing. To survive the famine, North Koreans had to learn to buy and sell in informal markets that began to spring up—even though such activity was against the law, and some “economic crimes” were punishable by death. The Kim Jong Il government loathed these new private markets, but it needed them to forestall wholesale calamity. Thus commenced the two-steps-forward-one-step-back dialectic of marketization that lasted the rest of the Dear Leader’s life—and after his death, marketization and monetization of the civilian economy gained further steam.
Today it is all but impossible to get by in North Korea on state-supplied provisions alone—and a wide array of goods and services, both foreign and domestic, are available for money in North Korean markets. Although formally prohibited, even real estate is for sale throughout the country, with a vibrant market for private flats in Pyongyang. And a wealthy marketeering caste has arisen: donju, or “money masters,” stereotypically a well-connected official and his enterprising wife, who use political influence as well as entrepreneurial savvy to enter this nouveau riche North Korean elite.
In case you were wondering: Yes, corruption is rife in North Korean markets. It is the necessary lubricant for all North Korean private commerce. In addition, the government expects a big cut, and such funds have been integral to the recovery of the North Korean state.
The marketization and monetization of its consumer economy, in conjunction with new agricultural and commercial incentives and a more tolerant official attitude toward informal activity, laid the groundwork for a domestic-production upswing in North Korea (and a veritable boom in private consumption, although from a very low starting point).
Unlike Asia’s “reform socialism” states, China and Vietnam, North Korea has never made a serious effort to attract private investment from real live capitalists abroad. Pyongyang prefers large-scale foreign projects that are political in nature. Such projects are bankrolled by governments indifferent to profit, which is to say by the foreign taxpayers who can ultimately be left holding the bag. Examples include the ill-fated Kaesong Industrial Complex paid for by South Korea, as well as its doomed Kumgang Tourist Resort. For international trade and finance, the overwhelming bulk of North Korean activity still falls into two categories: 1) politically predetermined, highly subsidized economic relationships, or 2) what we might call “guerrilla warfare” or “outlaw” finance.
Four: North Korea’s Friends
Preferential trade ties with China are pretty much the only game in town for Pyongyang these days. With the virtual shutdown of South Korea’s politically subsidized inter-Korean trade in 2016 following accusations that money from the Kaesong project was being used to fund the North’s missile program, China may now account for close to 90 percent of North Korea’s international commercial-merchandise trade turnover. And North Korea always receives much more than it gives in its arrangement with China, year after year.
There is, to be sure, an element of harsh capitalist bargaining within this overall relationship—but most of that is in the “people to people” bartering and petty trading at the border, largely for consumer goods. At the national level, judging by Chinese customs statistics, North Korea raked in well over a billion dollars a year in net merchandise shipments from China from 2008 through 2014—with no transparency on Beijing’s part about the mechanisms by which this ongoing transfer is financed, much less about the Chinese government’s objectives and intentions in extending this lavish lifeline.
Since 2015, official Chinese numbers suggest that Beijing’s de facto aid is down—but these look like figures deliberately fudged in the face of mounting international demands for sanctions against North Korea. It is at the very least possible that important aspects of Chinese support for the North Korean economy or its defense industries have not yet come to light. Given what is already known, though, it is indisputable that deals with China under the two latest Kims have been key to reviving North Korea’s heavy industrial sector. (For the year 2016, China reported shipping over three-quarters of a billion dollars of machinery and transport equipment to North Korea, 10 times the volume in 2003, when the six-party talks commenced.)
Vital as Chinese support may be to North Korea’s survival and economic revival, North Korea evidences no gratitude for Beijing’s largesse. Pyongyang does not “do” gratitude. Moreover, leadership in Pyongyang knows very well a bitter truth about Chinese aid that they can never utter: namely, that capricious cutbacks in free food from China in the year 1994 were the trigger for the Great North Korean Famine, which became impossible to conceal by 1995.
Apart from its Chinese lifeline, North Korea’s other main sources of international support come from “outlaw” forays into the world economy—including activities tantamount to state-sponsored organized-crime operations. These shady dealings typically attempt to generate revenues for the state that avoid international detection, often relying on the special protections and prerogatives of a sovereign state for cover.
One cannot help but be struck by the industry, ingenuity, and sophistication that have generally kept such schemes one step ahead of international authorities. Koreans in the North can be world-class innovators, too—it’s just that their chosen fields of excellence happen to be in smuggling, drug-running, money-laundering, and the like.
Some of these inventive schemes have been in the news. In recent years, for example, Pyongyang has made unknown millions abroad from what we might call its own style of human trafficking: profiting off the tens of thousands of workers in labor gangs it has sent to China, Russia, the Middle East, and even parts of Europe. No less inventive has been Pyongyang’s apparent monetization of its growing capacity for cyberwarfare through international bank robbery. In 2016, “unknown” hackers relieved the Central Bank of Bangladesh of $81 million in a spectacular heist; in late 2017, similar cyber-fingerprints were detected in a theft of $60 million from a bank in Taiwan. These are just two of many “hit and runs” orchestrated under the Kim Jong Un crime family. And as the WannaCry ransomware attack last year demonstrated by infecting hundreds of thousands of computers the world over, vastly greater dividends from cybercrime may lie just over the horizon.
Then there is North Korea’s signature global service industry: WMD proliferation. For obvious reasons, most of this work never makes the news. No one outside Kim Jong Un’s court probably knows just how much this nefarious business is bringing in these days. These unobservable flows, however, may be consequential. Consider this: Barely weeks after Tehran inked its September 2012 Scientific Cooperation Agreement with Pyongyang, the won suddenly ended its decade-long freefall and finally achieved exchange-rate stability. North Korea may have had additional, still concealed, operations that were also paying off at the same time as that Iranian deal, of course. But either way, the deal clearly marked a turning point in North Korea’s macroeconomic fortunes, and the stabilization of exchange rates and domestic cereal prices probably could not have occurred without an open spigot of foreign cash.
In sum, the hallmarks of Jong-Un-omics would appear to be new revenues from foreign sources, along with the new flows of funds derived from privatization and growth at home. These monies have apparently sufficed not only to stabilize North Korea’s previously toxic currency, and to bring an end to runaway inflation in North Korea’s key private markets, but also to abet Pyongyang’s nuclear and ballistic ambitions. This, at least, would seem to be the most plausible reconstruction of the limited but meaningful evidence from the jigsaw puzzle that is the North Korean economy today.
To repeat: While we should recognize the existence of this economic upswing, we should also keep its scale in perspective. All one need do is consider the sad, stunning space photos of North Korea at night—the satellite shots revealing a territory almost pitch-black, while the rest of Northeast Asia is glowing with light. They attest better than any available statistics to the limits of economic recovery under Kim Jong Un.
Among the other implications of that space imagery, the North simply does not have the pocketbook for a wholesale modernization of its conventional army and a nuke-missile program. For now at least, most of the military’s equipment, apart from critical nuclear-related pockets like submarine production, remains outdated and ill-suited for the tasks originally assigned. Today, Kim Jong Un cannot credibly threaten to roll in and occupy South Korea. But Kim Jong Un is on track to manufacture enough nuclear matches to burn the place down, with Tokyo and Washington thrown in for good measure, in the foreseeable future.
Five: How to Put Pressure on Pyongyang
Given what we know about the North Korean economy, can America and the world community keep Pyongyang from reaching its ultimate nuclear objectives through a real economic-pressure campaign?
We do not know just how close North Korea is to perfecting its weaponization of ballistic missiles, or how many nuclear weapons the North currently possesses. We also do not know as much as we need to about North Korea’s strategic inventories and reserves. If Pyongyang were stopped in its tracks today, its nuclear and missile work would require unwavering vigilance and far-reaching containment for the remaining life of the regime. That said, a serious international campaign of trade and financial sanctions—led by America, ruthlessly executed, and starting immediately—could very significantly slow the pace of Pyongyang’s ongoing nuclear-ballistic march. And if we are in it for the long haul, a serious sanctions campaign could eventually promise the effective suffocation of the entire North Korean military economy.
An international economic campaign of this sort won’t be easy (though America has many more cards in her hand than many now appreciate). It probably won’t be pretty, either. But in any case, it is the world’s last chance to thwart North Korea’s nuclear ambitions by nonmilitary means.
Let’s start with the unpleasant truths. We must recognize that economic pressure will not alter the intentions of the Kim family regime—ever. We must dispense with the fantasy, still inexplicably maintained in various esteemed diplomatic circles and Western universities, that Pyongyang can somehow be pressured—or bribed—at this late stage into changing its mind about its multi-decade march to a credible nuke and missile arsenal. There is no “bringing North Korea back to the table” that ends with CVID—comprehensive, verifiable, irreversible denuclearization. Period.
So much for the bad news. The rest of the news about the outlook for sanctions against North Korea, fortunately, is better than we usually hear.
Many authoritative voices seem to think sanctions have little chance of influencing North Korea’s nuclear trajectory. Economic historians note that the record for coercive economic diplomacy is poor and has been for centuries. Policy wonks and foreign-affairs experts add that successive rounds of UN and international economic sanctions seem to have had no real bite so far against North Korea. These pessimistic assessments, however, misread the prospects for international economic pressure against North Korea on two important counts.
As poor as the general record of coercive economic diplomacy may be, North Korea is not exactly a typical economy. It is an outlier—it’s world-class dysfunctional, recent changes under Dear Respected notwithstanding. The economy is incapable of growth (or for that matter, even stagnation) without steady inflows of financial support from abroad to keep it on its feet. Remember: When net aid from abroad sharply dropped (but did not end) in the 1990s, that was enough to send North Korea spiraling downward into paralysis and mass famine. The North Korean regime, in short, is a poster child for a successful international campaign of economic strangulation. Despite Pyongyang’s nonsense about “self-reliance,” it is uniquely vulnerable to the cutoff of foreign money and subvention.
Kim Jong Un has not yet faced anything even remotely resembling an international campaign of “maximum economic pressure.” The continuing stability of North Korea’s foreign exchange rate and domestic food prices pointedly suggests that international sanctions have not yet greatly impacted North Korea. But few foreign-policy experts, and even fewer general readers, seem aware of how flimsy the array of sanctions imposed on North Korea by the UN and U.S. during the George W. Bush and Obama years actually was.
Consider first the successive rounds of UN Security Council sanctions lodged against the regime since its first atomic test in 2006. China and Russia flagrantly and routinely violate the very sanctions their own Security Council representatives voted to impose. Most countries around the world still ignore them, too. In early 2017, the UN’s Panel of Experts on the sanctions reported that 116 of the UN’s 193 members had not yet bothered even to file implementation reports on the then-latest round (UNSC 2270, levied in response to Pyongyang’s fifth nuclear blast). The previous year, the Panel noted that 90 countries had never reported on any of the sanction resolutions against North Korea (eight at that time, the first of them ratified a decade before that report). And filing a report on these sanctions resolutions is not the same thing as enforcing them. Several countries with whom Washington enjoys ostensibly friendly relations have turned a blind eye to illicit North Korean activities on their soil for many years (Malaysia, Singapore, and some of the Gulf States being among the more egregious examples).
When it comes to Washington’s own economic measures, furthermore, North Korea is still far from being “sanctioned out,” no matter the received wisdom. In the final year of the Obama administration, according to Anthony Ruggiero of the Foundation for Defense of Democracies, fewer entities and individuals from North Korea were under U.S. Treasury Department sanction than those from seven other countries, including Zimbabwe and Sudan. While the Trump administration has been much more serious about sanctioning North Korea, Ruggiero testified that as of late summer 2017, North Korea nonetheless remained less sanctioned than either Syria or Iran. For some mystifying reason, moreover, North Korea was not put back on the State Department’s list of strictured “state sponsors of terrorism” until the end of 2017, after enjoying a nearly decade-long holiday off that roster.
As 2018 commences, three big changes augur well for the prospect of devastating “shock and awe” sanctions against the North Korean system. First: At the end of 2017, the Security Council endorsed a broad new writ and scope for sanctions against North Korea, dispensing with the earlier “marksman” approach of picking off particular military-related firms or individuals and embracing instead the “blockbuster” approach of crippling North Korea’s entire military-industrial complex. The new sanctions, among other things, ban all industrial imports by North Korea, severely cut permitted energy imports, and require UN member governments to “seize, inspect, and freeze” vessels violating some of the new restrictions.
Second: In late 2017, the U.S. Treasury announced new and much more sweeping authority for North Korea sanctions, granting U.S. officials wide discretion to impose what are known as “secondary sanctions.” Henceforth any business or person engaging in any kind of commercial or financial transactions with North Korea could be severely penalized, with punishments including fines, seizure or forfeiture of assets, prohibition against any commerce in or with the U.S., and being cut off from the worldwide clearing system for dollar-based financial settlements.
Finally, and by no means unrelated to these other changes, is the third change: the advent of the Trump administration. Under President Trump and his team, there appears to be a qualitative change in America’s North Korea policy—one that accords the North Korean threat a higher priority, and more unblinking attention, than it has been granted by any of Trump’s predecessors. The White House calls this new approach to North Korea a policy of “maximum pressure.”
Six: The American Role
Trump’s address before South Korea’s National Assembly last November on the North Korea problem was the most incisive, and moving, statement on the topic ever delivered by an American president. Whatever else may be said of him, Trump is keenly aware that the North Korean threat he inherited was allowed to fester and worsen under each of the four men in the Oval Office immediately before him. He appears to have no intention of continuing that tradition.
The Achilles’ heel of the North Korean economy—and thus, of Pyongyang’s nuclear and missile programs—is its existential dependence on foreign aid and outside money. The fortress-prison country is an operation that cannot be sustained on its own. To date, North Korea has skillfully extracted wherewithal and extorted financial concessions out of a largely unfriendly world. To jam the gears of the North Korean war machine, the international community must recognize, and finally begin systematically exploiting, Pyongyang’s unique economic weakness. This will require a campaign of economic pressure worthy of the name—and the pieces for such a campaign are already falling into place.
In broad strokes, what would this “maximum economic pressure” campaign look like? It must be Washington-led, since it will not coalesce spontaneously. To carry it out most effectively, diplomacy will be crucial: Alliance coordination and the building and maintenance of motivated coalitions are obvious force multipliers for this exercise. But the U.S. has unique international strengths that allow us to act unilaterally and with great consequence when necessary.
For starters, now that we ourselves have relisted North Korea as a state sponsor of terrorism, we have a stronger case for pressing governments around the world to shut down the regime’s embassies, trade missions, and other facilities located on their soil. Not necessarily to sever diplomatic ties, much less end all communication, with Pyongyang: just to deprive North Korea of safe havens for its illegal rackets on foreign shores. Given North Korea’s standard operating procedure overseas, affording Pyongyang an embassy in one’s country is like offering diplomatic immunity to the Mafia. The Trump administration has begun some of this advocacy already and has some initial results to show for its troubles. In conjunction with a consortium of like-minded states (including Japan), a full-court press could gain true international momentum. At the very least, this would disrupt some of North Korea’s illegal rackets and reduce the take from them.
Washington can also take the lead in lobbying governments to shut down the North Korean work crews operating within their own countries—these are too close to slave labor for comfort. This need not be quiet diplomacy. The complicit governments in question, including Beijing and Putin’s Kremlin, deserve to be called out publicly if they are intransigent. (The wording of the latest round of Security Council sanctions calls for shutting down such arrangements within 24 months, an amendment Moscow negotiated for—but there is no reason that the U.S. or independent human-rights groups should not try to speed up that timetable.) The U.S. also has options for penalizing trading partners who violate internationally recognized labor standards, which is to say we can affect the cost-benefit calculus for governments that tolerate North Korea’s odious practices in their own backyards.
This brings us to a rather larger diplomatic task: confronting China and Russia about their continuing financial malfeasance on North Korea. The scope and scale of China’s furtive support for North Korea dwarfs Russia’s, of course—but that is no reason to give the Kremlin a pass. These two states have long been playing a double game—one that must come to an end starting now.
Seven: The Russians and the Chinese
Contrary to some hand-wringing in Washington and elsewhere, the U.S. is by no means devoid of options in facing down China and Russia for their economic enablement of the Kim family regime. As already noted, Washington possesses an extraordinarily powerful tool for inducing their compliance: the U.S. dollar—the most important reserve currency in the world economic order. America gets to decide who can, and who cannot, engage in the dollar-denominated financial transactions with the myriad of correspondent banks serving the globe, for which the Federal Reserve Bank is the clearing house. Existing legislation and executive orders already provide the U.S. government with a panoply of instruments for inflicting nuanced and escalating economic penalties and losses on financial institutions, corporations, and private individuals who rely upon U.S. correspondent banks but engage in illegal or forbidden commerce with North Korea.
So far, the United States government has used only minor, pinprick secondary sanctions against Chinese and Russian parties that violate restrictions on dealings with North Korea. Should we choose to impose broader penalties, both nations face potentially major economic costs if they do not address and control such violations.
It is no secret, for example, that the Chinese banking system is highly leveraged and that some of China’s largest banks are in what we might call a financially delicate situation. Does Beijing really want to find out whether one of these major concerns can survive a Treasury Department-Justice Department inquiry for North Korea infringements, much less the weight of actual secondary sanctions—or to find out what happens at home and in international financial markets if it looks as if a major Chinese bank might fail on that account?
If the Kremlin and Beijing believe we mean business, they will have reason to suppress illicit deals with North Korea—but convincing them we mean business is our responsibility. Washington has been curiously hesitant here, possibly for fear that Beijing or the Kremlin, or both, would respond by sabotaging any further UN sanctions. But we now have pretty much what we need from UN resolutions for a campaign of “maximum economic pressure” on North Korea—so the time for horse-trading and slow-walking is over. And while we are at it, these governments’ official economic support for North Korea shouldn’t be off the table. Isn’t it time to spotlight and track those flows, too?
As we work to rein in China and Russia, we should not lose sight of the money that North Korea receives through arrangements with other governments—including states in Africa and the Middle East that receive U.S. foreign aid. Yet much of what Washington needs to do in this economic campaign, alas, is currently unknown. This is a failure of our intelligence community that must be immediately addressed if “maximum economic pressure” is to stand a chance of ending up as more than just a slogan.
By the very nature of intelligence activity, spy agencies cannot take credit for many of their successes. But the U.S. intelligence community doesn’t deserve a slap on the back for its performance in this particular area. It should be something of an embarrassment, for example, that some of the best work mapping out the connections between Chinese front companies and the North Korean military these days should apparently come from a small think tank, C4ADS, that relies entirely on open sources. And that is just one small example of intelligence insufficiency. Our government also appears to know much less than it should about the financial relations between Pyongyang and its backers in Tehran, North Korea’s money ties with terrorist groups, and its adventures in crypto-currencies and other harder-to-trace instruments of finance.
Much of what is currently unknown—by our government—about North Korea’s covert international financial networks and overseas holdings is in fact knowable, given better legwork and intelligence. The story of the U.S. government’s interagency Illicit Activities Initiative (2001–6), which methodically mapped out North Korea’s money trails before being derailed by bureaucratic infighting under the George W. Bush administration, provides an “existence proof” that such research can be done. North Korea’s overseas financial networks have had more than a decade since the demise of IAI to evolve and hide their tracks—so a new IAI-style effort would have to play catch-up.
With the information we could gather from a well-funded and coordinated intelligence initiative, we can help shut down North Korea’s worldwide criminal enterprises, arrest their international accomplices, freeze and seize violators’ overseas assets (not just Kim Jong Un’s assets: think Iran, Syria, Hezbollah, and the rest), and levy potentially devastating fines against commercial and financial concerns that willfully aid North Korea in violating the law. We can also improve the efficacy of existing proliferation-security efforts.
With better intelligence, better international coordination, and the will to get the job done, an enhanced “maximum economic pressure” policy could swiftly and severely cut both North Korea’s international revenues and the vital flows of foreign supplies that sustain the economy. An enhanced Proliferation Security Initiative (PSI), indeed, could use interdiction not only to monitor the flow of goods entering North Korea but also to regulate and, as necessary, suppress that flow. (UN sanctions, by the way, make provision for humanitarian imports into North Korea, a matter the U.S. and others must attend to faithfully.) Yes, this is economic warfare, and it can be conducted with much more sophisticated tools than were available in the 1940s. In fact, it should be possible through such a campaign to send the North Korean economy—and the North Korean military economy—into shock, possibly even in fairly short order.
Eight: Success and Its Failures
If comprehensive sanctions and counter-proliferation against North Korea fail, we enter a new world with darker and much less pleasant options. But what if, by some measure, they turn out to succeed? What then?
In addition to their intended consequences, successful policies always have unintended ones. Three potential consequences of an effective economic-pressure campaign against the North Korean regime deserve special consideration in advance.
The first concerns the role of North Korea’s donju elite in a future where North Korea is increasingly squeezed economically. These “money masters,” who until now have enjoyed waxing wealth and have lived with rising expectations under Kim Jong Un, would stand to suffer very sharp financial loss. What would a serious reversal in the fortunes of this privileged element in North Korean society mean for elite cohesion and for regime dynamics? Even North Korea has domestic politics. Poorly as we may be able to appraise North Korean politics, it would behoove us to try to understand in advance how such a change would alter the realm of the possible within the country—and what new opportunities such internal developments might present.
Second is the all-too-likely possibility that North Korea would careen back into famine under an effective sanctions campaign—and not because Pyongyang would be incapable of purchasing or procuring sufficient food to feed its populace. The reason North Koreans starved last time was the government’s dreadful songbun system, still very much in force today. Songbun is a unique North Korean instrument of social control that carefully subdivides the North Korean populace into “core,” “wavering,” and “hostile” classes, lavishing benefits and meting out penalties according to one’s station. Life chances in North Korea—and no less important, death chances—turn on one’s assigned class. Just as it is a safe bet that virtually no one outside the “core classes” has amassed great donju riches, so too death from starvation is almost entirely consigned to the state’s designated enemies from the “hostile classes.” Only “intrusive aid” (provided on site by impartial outsiders) and public diplomacy, including calling out Dear Respected on this vile practice, stand to mitigate the toll of the impending humanitarian-cum-hostage crisis should “maximum economic pressure” work.
Finally, there are the countermeasures Pyongyang will surely adopt if the economic-pressure campaign is attaining a measure of success. These will be intended to terrify and to break the will of the sanctioners. North Korean leaders are practiced masters of white-knuckle, bared-fang diplomacy—and they would naturally regard the stakes in this contest as particularly high. No national directorate is so expert in brinkmanship or so consummate at carefully gaming through seeming “outbursts” well in advance.
North Korea will test the stomach and the will of the pressure alliance, threatening what it sees as the campaign’s weakest and most exposed elements and ranks. These probes and tests may be military in nature, with a range of options that could well include threats of nuclear war. Pyongyang will try to make Washington and the international community fear that they are facing a “Japan 1941 moment,” with a cornered Kim family regime: a déjà vu of the drumroll that led to World War II in the Pacific, only this time against a nuclear-armed adversary.
This would be a point of incalculable danger. There are good reasons to think North Korea would not resort to first use of nuclear weapons, most compelling among them, its own state-enshrined doctrine known as “Ten Principles for the Establishment of a Monolithic Ideology.” (The essence of this doctrine: The Hive must keep the Queen safe, and at all cost.) But there is no sugarcoating the terrible risks, including risks of miscalculation, inherent in North Korea’s most likely countertactics.
Any way you look at it, North Korea’s adversaries are in for a long and bumpy ride. The alternative to thwarting North Korea’s war drive now is permitting Pyongyang to prepare to fight and win a limited nuclear war in the future, at a time and place of its own choosing, when the situation for America and her allies may be even more perilous.
Like it or not, Pyongyang plays for keeps, and we are in this with them for the long game. The next move is ours.
1 Full disclosure: I am one of those who seriously underestimated North Korea’s resilience in the 1990s. Twenty years ago, I would have thought it almost unimaginable for the North Korean state to survive to this day. Needless to say, subsequent events have proved otherwise, and studying my own mistakes has led to the analysis under way here.
2 Joan Robinson, “Korean Miracle,” Monthly Review, January 1965, Vol. 16, No. 8, pp. 541–549.
3 Central Intelligence Agency, Korea: The Economic Race Between the North and the South: A Research Paper, ER 78-10008, January 1978.
4 Kim Il Sung, Works, Vol. 31 (Pyongyang: Foreign Languages Publishing House, 1987), p. 76.
5 Nicholas Eberstadt and Judith Banister, The Population of North Korea (Berkeley, CA: University of California, 1992).
6 Kim Il Sung, Selected Works, Vol. 5 (Pyongyang: Foreign Languages Publishing House, 1972), p. 431.
7 On this man-made, and completely unnecessary, tragedy, see Stephan Haggard and Marcus Noland, Famine in North Korea: Markets, Aid and Reform (New York: Columbia University Press, 2007).
8 Justin V. Hastings, A Most Enterprising Country: North Korea in the Global Economy (Ithaca, NY: Cornell University Press, 2016).
9 Perhaps the best analysis of this transformation is Kim Byung-Yeon, The North Korean Economy: Collapse and Transition (New York: Cambridge University Press, 2017).
As I write, Michael Wolff’s Fire and Fury has become a mere husk of a book, emptied of everything consumable and tasty. And it’s only been out a week! In the hinterlands, the book is selling briskly, but here in Washington, we already find ourselves in the final phase of a mass hysteria, a hangover that we would call the Woodward Detumescence.
Woodward is Bob Woodward, of course. Every few years, for more than 30 years, Woodward has sent Washington reeling with a book-length, insider account of one administration after another, presenting government as high drama, with a glittering cast of villains and heroes.
The sequence of the symptoms seldom varies. First comes the Buildup. We hear premonitory rumblings: Freshly minted Woodward revelations are on the way! His publisher declares an embargo on the book, mostly as a tease. Another reporter writes an unauthorized report guessing at what the revelations might be. Washington can scarcely breathe. At last the first excerpts appear in a three-part serial in Woodward’s home paper, the Washington Post.
We enter the Swoon.
The excerpts tell of betrayals and estrangements, shouting matches and tearful reconciliations, tough decisions and disappointing failures of nerve, all at the highest levels of government. Woodward goes on TV shows to explain his findings. Sources attack him; he stands by his book. The frenzy intensifies, the breathing is labored, until, at last, comes the Spasm, as all the characters from the book refuse to comment on a “work of tabloid fiction.”
Then the newspaper excerpts end, there is a collapsing sigh, a dying fall, and the physical book, the thing itself, appears. The text seems an afterthought, limp as a wind-sock and, by now, even less interesting. If there were more revelations to be found in its pages, after all, we would have read them already. We skulk back to the routines of what passes for normal life in Washington, slightly abashed at our momentary loss of self-control. This is the Woodward Detumescence. Shakespeare foresaw it in a sonnet: “the expense of spirit in a waste of shame.”
The Fire and Fury frenzy omitted some of these steps, prolonged others. It was touched off by an excerpt in New York, appearing a week before the book’s original publication date. Running to roughly 7,000 words, the excerpt was densely packed and so juicy it should have come with napkins. The article’s revelations about White House backbiting and self-loathing are by now universally known, and have been from the moment the excerpt hit the Web. One thing they make plain is that Michael Wolff bears little resemblance to Bob Woodward. Over a long career, our Bob has shown himself to be a tireless and meticulous reporter. He is a creature of Washington, besotted by government; Woodward never found a briefing paper he wouldn’t happily read, as long as it was none of his business.
Wolff, on the other hand, is an incarnation of Manhattan media. He’s a 21st-century J.J. Hunsecker, the gossip columnist in the great New York movie Sweet Smell of Success, although, unlike J.J., he has a pleasing prose style and a sense of irony. His curiosity about the workings of government and the shadings of public policy is nonexistent. “Trump,” Wolff writes with typical condescension, “had little or no interest in the central Republican goal of repealing Obamacare.” Neither does Wolff. Woodward would have given us blow-by-blow accounts of committee markups. Wolff mentions Obamacare only glancingly, even though it was by far the most consequential failure of Trump’s first year.
If you want to learn how Trump constructs that Dreamsicle swirl that rests on the top of his head, or the skinny on Steve Bannon’s sartorial habits, then Wolff is your man. He tries to tell his story chronologically, but he occasionally runs out of things to say and has to vamp until the timeline lets him pop in a new bit of shocking gossip. Early in the book, for example, after he has established that Trump is reviled and mocked by nearly everyone who works for him, Wolff leads us into a tutorial on The Best and the Brightest, David Halberstam’s doorstop on the 1960s White House wise men and whiz kids who thought it would be a great idea to get in a land war in Southeast Asia. He calls Halberstam’s book a “cautionary tale about the 1960s establishment.” Wolff’s chin-pulling goes on for several hundred words. Apparently, Steve Bannon had had the book on his desk.
This is interesting, I guess, and so are the excessive digressions about New York real estate, Manhattan’s media culture, the evolution of grande dames into postfeminist socialites, and many other subjects that are orthogonal to the book’s purpose. If you’ve bought Fire and Fury, chances are you wanted to learn things you didn’t know about the first year of the Trump administration. The New York excerpt was chockablock with such stuff, told in sharply drawn scenes and vivid, verbatim quotes. But the book dwells much more on general impressions, flecked here and there with scandalous asides. In these longueurs—most of the book—Wolff writes at an odd remove, from the middle distance. The prose loses its immediacy and becomes diffuse.
He’s not so much padding his book as filibustering his readers, perhaps hoping to deflect a reader’s attention from another revelation: He really hasn’t delivered the goods. All of Wolff’s most scandalous material was filleted and packed into the New York excerpt. Listening to discussions among friends and colleagues, I keep hearing the same items, all from the magazine: Staffers think Trump might be (literally) illiterate, Steve Bannon thinks the Mueller investigation puts Trump’s family in legal jeopardy, the president uses vulgar language when talking about women. He is a child, Wolff wants us to know, and the disorder of his government is directly traceable to that alarming fact.
And it is indeed alarming, but nobody who has followed Trump’s Twitter feed or watched his news conferences will think it’s news. Wolff wrote a scintillating 7,000-word magazine article; the problem is that he spread it over a 328-page book. The rumor has gone around (hey, if he can do it, so can I) that before submitting his manuscript, Wolff warned his publisher that it didn’t contain much that was new.
This explains a lot. Wolff clearly was unprepared for the explosion set off by the magazine article. You could see it in his halting explanations of his journalism techniques. When his quotes were questioned, he let it be known that he had “dozens of hours” of tapes. (Other news reports inflated the number to hundreds.) When quotes continued to be questioned, he was asked, by colleagues and interviewers, to release the tapes. He refused. Wolff said his book threatens to bring down the president—on evidence that he alone has and won’t produce.
Spoken like a true journalist! Much has been made of this modern Hunsecker’s techniques. One explanation for the candor of his sources is that Wolff gained their confidence by misleading them about his intentions; they had concluded he was writing a book that would show the administration in a kinder light. “I said what I had to to get the story,” he proudly told one interviewer. Many of his colleagues in the press have shrugged at his willful misdirection—his deception, in fact—as a standard trick of the trade.
They’re probably right. But they demonstrated again the utter detachment of journalists from normal life. Whole professions are generally and rightly maligned—trial lawyers, car salesmen, lobbyists—because ordinary people see that prevarication is built into their work. When it comes to the people who write the books they read, they have a right to ask how far the deception goes. If a writer will mislead his sources, how can we be sure he won’t do the same to his readers?
“My evidence is the book,” Wolff responds. I’m not sure what he means. In any case, as the Detumescence recedes, it becomes clearer that his evidence is thin. The book isn’t particularly good journalism, but it’s a triumph of marketing. Our Trump hatred has been targeted with such precision that we’ll lower any standard to embrace Fire and Fury, even if the tale as told signifies nothing, or nothing much.
An uncontroversial museum still manages to offend the ignorant
At one point during his 2000 campaign, George W. Bush gave his listeners a folksy admonition: “Don’t be takin’ a speck out of your neighbor’s eye when you got a log in your own.” This amused Frank Bruni of the New York Times, who called it “an interesting variation on the saying about the pot and the kettle.” Bruni’s words in turn amused the substantial portion of Americans who knew that Bush was actually quoting Matthew 7:3. To them it was simply unimaginable that someone could graduate Phi Beta Kappa with a degree in English and subsequently study at the Columbia School of Journalism, as Bruni did, without having once encountered the Sermon on the Mount. The anecdote revealed the extent to which, in the space of a few generations, America went from habitual Bible reading to biblical illiteracy, and of the most abject and utter kind. This is the justification for the Museum of the Bible.
The Museum of the Bible, which opened in Washington, D.C., in November, is an enterprise of appropriately pharaonic ambition. At a capacious 430,000 square feet, it cost half a billion dollars to build, all of it contributed privately. It is the brainchild of Steve Green, the president of Hobby Lobby, the chain of arts-and-crafts supply stores that successfully challenged the contraception mandate of Obamacare. Indeed, to those who felt the Burwell v. Hobby Lobby decision was a catastrophic setback to the separation of church and state, the coming of the Museum of the Bible seemed nothing less than the physical manifestation of that threat—an unwelcome expression of evangelical political power standing in plain sight of the Capitol. Burwell v. Hobby Lobby has loomed large in the coverage of the museum, as has the Green family—as well as the $3 million fine levied on Hobby Lobby for illegally importing cuneiform tablets from Iraq.
But those who looked forward to exposing the museum as a bigoted and ignorant enterprise, with a laughably literal view of biblical truth, have been bitterly disappointed. Its exhibitions are conspicuously even-handed and scholarly, and not at all sectarian. The Museum of the Bible is no vehicle of theological indoctrination. If anything, it errs in the other direction. When it was first incorporated as a nonprofit organization in 2010, it pledged itself “to inspire confidence in the absolute authority and reliability of the Bible.” It has quietly lowered its sights since, and now seeks only “to invite all people to engage with the history, narrative, and impact of the Bible.” This makes the museum less objectionable (who can object to an invitation?), but a less incendiary Bible is also a less interesting one. The danger of the Museum of the Bible is that by sidestepping the question of biblical truth it might downgrade the Good Book, as it were, into one of the Great Books.
With all their resources, the Green family might easily have commissioned a celebrity architect to build a prodigy of a museum. But they did not want a building that would compete with its contents. Instead, they bought a 90-year-old cold-storage warehouse two blocks south of the Mall, and into its windowless brick shell they inserted six stories of exhibition and administrative space. The interior is intelligently planned but hardly remarkable, and nothing about its materials, finishes, or details speaks of the Bible or antiquity. If anything, it has the glossy impersonal cheeriness of contemporary hotel architecture.1
The heart of the museum is in the exhibitions of the third floor (The Stories of the Bible) and the fourth (The History of the Bible). These are utterly different in texture and tone, but they work in tandem—one delivering sensation and the other information. This is hardly a new distinction; it is the difference between the stained-glass window and the sermon.
The Stories of the Bible are told through crowd-pleasing “immersive” galleries—the fashionable term for displays in which a coordinated battery of sound effects, musical cues, dramatic lighting, and moving forms is combined to induce an overwhelming sensory experience in the viewer. These were devised by BRC Imagination Arts, a design firm that specializes in corporate branding—as they put it, in “creating emotionally engaging experiences that generate lasting brand love.” When it comes to emotionally engaging material, the exhibits Genesis and Exodus offer at least as much as the Heineken Experience (another recent BRC creation), and here the designers have outdone themselves. Noah’s Ark presents “a unique, stylized representation of the great flood,” they tell us.
“Stacks of boxes tower over them. Inside each box are artistic representations of animals—two by two—lit by flickering candlelight. Guests hear the raging of the storm outside and the creaking of the wooden ship.”
Somewhat later, although not until they have seen “a hyssop bush bursting into flames from the story of Moses,” visitors themselves can part the Red Sea, or an abstraction thereof, created by a web of taut metal cables shimmering under blue light. (It is curious how the highly cinematic events of the Hebrew Bible lend themselves to abstract expression.)
By contrast, the World of Jesus is rendered in literal terms, by means of a realistic re-creation of a first-century village complete with actors in period costume. In the Galilee Theater, visitors can watch a short film and see John the Baptist confronting King Herod (as played by John Rhys-Davies). Even those of us who are allergic to historical reenactments will see that it is carried through with extreme competence and attention to detail. What is there is done well; it is what is not there that has caused a good deal of quiet grumbling. To the bafflement of many, the central events of the Christian Bible—the Crucifixion and Resurrection—are not represented. Were there fears that a scene of unspeakable horror would disturb the museum’s upbeat, family-friendly ambiance? Or is it that its academic advisors come from the mainstream of contemporary biblical studies, for whom the Resurrection is not a truth but a trope? Perhaps both factors are at play.
Another curious aspect of the display is unhappy but understandable: The Hebrew and Christian Bibles are rendered as two segregated and self-contained experiences, and like oil and vinegar, the exhibition paths are not allowed to mix. Unfortunately, the visitor who has waited in line for the one is unlikely to stand in line again for the other. One can appreciate that the organizers wanted to avoid a linear sequence in which the Hebrew Bible serves as mere prelude to the New, but in the process, the relationship between the two is lost. Surely a compromise might have been found, perhaps with the occasional physical passage between the two, so that the viewer might move back and forth and make his own connections—alas, a proposition that is heretical in today’s world of manipulative museology.
If the third floor gives us the stories in the Bible, the fourth gives us the book itself—not only the text but its translations, copies, orthography, printing, binding, illustrations, and all else that is associated with a literary artifact. The oldest objects here (although of disputed authenticity) are tiny fragments of the Dead Sea Scrolls, and from them to the most recent translations, one is struck by the fastidious probity with which the text was transmitted. Here we learn the high stakes of tampering with the Bible in the story of how the 14th-century theologian John Wycliffe was posthumously excommunicated for daring to make the first English translation. We also learn how the Bible acted to codify and order regional dialects into a national language; Martin Luther’s translation did this for the German language just as the King James translation did a century later for English. A remarkable display shows the innumerable phrases from the Bible that have entered vernacular speech in the world’s languages, some of which I did not know (e.g., “den of thieves,” “suffer fools gladly,” “at their wit’s end”).
Here one senses a certain reservation—a curatorial suspicion, perhaps, that vellum manuscripts and printed books are intrinsically boring. There is nothing an exhibition designer fears more than a bored visitor. This would account for the rather plaintive effort to provide visual relief in the form of arresting objects: a facsimile of the Liberty Bell with its inscription from Leviticus, a tableau of books burned by the Nazis, and statues of Galileo and Isaac Newton. These diversions suggest that the designers did not trust the words themselves and their hotly disputed variants and interpretations to generate interest on their own.
This is a lost opportunity. For instance, the history of the English translations would have been far more effective with a comparison of representative examples. One might illustrate various renderings of the 23rd Psalm, juxtaposing the lapidary King James version (“The Lord is my shepherd; I shall not want”) with the explanatory translation of the International Standard Version (“The Lord is the one who is shepherding me; I lack nothing”) or the willful flatness of the Good News Bible (“The Lord is my shepherd; I have everything I need”). A few examples from the recent push to purge the Bible of any and all sexist language would also have been eye-opening. To refer to this trend blithely in passing, as the wall labels do, without confronting the viewers with the sobering reality of a gender-neutral Bible is a sign of either haste or indifference.
And those who are not fascinated by the fact that the neuter possessive its appears just once in the entire King James translation still have the chance to take a peek at Elvis Presley’s personal copy of the Bible.
The truth is, the Museum of the Bible is as innocuous, gregarious, multifaceted, and congenial an institution as one might have hoped. It certainly does not preach biblical inerrancy; the attentive reader will see that Noah’s flood is anticipated by the much older flood story in the Epic of Gilgamesh, complete with divine instructions on building the ark.
Nonetheless, the museum has been greeted with extraordinary hostility, although of a strangely unfocused sort. It has hardly been “dogged by scandal,” as Business Insider charged, apart from the importation of antique materials with a false provenance (something of which the Metropolitan Museum of Art and the Getty Museum have both been guilty). The real objection is not its business practices or its theology (which it wears so lightly as to be invisible), but rather that it comes from the wrong side of the cultural tracks. One has the sense that the museum is a social faux pas, that the wrong guests have crashed the party, blundering uninvited into Washington and violating rules of which they are ignorant. CityLab, the digital magazine of the Atlantic, expressed this attitude most pithily when it called the museum “pure, 100 percent, uncut megaplex evangelical white Protestantism…megachurch concentrate.”
The charge that the museum presents a narrow and exclusively white version of Protestantism is undercut by a single visit; the audience is comprehensively ecumenical and international. But it has been repeated endlessly nonetheless, in part because of the recent publication of Bible Nation: The United States of Hobby Lobby, by Candida R. Moss and Joel S. Baden—a furiously ambitious attempt to discredit the museum, its theology, its founders, and Hobby Lobby itself. (This may be the first time a book has been published condemning a museum before it was built.) Moss first came to public attention in 2013 with The Myth of Persecution: How Early Christians Invented a Story of Martyrdom, which charges early Christians with forging accounts of their suppression. Bible Nation is written in a similarly debunking spirit. For her, the “thousands of fragments of contradictory material” in the Bible make it pointless to try to make of it a coherent or meaningful document. The insights of contemporary biblical scholarship, she says with conspicuous exasperation, ought to be “a faith killer.”
Clearly they have been for her. But if anything, the museum’s fourth floor testifies to the opposite: This is a building built by believers for whom the analysis of the materials contained within is a noble task. The curators have taken pains to get it right, as did those scribes who through the millennia worked to reconcile the discrepancies, to choose among the contradictory variants the ones that are most rigorously supported. And where the conflicting documents are irreconcilable—as between the two opening chapters of Genesis, or among the four Gospels—the procedure has always been to preserve multiple sources rather than impose an arbitrary uniformity. In the end, the Museum of the Bible pitches it about right.
Its greatest surprise is that it makes no truth claim. The central propositions of the Hebrew Bible (God’s covenant with his chosen people) and the Christian Bible (Christ’s Resurrection) are subordinated to the existence of the Books that carry those propositions. One might imagine that a museum devoted to other monumental culture-shaping books, say The Iliad and The Odyssey, would look similar in approach.
And of course they are right to have done so. The place to make claims to the truth in these cases is a church or synagogue, not a museum. But even the lesser claim that the Museum of the Bible makes, that the Bible is a foundational document of our civilization, is to many an unwelcome one. And as biblical ignorance grows, the claim grows progressively more unwelcome. The Bible seems to be one of those books that the less people know about it, the less they like it. And for those who know it only as a “Bronze Age document” (one of Christopher Hitchens’s favorite epithets) and from some of the livelier passages in Leviticus, it is an offensive absurdity.
Writing in the Washington Post, the novelist and art historian Noah Charney asserted that “in Washington, separation of church and state isn’t just a principle of governance, it’s an architectural and geographic rule as well.” It’s unclear who established such a rule, and in any case, the “principle” of the “separation of church and state” does not originate in the Constitution. Rather, its source is to be found in Matthew 22:21: “Render therefore unto Caesar the things which are Caesar’s; and unto God the things that are God’s.” We all carry a stock of mental habits and moral values, and a language with which to express them, that ultimately derives from the Bible, whether we have read it or not. The Museum of the Bible merely proposes that we read it. And for all its shortcomings and missed opportunities, and all its fits of cuteness (there’s a Manna Café), it does so with refreshing sincerity and surprising effectiveness.
1 The building has one passage of real brilliance. The entrance portal on Fourth Street is flanked by a pair of immense bronze panels, nearly 40 feet high, that call to mind Boaz and Jachin, the mighty bronze pillars that guarded Solomon’s Temple. In fact, they are panels of text inscribed with the opening lines of Genesis, as printed in the Gutenberg Bible of 1454, the first mass-produced book to use moveable metal type. The letters are reversed, confusingly, until one realizes that this aids in making souvenir rubbings that themselves embody the printing process. The genesis evoked here is that of universal literacy and the cultural transformation wrought by the printed book.
Review of '(((Semitism)))' By Jonathan Weisman
Now, two years later, Weisman has published a book about anti-Semitism—and, more specifically, about the supposedly grave threat to Jews springing from the alt-right and the Trump administration. (((Semitism))), for such is the book’s title, suffers from two grave ills. First, Weisman believes that political leftism and Judaism are identical. Second, he knows little or nothing about the political right, in whose camp he places the alt-right movement. Combine these two shortcomings with a heavy dose of self-regard, and you get (((Semitism))): a toxic brew of anti-Israel sentiment, bagels-and-lox cultural Jewishness, and unbridled hostility toward mainstream conservatism, which he lumps together with despicable alt-right anti-Semitism.
According to Weisman, Judaism derives its present-day importance from the way it provides a religious echo to secular leftism. This is his actual opening sentence: “The Jew flourishes when borders come down, when boundaries blur, when walls are destroyed, not erected.” Thus does he describe a people whose binding glue over the millennia is a faith tradition literally designed to separate its adherents from those who are not their co-religionists.
This ethnic-Jew-centric perspective leads Weisman to reject not merely Jewish observance, which he finds parochial and divisive, but the tie between Judaism and Israel, to which he devotes a chapter subtly titled “The Israel Deception.” He laments: “The American Jewish obsession with Israel has taken our eyes off not only the politics of our own country, the growing gulf between rich and poor, and the rising tide of nationalism but also our own grounding in faith.” He sneers at Jews who promote the “tried and true theme of the little Israeli David squaring off against the giant Arab Goliath.” Weisman believes, like John Mearsheimer and Stephen Walt, that members of both parties are guilty of “kissing the ring” at AIPAC, of “turn[ing] to mush when the subject was Israel.” In fact, Weisman says, the anti-Semitic BDS movement on college campuses “is worrisome as much for what it says about the American Jew’s inextricable links to Israel as for what it says about anti-Semitism.” In his view, “Barack Obama was the apotheosis of liberal internationalism.…The Jew thrived.”
Thus Weisman has this to say about his infamous Iran-deal chart: “I had my own brush with fratricidal Jew-on-Jew violence during that heated debate.” Was Weisman attacked? Assaulted? No, he received some nasty notes in response to running a chart. Weisman says he found the uproar “absurd” and laments that he is “still hearing about it.” Poor lamb.
Weisman gets it right when he writes about the mainstreaming of the alt-right—the winking and nodding from Breitbart News and Donald Trump himself, the willingness of many in the mainstream to reward alt-right popularizers like Milo Yiannopoulos. (I left Breitbart in March 2016 due to differences regarding our coverage of the presidential campaign.) Weisman is at his best when describing the origins of the alt-right and its infiltration of more widely read outlets.
But he can’t stop there. Instead, he seeks to impute the alt-right to the entire conservative movement and builds, Hillary Clinton–style, a fictitious basket of deplorables amounting to half the conservative movement. He cites “Christian fundamentalist” Israel supporters, to whom he wrongly attributes a universal, apocalyptic End Times motivation. He condemns anti-immigration advocates, whose opposition to the importation of un-vetted Muslim refugees he likens to anti-Semitic anti-immigrant movements of years past. He reviles “anti-feminists,” those who oppose political correctness in video games, Republican Jewish Coalition members who laughed at Trump making a Jewish joke, and free-speech advocates supposedly engaged in “forcible seizure of the free-speech movement” (a weird charge to level, considering that it cost Berkeley $600,000 to prevent Antifa from burning down the campus when I visited). In other words, pretty much anyone who didn’t vote for Hillary Clinton gets smeared with the alt-right brush, outside of those specifically targeted by the alt-right.
The problem of alt-right anti-Semitism, Weisman thinks, is just a problem of anti-leftism. If we could all just give money to the notoriously left-wing propaganda-pushing Southern Poverty Law Center, watch Trump-referential productions of Eugene Ionesco’s Rhinoceros at the Edinburgh National Festival (yes, this is in the book, and no, it is not parody), ignore anti-Semitic attacks at the Chicago Dyke March (I am not making this up), slap some vinyl signs on synagogues (no, I am still not making this up), and “not get too self-congratulatory” (seriously, guys, this is all real), all will be well. In the end, Weisman’s goal is to build a coalition of ethnic and political groups, cobbled together in common cause against conservatives—conservatives, he says, who represent the alt-right support base.
As the alt-right’s chief journalistic target in 2016, I’m always happy to see them clubbed like a baby seal. And there is a good book to be written about the alt-right. At times, Weisman borders on it, particularly when he seeks to investigate the bizarre relationship between Trump and the trolls who worship him.
But Weisman’s ardent allegiance to leftism leads him to misdiagnose the problem, to ignore the rising anti-Semitism of his own side (the DNC nearly elected anti-Semite Keith Ellison its leader last year), to prescribe the wrong solutions, and, most of all, to react in knee-jerk fashion to the alt-right by flattering himself as the epitome of everything the alt-right hates. Thin as the paper it was printed on, (((Semitism))) is a failure of imagination.
Review of 'The People vs. Democracy' By Yascha Mounk
The save-democracy writers have generally taken two tacks in answering it. Some see a simple replay of the previous century: The West’s authoritarian spirit has resurfaced, they say, and seduced the multitudes once more. It is up to heroic liberals to fight back, as their forebears in the 1940s did. But others have tried to trace today’s crack-up to liberal missteps or even to flaws in the liberal-democratic idea. This is a more useful avenue for those of us concerned with the preservation of self-government.
Yascha Mounk’s The People vs. Democracy wants to be the latter kind of (subtle, thoughtful) book but too often ends up making the cruder arguments of the former. The author, a lecturer on government at Harvard, argues that while liberals took liberalism’s permanence for granted, voters became “fed up with liberal democracy itself.” Elections across the developed world, in which fringe characters and populists routed mainstream establishments, provide the main evidence. Mounk has also collected mountains of public-opinion data, mainly from the World Values Survey, which show a deeper transformation: People in the U.S. and Europe increasingly reject democratic principles and even hanker for strongman authority.
Fewer than a third of U.S. millennials “consider it essential to live in a democracy.” One in four believes that democracy is a bad form of government. One-third of Americans of all ages now favor some sort of strongman rule, without checks and balances, and one in six would prefer the strongman to don a military uniform. Similarly, a third of German respondents and an astonishing half of those from Britain and France support strongman rule. Parties of the far right and far left are rapidly expanding their appeal, particularly among young people. There are many more depressing statistics of the kind, presented in numerous charts and graphs throughout.
Mounk thinks there are two factors at play in these attitudes. The first is the emergence of illiberal democracy, or “democracy without rights,” as a serious rival to the current order. Vladimir Putin in Russia, Recep Tayyip Erdogan in Turkey, Narendra Modi in India, and Viktor Orbán in Hungary, among others, exemplify this model. Once elected, these leaders chip away at individual rights and independent institutions until democracy is all but hollowed out and it becomes nigh impossible to remove the ruling party from office. Mounk strongly suspects that the Trump administration plans to pull something like this on the American public, though thus far the president’s illiberal bluster has proved to be just that.
The second factor is undemocratic liberalism, or “rights without democracy.” Here Mounk has in mind technocratic liberalism’s drive to remove an ever-growing share of policy decisions from the purview of voters and their elected representatives. This has been necessitated by the complexity of contemporary problems such as climate change and international trade, Mounk contends. Yet rights without democracy has generated mistrust and cynicism. Liberals, he says, should aim to “strike a better balance between expertise and responsiveness to the popular will.”
Mounk’s sections on the damage wrought by undemocratic liberalism should be instructive to his fellow liberals. But conservatives have for years stamped their feet and pulled their hair over the same phenomenon, only to be ignored by elite liberals on both sides of the Atlantic. Right-of-center readers might be forgiven for sarcastically muttering “no kidding” as Mounk takes them on a guided tour of liberal folly.
Conservatives have been warning about administrative bloat, for example, since at least the first half of the 20th century. It turns out that they had a point. Writes Mounk: “The job of legislating has been supplanted by so-called ‘independent agencies’ that can formulate policy on their own and are remarkably free from oversight.” Ditto activist judges: “The best studies of the Supreme Court do suggest that its role is far larger than it was when the Constitution was written.” And ditto the European Union’s democratic deficit: “To create a truly ‘single market,’ the EU has introduced far-reaching limitations” on state sovereignty.
He also strikes upon the idea that nations really are different from one another, and in politically significant ways. “After a few months living in England,” the German-born author confesses, “I began to recognize that the differences between British and German culture were much deeper than I imagined.” No kidding. What about the anti-Western monoculture that lords over most college campuses? Here, too, the right was on to something. “Far from seeking to preserve the most valuable aspects of our political system,” Mounk writes, liberal academe’s “overriding objective is, all too often, to help students recognize its manifold injustices and hypocrisies.”
Mounk’s discovery of these core conservative insights, however, doesn’t spur a rethink of his reflexive disdain for conservatives. This is most apparent in his coverage of American politics. The book is supposed to be a battle cry for democracy to rally left and right alike. Yet, with few exceptions, conservatives and Republicans are cast as cynical operators who rely on underhanded tactics and coded racism to undermine democracy and ultimately abet the populists. (Hillary Clinton and Barack Obama receive adulatory treatment.)
He describes Senate Majority Leader Mitch McConnell’s refusal to hold hearings for Merrick Garland, Obama’s final Supreme Court nominee, and GOP filibustering of Democratic legislation as “abuse[s] of constitutional norms” (they weren’t). But he pooh-poohs popular outrage at Clinton’s unlawful use of a private email server and elides the Obama Internal Revenue Service’s selective targeting of conservative nonprofits ahead of the 2012 election.
He also underestimates a third development of recent years—liberal illiberalism (my term, not his)—a liberalism that not only lacks democratic legitimacy but seeks to destroy, in the name of tolerance, the fundamental rights of those who stand in the way of full-spectrum progressivism. This is the kind of liberalism that compels nuns to pay for contraceptives and evangelical bakers to bake gay-wedding cakes, silences conservative speakers on campus, and denounces sushi restaurants as “cultural appropriation.”
Mounk isn’t ignorant of these tendencies, and he wants liberals to ease up (a bit). Yet, because he maintains that the censorious left’s heart is in the right place, he can’t seem to reach the necessary conclusion: that much illiberalism today comes, not from the right, but from ostensibly liberal quarters, and that this says something about the nature of contemporary liberal ideology. The true illiberal villains, for Mounk, are only ever the Modis, Trumps, and Orbáns—plus the troglodytes down South. Well-intentioned liberals who back censorship, he writes at one point, “ignore what would happen if the dean of Southern Baptist University…were to gain the right to censor utterances” he dislikes.
In fact, there is no such institution as “Southern Baptist University.” According to the most recent rankings from the Foundation for Individual Rights in Education, however, four of the 10 worst U.S. colleges for free speech last year were public schools located in blue states, while five were blue-state private or religious schools with longstanding reputations for progressivism (Mounk’s own Harvard among them).
His quickness to frame Southern Baptists as illiberal bogeys is telling and suggests that, for all its exhortations against liberal high-handedness, Mounk’s book comes from the same high-handed place. That attitude colors the author’s approach to questions of nationalism and immigration that are at the heart of the current ferment. He concedes that liberal democracy is compatible with voter demand for limits on mass migration. But he can’t help but attribute those demands to irrational “resentment,” eschewing completely the—perfectly rational—fear of Islamist terrorism.
He sees the nation-state as an “imagined community” to which too many of our fellow citizens remain attached. Ideally for Mounk, the empire of rights and procedural norms would thrive independently of nationhood, civilizational barriers, and sacred communities. For now, he allows, liberals unfortunately have to contend with these anachronisms. His view is an improvement over the liberal transnationalism that is still committed to doing away with borders altogether, even after the popular counterpunch of 2016. Still, why should Poles or Hungarians or Britons remain politically attached to Polish, Hungarian, or British democracy? What is it about Polishness as such that matters to Poland’s democratic character? Mounk has no answers.
No wonder, finally, that the author never satisfactorily links liberalism’s turn against democracy and the rise of illiberal democrats. He can never bring himself to say outright that the one (rights without democracy) is begetting the other (democracy without rights). Liberals, of the classical and the contemporary varieties, badly need a book that offers such uncomfortable reckonings. Yascha Mounk’s The People vs. Democracy is not it.