A Commentary Symposium
This Symposium was made possible by a generous grant from the Gale Foundation.
I am pessimistic. Let me count the ways!
First of all, there is our government: no longer merely dysfunctional, it has now entirely ceased to operate in a coherent manner. The sorry spectacle of the impasse on Capitol Hill over the summer disgusted the American electorate, a majority of whom now believes Congress should simply be dismissed. It has become all too apparent that, with a few valiant, struggling exceptions, the members of Congress no longer represent their constituents and have been bought and paid for by various corporate powers and financial institutions. Perhaps we should require them all to wear uniforms with logos, like NASCAR drivers, so that we can identify their corporate sponsors. The fact that a significant majority of American voters would like to raise taxes for the very rich and to preserve Medicaid and Medicare, while Congress is swinging in the opposite direction, is proof enough that they’ve stopped representing us.
And what about our national debt? Congress can bicker over limiting “entitlements” all they want, but the problem cannot be resolved without overhauling the health-care system and radically reducing military engagements—issues that our government’s corporate and military-industrial sponsors will not allow onto the table. The total price tag for the Bush/Obama wars in Iraq and Afghanistan is now estimated at something between $3.7 and $4.4 trillion, if one counts the medical costs of caring for maimed and traumatized veterans.
As for unemployment: even if our Democratic president came up with a truly brilliant jobs program and our Republican-led Congress actually passed it, we would still be dealing with the basic facts that industrial and manufacturing jobs are disappearing and much of the American workforce is not prepared for the new information-technology jobs that are coming along. Workers in India and other offshore sites are just so much cheaper, and so much better educated.
Which leads me to one of the root causes of my long-term pessimism: the state of American education. We are constantly confronted with dismal statistics on test scores, our students’ performance relative to other developed nations, etc. But what is the reason for this, and what is the solution? It’s not an answer, I think, to throw more money at the problem. As the parent of college students and as a teacher of college students, I’ve noticed that kids from “good” high schools (both public and private) are often as ill-prepared as any others. The problem seems to me a deep-seated one: we simply have no consensus as a nation, no unified philosophy of what an educated person should know. Perhaps this relates to the breakdown of government; we have arrived at no consensus as a nation about what a government should do.
As he took office in 1789, George Washington admitted in private that he doubted the Union would last for more than two decades. It has lasted, if dysfunctionally, for more than two centuries. But it is no longer a nation he would recognize, and its government is certainly not one he’d be proud of.
Brooke Allen is the author, most recently, of The Other Side of the Mirror: An American Travels Through Syria (Paul Dry Books).
Twelve years ago, I was asked by First Things to write my predictions for America in the new millennium. I decided to look at the question from the perspective of an ancient Roman in the year 0 trying to predict his city’s own next millennium. Self-confident Rome in many ways resembled self-confident America in late 1999: it was robustly prosperous, the world’s lone superpower, heir to a vast and rich storehouse of Western civilization, and overwhelmingly dominant culturally. The Roman world stretched—or was on the verge of stretching—from nearly all of Western Europe well into Central Asia. I observed that Rome might have seemed invincible in the year 0, but by the year 1,000 its Western European heartland was in shambles, there was little left of its empire, and the world had changed in ways that would have shocked that ancient Roman. I wrote that America’s future was equally unpredictable, and that by the year 3000 we might well be yet another long-vanished civilization whose downfall will be puzzled over by archaeologists and historians.
What I could not predict was how quickly the downward slide would come. As with ancient Rome, the signs were already present: the barbarians at the gates (9/11 was months away); the demographic implosion of populations of European descent; the cultural decadence; and, worst of all, the drastic loss of national self-confidence and self-direction. And now, the statistics everywhere you look are ghastly: an unemployment rate of 9.1 percent; all-time-record-setting foreclosures; a 40-percent out-of-wedlock birthrate; uncontrollable illegal immigration (12 million illegals currently living in the United States, compared with 5 million in 1996); a Federal Reserve that seems to be aiming at Weimar Republic–level inflation; swollen, immovable unionized bureaucracies at every level of government; a K-12 education system that is one of the worst in the industrialized world; and an entitlement burden that eats up half the federal budget. Over all this looms the colossal black shadow of our $14 trillion national debt—an amount so massive that we can’t even imagine what the number really means, let alone figure out how to repay it.
Lone superpower? Tell that to China. Or for that matter, natural resources–rich Russia. We seem unable to deal firmly with militant Islamists—one group of people that is not demographically challenged and is systematically replacing Europe’s declining population. It is a horrifying sign of the decline of our national will that not only has 9/11 not yet been properly avenged, but public authorities are pushing a plan to build a mosque on one of the devastated sites that, until a public-relations makeover, bore the Islamic-triumphalist name “Cordoba House.” Another sign of national weakness: ObamaCare. Not only because it’s an expensive, wasteful, intrusive health-care scheme, but because enough Americans were willing to turn health care over to the government in the first place, ending our proud and longtime resistance to socialized medicine, a resistance that once helped make American medical care the best in the world.
Some of these problems may be temporary. We can elect a better president and a better Congress whose ideas about curing the recession do not consist solely of raising taxes, further bloating the government, and crippling us with even more debt. I don’t know what we can do about everything else. What is called for are deep cultural changes that may come too late. I hope not. But I have to remember that Rome did disappear. And one of the driving forces behind the disappearance of its last Eastern remnants was militant Islam.
Charlotte Allen has a doctorate in medieval studies and is a frequent contributor to the Los Angeles Times, the Wall Street Journal, and the Weekly Standard.
Whether someone is optimistic or pessimistic is usually more a product of his temperament than external conditions. My own outlook is generally optimistic, so it should be no surprise that I am bullish about the prospects of my country. But there is also good reason to have faith in America’s future.
Look at how far we have come since the start of the War of Independence in 1775: from 13 beleaguered colonies with 2.5 million inhabitants perched precariously on the eastern seaboard to a continental nation of 307 million that is wealthier and more powerful than any other in history.
There was nothing foreordained about our rise. We had to surmount numerous challenges—from the initial revolution to the War of 1812, the Civil War, the Great Depression, World War II, and the Cold War—that could have done us in, or at least vastly reduced our standing. Just look at how other megastates such as China and Russia, or potential megastates such as Europe and Latin America (both of which have long dreamed of unification), have sabotaged their own prospects with suicidal political and economic policies. That could have been us. But it wasn’t.
The reasons for our success surely include a favorable geography that provides us lots of natural resources and few nearby enemies and allows us access to both Europe and Asia; a political system that makes the state stable and flexible; a legal system that guarantees property rights and minimizes corruption; an entrepreneurial culture that encourages innovation and economic growth; an openness to immigrants that allows us to assimilate newcomers better than any other nation in the world does; and a civic spirit that leads citizens to serve when called upon—whether in 1861, 1941, or 2001.
I have no reason to think that we have lost any of these fundamental strengths. None of our “near peer” competitors is so lucky.
Europe must deal with chronic disunity, economic stagnation, an aging population, a sclerotic welfare state that cannot be cut back without riots in the streets, an influx of immigrants that threatens its traditional culture, and puny military capabilities. Japan’s population is aging even more rapidly—it’s in a demographic death spiral. The same goes for Russia.
China is facing its own demographic issues: its population is predicted to decline after 2020. It will age so rapidly that there will not be enough workers to support hordes of retirees. China must also deal with the fundamental illegitimacy of its unelected government, its lack of civil society, pervasive corruption, environmental devastation, and paucity of natural resources. (Almost all its oil must come from the Middle East along sea-lanes controlled by the U.S. Navy.) India, as a fellow democracy, may have greater potential to knock us off our perch, but given how poor it remains, that is unlikely to happen in this century.
We have our own urgent problems to address—especially too much federal spending and too little economic growth—but they are hardly unsolvable. Ronald Reagan dealt successfully with similar issues in the 1980s. All it will take is a political change in Washington, which is becoming more likely as Obama’s popularity wanes. There is no reason the 21st century cannot be another American Century.
Max Boot is the Jeane J. Kirkpatrick Senior Fellow in national-security studies at the Council on Foreign Relations and a regular contributor to Commentary’s Contentions blog.
Arthur C. Brooks:
Recent statistics about America’s levels of debt and tax burden make for depressing reading. Our national debt has increased from 42 percent of GDP in 1980 to 100 percent of GDP today. Government spending (27 percent of GDP in 1960) is 37 percent of GDP now—and is set to hit 50 percent in 2038. Between 1986 and 2008, the share of federal income taxes paid by the richest 5 percent of Americans rose from 43 percent to 59 percent. Over the same period, the share of Americans who pay zero or negative income taxes rose from 18.5 percent to 51 percent.
Numbers like these have led some to despair that there are really only two possible scenarios for America’s future. In one, we finally hit a tipping point where so few people actually pay for their share of the growing government that we embrace European-style social democracy. (Think France.) In the other scenario, our growing welfare state slowly collapses under its own weight, and we get some kind of permanent austerity once the rest of the world finally realizes the depth of our national spending disorder and stops lending us money at low interest rates. (Think Greece.)
These are not, however, the only two choices. We can make the hard choices as a nation to consolidate fiscally in a way that cuts government spending and stops penalizing entrepreneurs. But the way to achieve this is not the way conservatives typically advocate, which is doubling down with scary data about terrible economic growth and distortionary taxation. Instead, what conservatives must do is turn to the moral case for free enterprise: that it allows individuals to flourish as they earn their own success, is fundamentally fair in rewarding merit, and is the best way to give opportunities to the less fortunate.
This prescription would hardly sound foreign to our Founders, who in the Declaration of Independence asserted our right to pursue happiness instead of the mere possession of property. On the other side of the Atlantic, the father of free-market economics, Adam Smith, was articulating a soulful defense of human liberty in which every man “is left perfectly free to pursue his own interest in his own way.” He wrote these words 17 years before penning The Wealth of Nations.
How is it that today’s free-enterprise warriors have forgotten how to use the language of morality? Statists may talk about fairness and “social justice,” but free marketers seem content to console themselves with the language of economic efficiency and productivity. They are right that free enterprise is good for economic growth, but their arguments rarely move the soul. Yet it is the moral, cultural case for free enterprise that America most needs to hear today if it is to be willing to make sacrifices for the future.
Rather than making the business case, free-enterprise advocates must stop talking about dollars and cents and start talking about what is written on their hearts. They must talk again about why America is an exceptional nation and about what its system of free enterprise offers—the possibilities of self-realization, matching our skills with our passions, and pursuing happiness in whatever way we choose to define it.
If we do this, then Americans may help us change course before statism changes our nation for good.
Arthur C. Brooks is president of the American Enterprise Institute.
During the 1980s and 1990s, many conservatives issued warnings about the decline of American culture and American values. We learned in the ensuing years about the danger of these sorts of sweeping prognoses. Far from sliding to Gomorrah, America experienced a cultural renewal—lower crime rates, lower teenage pregnancy rates, less domestic violence, more community service, and on and on and on.
Many of those positive trends still hold. After the disruption of the 1960s, we are living in a period of social repair. But there is one problem, which emerged in those years, that is still with us, worse than ever.
It has to do with the enlargement of the self. The generation reared in the 1930s had a relatively small definition of self. They saw how great historic events could sweep up mere individuals. (“The problems of three little people don’t amount to a hill of beans in this crazy world.”) They were raised with the vestiges of the Augustinian warnings about the sin of pride.
But then came the psychologizing movements of the 1950s and 1960s. The big danger was not pride, but lack of self-love. That was amplified by the individualizing effects of the political and cultural shifts of the 1960s (morally) and the 1980s (economically). These narcissistic tendencies have been amplified further by Facebook and reality television—the rise of the instant fame culture.
The consequences are grim. They include a rising level of consumption (as people spend on themselves in a manner that befits their station); a rising tolerance of debt (which goes along with people’s greater confidence in their ability to handle it); and a greater level of political intolerance (as people lose the sense that they need their political opponents to correct the errors in their own thinking).
And so we wind up with a more consumption-oriented, short-term-oriented, and polarized nation. You can think that the overall culture is strong, but in this one way it is weak.
The question is whether this one tragic flaw undermines all the good things that are going on. I believe in the short term it will. We remain the crossroads of the world, the place where people from around the globe want to come to best magnify their talents. China will never match this. But in the medium term we are headed for a fiscal crackup that is the economic manifestation of a deeper moral shortcoming.
We will endure it, thanks to all the underlying strengths. But it will not be pretty.
David Brooks is a columnist for the New York Times and the author, most recently, of The Social Animal (Random House).
I am not a professional futurologist and am, in fact, profoundly skeptical about attempts to predict something as complicated as the future of America. The problem with predicting the future is that we generally assume that it will be created by people just like us, only living in the future. But the future is going to be the future precisely because it will be created by people who are different from us in ways that we cannot anticipate. We normally ask older people to predict the future, because they have had the time to become experts of one kind or another. We should instead be asking five-year-olds. Short of that, I will say something about 18-year-olds. As a college professor, I do have some knowledge of America’s youth.
Here I have every reason to be pessimistic, and yet I remain cautiously optimistic. Despite my grave doubts about the direction higher education is taking in the United States, I cannot help being impressed by individual students I encounter, both at my own university and at other campuses I visit. And what surprises me is not so much their schooling as their character. I still see students who are freedom-loving, self-reliant, resourceful, willing to take responsibility and risks, and open to genuine challenges—in short, Americans at their best. This is all the more remarkable when, from what I can tell, our whole world, and especially our educational institutions, are working to make young people weak and dependent. Maybe formal schooling is not as important as we academics would like to think. A look at history suggests that Americans have often achieved great things in spite of their formal education rather than because of it. Among nations, America can pride itself on being the land where high school and college dropouts can not only survive but also sometimes succeed beyond their wildest dreams—and ours.
In looking for factors that are still building character in American youth, I think of several traditional explanations. It really helps when a student comes from a two-parent family, in which both take an active interest in his or her development. Athletics builds character and helps toughen up young men and women. Provided that they do not become in effect professional athletes in high school or college, they can experience in sports one of the few remaining areas where objective achievement is still measured—and demanded.
But there are some new forces working to inspire independence in today’s youth: the Internet and social media. These are often accused of corrupting youth, and to the extent that they appeal to a virtual herd instinct, they are creating new forms of dependence. But cyberspace is also the new frontier for the most ambitious and audacious of our youth, and it’s teaching them anew the value of freedom. They resent attempts to censor and otherwise regulate their freedom of expression. They have learned to appreciate new forms of freedom of exchange, and a new generation of cyber entrepreneurs is being born before our eyes.
I am sure to be bombarded with statistics that show how poorly today’s young Americans do on tests and how low they rank compared with students in, say, Finland. To such criticism—aside from asking, “What has Finland done for us lately, besides the newest Rautavaara symphony and some Nokia phones?”—I would reply that standardized exams do not test character, and all they offer are statistical aggregates and averages. I am not talking about the average youth of today or tomorrow. America has never depended on the achievement of average people. What has made America great is that, by and large, it has given the most talented and spirited among its youth a chance to show their stuff. If I am pessimistic, the reason is that this American tradition is being eroded by all sorts of factors, most of them emanating from Washington, D.C. But if I nevertheless remain optimistic, it’s because I still see exceptional young people in my classes and I can feel them straining to do something exceptional with their lives. If only we would get out of their way.
Paul Cantor is Clifton Waller Barrett Professor of English and Comparative Literature at the University of Virginia. His most recent book, coedited with Stephen Cox, is Literature and the Economics of Liberty (Ludwig von Mises Institute).
James W. Ceaser:
There is no sadder sight than an American pessimist. Americans—Jean de Crèvecoeur told us—are born a free and hopeful lot, “a new race of men,” blessed with a bounteous land and a moderate government. Lincoln called Americans an “almost chosen people,” a designation bound to leave many readers of this magazine wondering at the divine improbability of being selected not once, but twice. Optimism, by nearly all accounts, has been an integral part of our national DNA.
What, then, is one to think of opinion polls today showing that, by a margin of almost 4 to 1 (77 percent to 20 percent), the public considers the nation to be on the “wrong track”? Malaise of such Carteresque proportions might easily be interpreted to mean that Americans have lost faith in themselves and in the future.
I am not so sure. Contrary to initial impressions, the real pessimists today are probably to be found among the “right-trackers,” clinging stubbornly to the change they once believed in. Having put their dream team on the floor, under the leadership of one touted to be the greatest political talent of our era, these die-hards have little choice now but to put on a grim public mask of hopefulness. For two years (2009–2011), we enjoyed by their reckoning virtually unchecked government of the best, by the best, and according to the best theories.
Yet things have not panned out. The outcome is being blamed on the difficulty of the challenge, on fate, or on severe headwinds, but doubt must certainly be creeping in that the fault is theirs. Right-trackers today are a desperately dispirited group, filled with dread that their opponents will take over and fail or, much worse, succeed.
One segment alone of the right-trackers seems upbeat: the so-called foreign-policy realists. For decades, intellectuals of this persuasion have yearned for a much less assertive America on the world scene. They have a president who agrees. Now, with constraints imposed by our current indebtedness as the rationale, these thinkers insist that America has no choice but to cede leadership to others. Blessed is the nation in decline, for it shall disinherit the earth.
And what of the almost four-fifths of Americans who think the nation is on the wrong track? Many, perhaps most, in this group have not lost hope. Dismayed almost to the point of despair at where the nation is now heading, they nevertheless see a path to revival and restoration by a change of direction. They reject a no-growth economy as the “new norm,” affirm a return to more limited government, and back a vigorous foreign policy. Their rallying cry has been American exceptionalism, a concept vague in its content but expressive of a strong, almost defiant, spiritedness.
Whether optimism or pessimism prevails will not by itself determine the outcome of the crisis the nation now faces. Far more will depend on the soundness of our leadership and the wisdom of the policies it adopts. The prophet alone, not the prognosticator, can know the future. Still, only a nation that possesses an underlying confidence that it can shape its own destiny is prepared for greatness.
James W. Ceaser is professor of politics at the University of Virginia and a visiting fellow at the Hoover Institution.
There is much to warrant optimism about the future of the United States, given the nation’s history of resilience in the face of adversity. But one social trend, the supplanting of the American family by government as the major source of economic security from cradle to grave, may prove more destructive to America’s future than any previous threat, foreign or domestic.
The problem begins with the dramatic change that has taken place in the family. An estimated 60 percent of all American children will spend at least some of their childhood in a single-parent household primarily as a result of divorce and rising out-of-wedlock births. The most recent figures show that, overall, 4 in 10 children in America are now born to single mothers. But among blacks the number is more than 7 in 10; and among Hispanics, fully half of all births occur out of wedlock.
In 1965, the late scholar and senator Daniel Patrick Moynihan warned that the rate of illegitimate births among blacks was responsible for “a tangle of pathology” that included high crime rates, poor performance in school, and high unemployment, especially among black men. At the time, 24 percent of black births were to single women, a rate lower than the current 28 percent illegitimacy rate for white women. “There is one unmistakable lesson in American history,” he said. “A community that allows a large number of men to grow up in broken families, dominated by women, never acquiring any stable relationship to male authority, never acquiring rational expectations about the future—that community asks for and gets chaos.” But as trenchant as his analysis of the problem was, his solution—more government programs—did not alleviate the disaster taking place in the black family but accelerated it. Worse, dependence on government assistance spread to ever-larger segments of the American population.
Uncle Sam has largely replaced fathers in poor, single-mother-headed households, providing the food on the table, the roof over the family’s head, and the income to put clothes on their backs. And the expansion of the welfare state is no longer confined to the indigent but has now extended to the middle class as well. Middle-class parents have less incentive to save for their children’s college education when the federal government makes low-interest loans and grants available. Adult children, even those who are well off, are less likely to help support their elderly parents when government programs take on that responsibility. A study from the University of California, Davis, looking at welfare use among elderly Chinese immigrants in California in the 1990s, for example, showed that, despite cultural traditions that encourage children to provide for elderly parents, 55 percent of elderly Chinese were receiving welfare; and the great majority of these lived in households whose income was above the national average, often substantially so.
Even Social Security and Medicare, which most Americans think they’ve paid for through payroll taxes during their working years, have become a form of government subsidy. On average, even wealthier Americans will receive substantially more in benefits than they have contributed through payroll taxes. And the list goes on, including federal guarantees for home mortgages, interest rates that have been kept artificially low by the Federal Reserve, and mandated universal health care.
We are fast becoming a nation of takers, increasingly dependent on government through income transfers from the wealthy. Families made up of responsible, self-sufficient individuals who pay their own way and save for the future are fast disappearing. Unless we can reverse this cultural shift, the future of America is at risk.
Linda Chavez is the chairman of the Center for Equal Opportunity. Her most recent contribution to Commentary, the short story “Afterbirth,” appeared in the May issue.
I’m an optimist by nature, and a comic writer; all my novels, dark as they are, end with an uplift. I believe in sweetness and light. But there are some very good reasons to be direly pessimistic about the future of this country, which has come to feel like an amalgam of corporatocracy, fascist police state, and mini-mall. I feel by turns overwhelmed and angry and worried about the environment, the food industry, corporate greed, and the ballooning (in both senses) population. There are seemingly so many systemic failures that facing and fixing any of them, let alone all of them, feels impossible.
Where to start? Our great Constitution is simultaneously disregarded, on the one hand, in the fearmongering interest of “national security,” and on the other, iron-fistedly brought to bear in Supreme Court decisions that hinder necessary social progress. Monsanto is taking control of agriculture and the food industry with non-propagating seeds and genetically modified “Frankenplants.” Obesity already affects a third of our population, and will likely affect 50 percent of us by 2030. Our population itself is projected to reach 400 million by 2043, doubling in my lifetime. The pursuit of oil and natural gas to meet the energy needs of this growing population threatens what’s left of our environment. Weather patterns are changing in drastic and undeniable ways and, by all reputable accounts, it’s too late to stop them.
Public education is primarily concerned now with teaching kids how to pass multiple-choice tests. Health care and Social Security are unsustainable; we can no longer afford them. Our all-encompassing “culture industry” has proved Theodor Adorno right: popular art seems increasingly to exist primarily to feed market interests, and any potential counterculture is immediately enveloped by the market. Then there’s the growing disparity between rich and poor—when our only agency lies in the dollar, not the vote, only the rich have any power—the skyrocketing debt, the crumbling of basic infrastructures, and the toxic divisiveness of our political culture.
What did I leave out? Oh yes, the economy. It’s bad.
How is any of this ever going to be reversed when all indications are that it’s entrenched and accelerating? The idea of protesting unchecked corporate power strikes me as futile, like punching the Pillsbury Doughboy in the stomach—all you do is bury your arm in corrupt goo, and then you’re stuck trying to pull it out again before it gets swallowed. And, of course, full-out revolution is impossible. There’s nothing to topple. Our government is impotent, and the multinational corporations whose interests it serves are like mutant super-ivy embedding itself into the planet’s surface with enormous stems and horror-movie tentacles.
In the face of this clear and overwhelming and deeply upsetting evidence that we’re already in the handbasket to hell, I see no alternative but to abandon all hope. This breaks my heart. I remember believing as a kid that this was a great country, that America was free and strong and full of possibility. I would love to be optimistic, in the end, about Americans pulling together to overcome any crisis. But I can’t convince myself, much less anyone else, that there’s anything we can do, given what this country has become and what it is further becoming. All I can do is mourn.
Kate Christensen is a novelist and the author, most recently, of The Astral (Doubleday).
Pessimist? Optimist? Why not go all out and embrace the great American tradition of the jeremiad? Given the slightest excuse, we Americans rend our garments, fill the air with lamentations, and prophesy doom. The end is approaching; strap on your seatbelts; we are going to hell. Evidence can be found everywhere: harvests wilting, prices rising, oil spills gushing, banks defaulting, Congress stalemating, and the economy threatening to collapse.
From my corner of the world (I am a professor and a university librarian), there is a lot to lament, beginning with the use of language. Students’ papers contain phrases such as “between you and I.” Deans say, “going forward” instead of “in the future.” And a corporate idiom has invaded everything. We deal in “trade-offs” and “takeaways” and can’t pursue a course of action without issuing “mission” and “vision” statements, preferably in color and with arrows pointing to boxes meant to show where we are headed and how we intend to get there.
I take the language as a symptom of something more serious: the commercialization of the world of knowledge. Learning never was free, and research libraries are complex organizations, which require business plans. But how can we balance our budgets when the price of scholarly journals, set by monopolistic publishers, has spiraled out of control? The average institutional subscription price to a journal in physics is now $3,368 a year, and several journals cost $30,000.
It once seemed as though Google would democratize access to knowledge by digitizing all the books in our research libraries. But when Google struck a deal with the authors and publishers who had sued it for breach of copyright, it turned its digitizing operation into a commercial venture; the prices it could charge libraries for subscriptions to its database could have escalated as badly as the prices of journals did. Fortunately, a New York court declared the deal unacceptable because it threatened to eliminate all competition, and now we have an alternative to Google Book Search.
I refer to the Digital Public Library of America, a project to digitize millions of books and to make them available free of charge to everyone in the world. Far from being a utopian dream, this plan is doable. A coalition of foundations will provide the funding, and a coalition of libraries will supply the books. We will announce its details at a conference in Washington, D.C., on October 21, and we expect it to begin providing books and all kinds of digital material to the public within three years.
Despite my lamentations, therefore, I look forward to a promising future, at least insofar as ordinary people will have access to their cultural heritage. Am I an optimist? Yes, but not a cockeyed optimist.
Robert Darnton is the Carl H. Pforzheimer University Professor and university librarian at Harvard. He is the author, most recently, of Poetry and the Police: Communication Networks in Eighteenth-Century Paris (Belknap).
Can I just say that each morning I look at the paper and grow increasingly despondent? Back in the 19th century there was a Know-Nothing Party, but I never thought I’d see its revival. We now have elected hicks who have apparently spent their lives reading nothing but Ayn Rand and the King James Bible, who express grave, very grave reservations about evolution and global warming, and who would like to rescind any social or economic law of the past century that helps the working class.
Some days I say to myself that it was ever thus. Nearly a hundred years ago, H.L. Mencken described the America of his day:
Here, more than anywhere else that I know of or have heard of, the daily panorama of human existence, of private and communal folly—the unending procession of governmental extortions and chicaneries, of commercial brigandages and throat-slittings, of theological buffooneries, of aesthetic ribaldries, of legal swindles and harlotries, of miscellaneous rogueries, villainies, imbecilities, grotesqueries, and extravagances—is so inordinately gross and preposterous, so perfectly brought up to the highest conceivable amperage, so steadily enriched with an almost fabulous daring and originality, that only the man who was born with a petrified diaphragm can fail to laugh himself to sleep every night, and to awake every morning with all the eager, unflagging expectation of a Sunday-school superintendent touring the Paris peep-shows.
Mencken had no children, so he could afford to be entertained at what he viewed as a carnival of bunkum. But I have sons, just starting out in life, and I weep at the state of this country and the gimcrack, meretricious mall-world of the 21st century. As a child of the frequently maligned 1960s, I grew up on dreams of, on the expectation of, a better, more equitable and peaceful world. Progress was made, no question. Yet, look back on this past decade and what stands out? Suicide bombers and a forever war, the economic destruction of our country by venal plutocrats, and our young blithely sedated by the addictive distractions of their digital toys.
Americans are taught to believe that somehow our country is uniquely indestructible, that we can bounce back from anything. But in 1911 the British Empire—the one upon which the sun never set—felt and believed exactly the same. Forty years later, it was gone. These United States of America are, of course, absolutely exempt from such a possibility. We’re special.
As for literature, my own field: I worry that e-book culture actually inhibits serious reading. A work of art requires a slow, steady interaction between a reader and a text, a contemplative frame of mind, a kind of immersion in a poem or novel’s waking dream. Screens, however, are all about speed, the quick retrieval of facts, the gathering of data. But the getting of information is not the same as the getting of wisdom or aesthetic delight. Will an e-book user slow down enough to appreciate the great and sometimes demanding books of the past?
I really hope I’m dead wrong about the future of America and about the negative consequences of screen technology. Maybe, just maybe I’ll wake up tomorrow or the next day and all shall be well, all manner of thing shall be well. That Dirda, such a dreamer.
Michael Dirda is a Pulitzer Prize–winning literary journalist whose books include four collections of essays, the memoir An Open Book, and the recently published On Conan Doyle (Princeton).
The future is both dark—the problem isn’t debt but dependency—and bright, because the real achievement of the Internet will be a return to the one-room schoolhouse.
Public debt will be brought under control—a clear majority wants it; but once America crosses the tax-dependency threshold, the future swallows hard and gets heart palpitations. The total number of Americans who live off tax revenues is hard to figure out: government workers and their families, teachers, staff at government contractors, the military and so on. It’s not dishonorable to be a tax client, but disinterested voting is tricky for such people, and it requires much civic virtue—which isn’t always available.
Remember, Wisconsin ought to be a theme of every conservative campaign next year: the danger is not that tax clients will become a majority but that they will increasingly make common cause, gain arrogance and swagger, and become a danger to democracy.
In Wisconsin, voters elected a Republican governor to get control of a large state budget deficit and a huge unfunded state-worker pension liability. The governor suggested, among other far-right ideas, that state workers should pay into their own pension funds. Mobilizing union and establishment support from across the country, Wisconsin’s privileged minority of state workers (who earn more, on average, than do ordinary citizens) did its best to commit armed robbery against the population. Democratic state legislators actually walked out on democracy—ran away and hid. The Detroit Symphony, which also happened to be on strike, sent commando squads to entertain Wisconsin state workers with solidarity anthems and inspirational chamber music. Well-funded recall attempts against several Republicans were fended off with difficulty, like shark attacks. The Democrats are lucky they failed, or they might have faced actual public wrath.
Enlarging the tax client state-within-a-state is increasingly dangerous to the republic.
On the other hand, sometime within a decade or so, a new and refreshing type of building will rise somewhere in suburbia: a one-room schoolhouse with seats for 30-odd students, computers and headphones for each, some printers, a desk and flag in front and a playground outside.
The 30 students who attend this “school” are of assorted ages; each is enrolled in a separate set of online courses chosen by his parents. The children could learn at home, but spending time at school is good for them, their parents, and the community. The adult sitting up front doesn’t need an education degree or any other degree. She only needs to be known in the neighborhood as sensible, reliable, and good with children. She calls the school to order, takes attendance, leads the Pledge, announces recess, and handles any child-type emergencies. These new micro-schools are so cheap, we can build as many as we like.
It goes without saying that American public schools, and most colleges and universities, are now on the long, slow ride to the gallows. Their high costs, obvious political agenda, and gross incompetence mean that eradication is their only conceivable fate. Online schooling is a far-from-perfect alternative, but it’s the one we have. To balance its obvious disadvantages, it has enormous potential for good—beyond the decent education it provides. If we are imaginative about this new kind of public institution, these little red Internet schoolhouses, much good may yet emerge from the wreckage of American public schools.
David Gelernter is a contributing editor to the Weekly Standard and the author, most recently, of Judaism: A Way of Being (Yale) and a forthcoming book about the American Cultural Revolution.
I remain optimistic in general terms about the United States. Despite all the troubling economic, political, and social trends, I still trust the energy and common sense of the average American. However slowly and painfully, the country will eventually sort out its most pressing problems.
I am far less confident, however, about the nation’s cultural and intellectual future. There has been a vast dumbing down of our public culture that may already be irreversible.
There can be no doubt from the many detailed and reliable studies available that Americans now know less, read less, and even read less well than they did a quarter century ago. These trends have measurable consequences in lowering academic achievement and economic productivity. They also demonstrably diminish both cultural activity and civic participation. We live in a society addicted to constant electronic entertainment—mostly done by individuals at home, isolated not only from their communities but increasingly even from their own families.
Our public culture consists mostly of low-level entertainment and advertising (often intermixed), which is now ubiquitous—filling not only television, radio, the Internet, and print, but also restaurants, bars, airports, and even gas stations and elevators. Media saturation is no longer voluntary but mandatory for anyone entering public spaces. The goal is to fill every moment of human consciousness with paid commercial content. Perhaps this is good to stimulate economic consumption, but it cannot be good for human thought and reflection. “How with this rage shall beauty hold a plea?”
Cultural vitality has fewer advocates than do wealth and prosperity. When the arts and humanities break down, the outer signs are less immediately visible. There are no sophisticated monthly measurements to track their progress or decline. But their collapse has human consequences as devastating as material decline, even to a society that may have forgotten why they once mattered.
Dana Gioia is a poet and the former chairman of the National Endowment for the Arts.
James K. Glassman:
The big question is whether America can continue to lead the world.
If we can’t, our future looks awfully grim. Either the world slips into chaos or another country—China?—takes the lead. Imagine that within the next decade, North Korea threatens Japan or Iran gets set to attack Israel or Pakistan falls completely apart. Will the United States be able to decide what to do—and have the authority to do it, with or without a coalition of the willing?
Global leadership has two requirements: one moral, the other economic. On the moral side, America’s will to lead seems to be slipping away, with the growing attraction of isolationism (to both parties) and, on the flip side of that coin, of multilateralism for its own sake. The superficial success of the lead-from-behind strategy in Libya doesn’t help. Waiting for the Arab League or the United Nations to step out first could easily become American custom and policy, especially at a time when we’re so preoccupied with domestic economic matters. The moral requirement for leadership is, of course, a function of desire and priority in a nation’s leader. But the zeitgeist counts, and right now, it bodes ill.
Which brings me to the second requirement of leadership. Today’s moral and political atmosphere is heavily determined by the state of the economy, and, in the short term, the U.S. economy is lousy. Typically, the economy snaps back like a rubber band: bad recessions are followed by strong recoveries. That hasn’t happened. But even worse is the long-term picture. Economists forecast growth in the 2-to-2.5-percent range as far as the eye can see. That’s a full percentage point lower than the post–World War II average. Living standards will still rise, but at a snail’s pace. The danger is that we won’t have the wealth to lead or, worse, we won’t have the confidence, in a crisis, to believe that we should spend what we must now, with the certainty that we can pay for it later, as we did in World War II.
The trend lines for both the moral and economic imperatives of leadership are heading down, but they can both be raised. The moral side needs inspiration and purpose from policymakers and intellectuals, who should dedicate themselves to the project of its revival. The economic side needs a clear goal to which policy can be directed. The Bush Institute suggests 4 percent sustainable economic growth—perhaps a bit aspirational, but we can certainly get close with a consumption tax, cuts in wasteful spending and regulations, a sensible immigration policy that beckons the best, and policies to boost domestic energy production.
America can no longer get very far on momentum alone. The physics of inertia are kicking in. Yes, our comparative advantages in technological imagination, entrepreneurship, and good business management remain unmatched, and animal spirits haven’t been snuffed out. Will America continue to lead the world? I say yes, but right now that’s a judgment based more on faith than reason.
James K. Glassman, formerly undersecretary of state for public diplomacy and public affairs, is the founding executive director of the George W. Bush Institute.
“Example is the school of mankind,” Edmund Burke counseled, “and they will learn at no other.” By that standard, America has undergone quite a schooling in the last few years.
In 2008, the assembled forces of liberalism—and not only the pundit classes, but academia, business elites, and, of course, Hollywood—were convinced that America was not only on the cusp of a transformative and realigning liberal-left presidency, but also at the dawn of a new New Deal. Perhaps even a generation-spanning new Progressive Era. More than a few conservatives felt the tectonic plates moving and repositioned themselves accordingly.
Across the liberal firmament, those inflated expectations have been lowered like a Thanksgiving Day parade float put back in the box. It’s safe to say that no serious-minded liberal anywhere still holds out hope for any of that, at least not in the near term (and many of those migrating conservatives have quietly trudged back home, refugees from a lost cause).
Obviously, liberals are right to chalk up some of their problems to mere human error, as it were. Had President Obama and the Democratic leadership pursued different tactics in 2009—a different kind of stimulus, a smarter approach on health care—liberalism’s fortunes might be a bit rosier now. But his supporters would go further, arguing that the evidence against Obama’s core philosophy is entirely circumstantial. Keynesianism, like liberalism proper, never fails; it’s simply never fully tried. But as Henry David Thoreau said, “Some circumstantial evidence is very strong, as when you find a trout in the milk.”
Simply put, Europe’s financial calamities combined with the failures of America’s efforts to import the European model have had a profound teaching effect. No, the country hasn’t been converted wholesale to the Church of Milton Friedman, but Obama’s bromidic “Yes, we can” and “Sputnik moment” rhetoric has next to no purchase with the American people today.
This creates a moment for optimism that did not seem nearly so plausible in 2008. America is poised to deal with its myriad problems in ways we haven’t seen since 1981.
What about the “big issues”: China, globalization, climate change, and the other grotesques in the usual parade of horribles? Some are very serious, others not so much. China will get old before it gets rich. The entry of hundreds of millions of inexpensive workers into the global labor force is a short-term challenge, but the massive growth of the global middle class is a long-term opportunity. Climate change may indeed be a threat, but the greater danger lies in how we respond to it. A few years ago, it looked as if the generations-old Malthusian effort to manage scarcity had finally gotten from climate change what it had always wanted from earlier scares like overpopulation. Now, around the globe, that approach is a nonstarter.
Yes, America faces grave challenges, but it always has. I was more pessimistic three years ago when it seemed Americans had given up on themselves, preferring a long self-indulgent slide into European social democracy. Now, with the power of example guiding us, there’s reason for hope.
Jonah Goldberg, a visiting fellow at the American Enterprise Institute, is editor-at-large of National Review Online.
Richard N. Haass:
It is tempting to be glibly optimistic and quote Winston Churchill’s observation that “you can always count on Americans to do the right thing—after they’ve tried everything else.”
But this would be, well, glib. There are, to be sure, plenty of reasons for optimism: America’s many excellent institutions of higher education; its relative openness to immigrants; the availability of venture capital for promising innovations; its fundamental political stability; a rich endowment of minerals, energy, and water; and the absence of a powerful peer competitor akin to Germany in the first half of the 20th century and the USSR in the second.
At the same time, there are reasons for genuine concern: a debt larger than GNP; persistent high unemployment; low economic growth; a K-12 educational system that is not preparing most Americans for a competitive, dynamic world; aging infrastructure; a rising China; and a polarized political system that is beholden to special interests and increasingly unable to act in behalf of national interests.
So, confident or pessimistic? The best reason for optimism is that we can identify policies that will help: raising the retirement age; means-testing entitlements; simplifying taxes, reducing tax rates, and eliminating certain tax deductions; cutting use of oil through regulation, taxation, or both; combining near-term economic stimulus with long-term deficit reduction; expanding trade.
Changes also need to be made in how we do things: allowing more talented immigrants to remain in the country, reforming health care so the incentive is not always to increase treatment, curbing the power of public-service unions, resisting wars of choice where the interests at stake are less than vital or where policies other than military intervention promise to yield acceptable results.
But none of this will just happen. It will take real leadership, defined here as a willingness to advocate policies that are inconsistent with the narrow interests of many groups and individuals but that would be good for the society and the country as a whole. It will require leveling with the American people about the consequences of not meeting our challenges and what it will take to meet them. It will require taking on numerous sacred cows.
There are three alternatives to real leadership. One is drift. Business as usual, though, would likely bring about the second alternative: crisis. It could come in the form of domestic unrest or economic disaster imposed by a world that tires of lending us dollars. A third alternative—faux leadership, essentially populism that would deepen social divisions without fixing problems—would be the worst of all worlds.
It may not be realistic to do what I am calling for and survive, much less thrive, politically. It may not be possible within either of the two existing parties; it certainly won’t be easy given our 24/7 Internet and media environment.
Still, I am hanging on to my optimism, if only barely. I could just as easily be a pessimist who has not given up. Either way, it is too close a call for comfort.
Richard N. Haass is president of the Council on Foreign Relations.
Our abundant national energy, unrivaled technological genius, and history’s most powerful military ought to leave me and everyone else an optimist about our country’s future.
There is simply no better place or time to live than America at the end of 2011, even with the most incompetent president since the discovery of electricity, even after a horrific decade of tears and sacrifices made by the innocent at home and the best and brightest of America on battlefields across the world.
The widespread tentativeness, the gnawing doubt felt by all parents and grandparents, is due to government never having been this large, with burdens so sclerosis-inducing in all aspects of national life.
Out here in California—once the best place of all when measured by freedom and creativity, plenitude, and sheer exuberant living—the arteries have already closed, and the political class seems simply incapable of doing anything to reverse the disease. Asking the California legislature to repeal what must be repealed and slash the tax burdens that must be slashed is akin to asking a third-grader to do calculus.
There simply isn’t the capacity. Jerry Brown knows it. We all know it. The goose is on life support.
The California disease, like the deadly “greyscale” sickness in George R.R. Martin’s Song of Ice and Fire novels, spreads slowly and inexorably across the country. Stupidity and power is a bad combination, and it seems as though the country has now touched the bottom California hit long ago. Again and again, interviews of people with power, from both parties and across all three branches, reveal they simply don’t read, think, or analyze.
They don’t know anything. And most of the media that covers them knows less.
Epic incompetence didn’t matter so much when government was smaller. Now, penetrating every aspect of the economy and encroaching on what had previously been the private sphere, government incompetence is poisoning everything. Of all the hats I wear—law school professor, practicing lawyer, broadcaster, and writer—it is my experience practicing law before federal regulatory agencies, watching businesses defend themselves against trial lawyers’ absurd claims, that is the wellspring of my pessimism.
There are so many destroyers of wealth and productivity, legions of dim-witted and credentialed bullies, that even the sunniest optimist may eventually pull down the blinds.
But…young people loathe government. Many millions who fell for Obama have learned a hard and necessary lesson.
Amazing veterans of the wars are returning to take up public life. They are smarter than can be imagined, wise beyond their years, courageous, and ready to lead in politics as they have in combat.
And the relentless hum of technology mixing with freedom, still vastly more prevalent here than anywhere else, is at work 24 hours a day in every corner of the country, from the tiniest hamlet to New York City, all linked by a net of astonishing power.
If upcoming elections deliver the rebuke to the tenured overlords of government, media, and academia, it will be enough to salvage the situation, just as the election of 1980 did 32 years ago.
If not, well then, I offer another George R.R. Martin reference: “Winter is coming.”
Hugh Hewitt is a law professor at Chapman University Law School and a nationally syndicated radio talk show host.
Kay S. Hymowitz:
If there’s one domestic problem that should be keeping us believers in American exceptionalism up at night, it’s the ailing middle class. Labor economists sometimes call ours an hourglass economy. The top bulge of the hourglass refers to a large population of educated workers earning good money, accumulating significant wealth, and living comfortable, optimistic lives. The bottom bulge holds another large group, living paycheck to paycheck, whose houses, if they have them, are under water and whose children’s futures look as dim as their own. Meanwhile, the middle, the once dominant, stolid, quintessentially American class, is wasting away.
There are two related causes for this, and neither of them suggests an easy—or for that matter, any—answer. The first cause, itself the consequence of technology and globalization, is the earnings gap between knowledge-based jobs and everything else. Clichés about the loss of well-paying manufacturing jobs are true as far as they go, but any routine work is at risk of being automated or outsourced. That means the spoils now go to the specialized and the educated. Over the past 50 years, wages and wealth have risen markedly for those with a college diploma and even more dramatically for those with a graduate or professional degree. Whereas the college-educated earned 40 percent more than those with a high school degree in 1980, today they earn 75 percent more. It goes without saying that the gap for those without a high school degree—and remember, more than half of high school students drop out in many of our largest cities—is even worse. The current economic crisis is intensifying the problem. Unemployment rates for those with only a high school degree are triple those of the college-educated, and six times as high for dropouts. Edward Wolff, of New York University, estimates that the net worth of the middle fifth of the country declined 26 percent over the past two years alone.
The other reason for the wasting away of the American middle class is the breakdown of families. Not so long ago, middle-class family life was defined by stability and child-centeredness. No more. According to the National Marriage Project, there’s been a sharp rise in divorce and out-of-wedlock childbearing among the less-educated middle class, those with a high school diploma and perhaps a year or two of college. Only 58 percent of the 14-year-old daughters of moderately educated mothers are living with both parents. Not only is that down significantly from 1982, when the number was 74 percent; it is appreciably closer to the 52 percent of the daughters of the least educated than it is to the 81 percent of the girls of the college-educated. Forty percent of American children are born to unmarried mothers, almost all of them with little or no college education.
These two forces—the knowledge economy and the loss of stable family life among the less educated—create a negative-feedback loop. Children are far less likely to succeed in school if they don’t grow up in stable, child-focused families. Yet a college education is now a necessity for achieving upward mobility. In sum, the loss of a middle class threatens to turn America into a rigid and cynical caste society, the very opposite of its dynamic and optimistic self.
Kay S. Hymowitz, a senior fellow at the Manhattan Institute, is the author of Manning Up: How the Rise of Women Has Turned Men into Boys (Basic Books).
As Yogi Berra pointed out, “It’s tough to make predictions, especially about the future.” Even more so if, to quote Yogi again, “the future ain’t what it used to be.”
What the future used to be—or at least what it used to seem to be—was intelligible. The liberal account of the future was generally optimistic, and the optimism was based on a belief in the ineluctable course of history, or on faith in the victory of enlightened leaders and progressive movements over reactionary forces and premodern prejudices. There were basically two conservative accounts of the future. One was pessimistic, judging the distempers of modernity too powerful to resist successfully for long. The other was more optimistic, looking to the possibility of some sort of conservative restoration or awakening.
Today, who knows? Post-9/11, post-financial crisis, and post-postmodernism, the range of possible outcomes seems amazingly wide and the odds on any of them strikingly indeterminate. I suspect our thinking about the future isn’t yet radical enough, either analytically or prescriptively. “A new political science is needed for a world altogether new.” But saying that is one thing. Thinking with the breadth and depth of a Tocqueville about our present condition is another.
So should one be optimistic or pessimistic? God knows. But I do know that conservatives—indeed all friends of political liberty and American greatness—should, in the short term, be agonistic. They need to fight. Fight to defeat President Obama in 2012. Then fight in 2013 to repeal ObamaCare, to rebuild our defenses, to restore U.S. credibility abroad, and to establish fiscal, regulatory, and monetary sanity at home. That’s all difficult—but relatively simple.
Then the agenda gets more ambitious and less determinate. But more interesting.
William Kristol is editor of the Weekly Standard.
Peter Augustine Lawler:
Any modern country is about narratives of change. Our conservative view, found everywhere from the Tea Party to professors of political philosophy, is that our country’s recent history has been a turn away from the limited government based on the natural rights of free individuals toward bigger and bigger government based on a progressive devotion to History. From this view, our country has been getting worse as it slouches toward the serfdom Friedrich von Hayek described or the soft despotism Alexis de Tocqueville imagined.
This narrative contains some truth, of course, but it’s clearly becoming less true. Big government is now in retreat on two fronts, national and state. The entitlements that have structured our welfare state are eroding or even imploding. The movement is from defined benefits to defined contributions, with risk being transferred from the government or the employer to the individual. The good news is that free people are going to have more opportunity to exercise personal responsibility. The bad news might be that elderly Americans will have less reason than ever to believe that their money will last as long as they will. The Tea Party is wrong to believe that what its members regard as a new birth of freedom will ever actually be popular.
As every reader of Commentary knows, our country’s always ambiguous and now seemingly temporary use of big government as a way of redistributing income and eradicating poverty started to fade in the late 1960s. Big government has continued to gradually get bigger, but more because of inertia than any ideological enthusiasm. (What about the Progressive Obama? His vision for change is already discredited, and ObamaCare just won’t work.) Today, most Americans know that a bigger nanny state can’t provide any effective remedy for what really ails them.
In the same 1960s, the Supreme Court began its very successful war against big government understood as moral regulator. Our Court now thinks it’s adhering to the Founders’ view that the single word liberty in the 14th Amendment is a weapon every generation of Americans can wield to achieve unprecedented individual liberty. It makes a strange kind of sense, from this view, to say that same-sex marriage didn’t use to be an individual right, but it’s become one over time. Soon enough the Court might discover it makes sense to say that the entitlement of marriage itself is unjustified oppression, because it arbitrarily privileges what married people do at the expense of the dignified autonomy of single individuals.
So change over the last generation has been progress in the individualistic sense of John Locke. But some of it has been change Locke himself didn’t anticipate. It didn’t occur to Locke, it seems, that so many free persons would become so self-absorbed—or that contraceptive technology would work so well—that we’d be stuck with a “birth dearth.” Sophisticated Americans have not so much transferred their dependence from family to government as they have chosen to thwart nature’s intention for them by staying around as individuals for an indefinitely long time. If I’m not planning on going anywhere, there’s no need for me to generate any replacements.
If it weren’t for our demographic crisis, nobody would be talking much about reforming or eliminating Social Security and Medicare. We’re going to be stuck more and more with too many old and unproductive people and not enough young and productive ones. That change can be accounted for as a product—both good and bad—of our creeping (and sometimes creepy) individualism or libertarianism. The change has wrecked the progressive dream of an expanding social democracy humanely enveloping us all.
There are some reasons to be confident about America’s future. The road to serfdom, it turns out, never gets to serfdom. The downsizing of the welfare state and the accelerating progress of technology demanded by free individuals will likely be good for prosperity. There is, of course, also reason to worry about people so unwilling to think of themselves as parts belonging to wholes greater than themselves—as parents, children, citizens, friends, and creatures.
Peter Augustine Lawler, Dana Professor of Government at Berry College and executive editor of the scholarly quarterly Perspectives on Political Science, is the author, most recently, of Modern and American Dignity (Intercollegiate Studies Institute).
On the face of it, our time should be high tide for American pessimism. The economic calamity of 2008 has been succeeded by a precarious stall. Growth is anemic. Unemployment remains very high. The public is in a sour mood. Our president seems to yearn for a low-profile America. And those charged with looking forward tell us that things will get even worse: the aging of our society combined with the imprudent design of our entitlement programs promises to inflate our national debt to twice the size of our economy by the mid-2030s. We have never seen debt on that level, and there is reason to think such debt would make it very difficult for America to be as strong and prosperous as it has been since the Second World War. The stench of decline is in the air.
And yet, my answer to the editors’ question is that I am decidedly optimistic about America’s future. How could I be? Because the list of woes laid out above describes not the demise of the American order but the demise of the liberal welfare state, and we must be very careful not to conflate the two. The economists’ impossibly grim projections only describe what will happen if we don’t change course, and they therefore make it clear that we will change course.
Granted, that will be no simple matter. The liberal welfare state and the vision of social democracy that underlies it have given shape to our public life for a century—providing a roadmap for the left and a foil for the right. Viewing capitalism as an effective but morally dubious engine of wealth, it sought to balance economic prosperity with economic security through technocratic management of key sectors of the economy combined with all-encompassing programs of social insurance. It seemed to work while our population was booming and our postwar growth was strong. But it undermined both of those preconditions for its own success, while also undermining the traditional family, the moral underpinnings of American working-class life, and the dynamism of our economy to boot.
Now the bill is coming due, and a growing segment of Americans can see that the liberal welfare state is a failure. But those voters still want some other way to achieve the goal of the welfare state: balancing growth and prosperity with economic security and compassion for the poor. That means they would be open as never before to a conservative approach to achieving that goal, but they are not open to abandoning that goal—they have not become libertarians. The right kind of conservatism—one that sought to make the benefits of democratic capitalism available to all—could thrive in this moment of challenge and could help America thrive again, too.
The nation, therefore, need not share the liberal welfare state’s grim fate. We have the world’s largest economy, tremendous untapped (and indeed repressed) growth potential, far rosier demographic prospects than those of our competitors, by far the world’s largest and most able military to protect us, and a tradition of economic drive and growth.
A public-policy agenda that sought to encourage such drive and growth would go a long way toward helping us thrive again, and such an agenda is easily imaginable—indeed, it is gradually emerging on the right. If we’re lucky, it could even help us turn things around before a monumental debt crisis, rather than after. And we’re Americans, so we already know we’re lucky.
Yuval Levin is the founding editor of National Affairs and a Hertog fellow at the Ethics and Public Policy Center.
Michael J. Lewis:
The closer you look, the bleaker it seems. In the next few years, Iran will detonate a nuclear bomb and (perhaps during this diversion) China will reclaim Taiwan in an unexpectedly swift air-and-sea assault. We will then peer into our national larder and find it distressingly bare. After the attack on Pearl Harbor, Winston Churchill reminded himself that annual Anglo-American steel production exceeded 100 million tons, and Japanese production only 7—and he slept soundly. Today he would toss and turn: Chinese production exceeds 625 million tons and is growing, while ours is barely 80 and declining. We once heard that such statistics were immaterial, because economic vitality now rests on technology, finance, and a vibrant service sector, not on heavy industry. That claim now falls flat.
One still hears another cliché, which is that our political culture has grown too poisonous and polarized to solve the debt crisis. But this gets things exactly backwards. It is the debt—and the entitlement payments that increasingly compose it—that poisoned our politics. Nothing has debilitated our political culture more than the task of maintaining a welfare state that demands an ever-greater share of the nation’s wealth. A fundamental tenet of parliamentary government holds that no parliament can bind its successors. But the welfare state binds the legislators in just this way, increasingly restricting their scope of action. A great deliberative body has withered into something like a speech-giving collection agency. And as the scope for genuine legislative action narrows, the great questions of American life are increasingly settled by fiat on the part of nonelected regulators or judges.
In 2008, when government intervention in the mortgage market led to a financial crash, and when national confidence in our international presence faltered, we elected a president and a Congress that promised more of the same: an even greater government role in the economy and an even more cringing international presence. Barack Obama invested these policies with peculiar clarity and urgency. Of course they are the very policies that have brought us to this impasse. History may be cruel, but you can’t say that it lacks comic timing. The consequence has been a startling reinvigoration of our political life, of which the Tea Party is but one manifestation, and which shows that (contrary to what we feared) the American public overwhelmingly does not yearn for an endless expansion of an entitlement culture. It shows that the United States retains its culture of personal initiative and self-sufficiency, and capacity for spontaneous civic action—those natural traits of a vigorous colonial culture. It remains the most charitable society (and not merely in terms of private philanthropy) in human history. Even when demoralized, as during the Great Depression, or when savagely divided, as over slavery, it shows a capacity for regeneration and self-correction that is nearly limitless. We are witnessing it again. And this is why I am optimistic about America’s future, more so than in years.
Michael J. Lewis is professor of art at Williams College.
Herbert I. London:
McLandburgh Wilson once observed, “Twixt the optimist and the pessimist, the difference is droll: the optimist sees the doughnut, but the pessimist sees the hole.” Since a diet of doughnuts can be deadly, I describe myself as a guarded optimist. The adjective saves me from the charge of being a Pollyanna. As I see it, there are two reasons for hopefulness.
One, pessimism is not a policy prescription. If the world were going to hell in a handbasket, most people would, ostrich-like, put their heads in the sand and yield to forces they cannot control. My fear is that pessimism can easily morph into despair.
Two, empirical evidence provides some justification for guarded optimism. 1979 was a terrible year politically: the Iranian revolution deposed the shah and set loose Islamic fanaticism; the Soviet military invaded Afghanistan; the Grand Mosque in Mecca was captured by Wahhabis who were able to extract extortion payments from the House of Saud; the United States was living through a period of double-digit inflation; and the nation was saddled with a bungling president whose only response to the Soviet military action was boycotting the Olympics.
The Cassandras warned of even more dire days ahead. But in 1980, an actor from California who became the state’s governor was elected president of the United States. He exuded hope about the future, and that hope was infectious. Ronald Reagan described the United States as a shining city upon a hill and, despite his many detractors, lifted the nation out of doubt.
Analogies are usually faulty. Surely this moment is different from 1980, but it would be a mistake to underestimate national resilience and the role a leader can play in elevating the spirit in the body politic. There are days when gloom is a mist in the country’s air. I understand why so many are convinced the best of times are in our past, but I don’t buy this line.
Paul Valéry was right when he said, “the future isn’t what it used to be.” Alas, the future is what we make of it. An inspirational leader can awaken a dormant national esprit. Notwithstanding all the problems we face, the United States is still a model of liberty for people across the globe.
When those courageous Chinese freedom fighters jammed Tiananmen Square in 1989, they didn’t build a statue of Muhammed or Chairman Mao. They constructed a Statue of Liberty. It is our liberty that they wanted to emulate. From the condition of liberty we often take for granted springs our strength and our endurance.
Yes, I am an unapologetic optimist, admittedly guarded. But my view is grounded in reality. As I see it, in a world of manic pessimism, my realism seems like manic optimism. I wonder if that could be a bumper sticker.
Herbert I. London is a senior fellow at the Manhattan Institute and president emeritus of the Hudson Institute.
The 1990s were a decade to make you believe there was no such thing as an intractable problem. We had defeated the Soviets. We won the Persian Gulf War in a matter of weeks and quieted ancient ethnic furies in the Balkans. At home, we beat back crime. Welfare reform was a success. We seemed to have the business cycle figured out. It wasn’t really until the financial crisis of 2008 that we were reminded of what it means for things to be utterly out of control. It was a calamity for which no remedy presented itself; even if we had made the best decisions, it might still have ended in catastrophe. Everything since—the spiraling debt, the persistent unemployment, the sense we might be on the precipice of another collapse—has been a great humbling.
I still tend to be an optimist about most of what dominates our public debate. Over time, the economy will recover. One way or another, we’ll bring the deficit under control. We’ll reform entitlements, inadequately and clumsily, but reform them nevertheless. Our international power will diminish, yet we’ll still be far ahead of any competitor. The American public has shown an admirable resistance to the vast designs of the Obama administration, and I expect President Obama to be either defeated or even more hemmed in during a second term than he is now.
What makes me pessimistic about our future is what nearly no one talks about: the breakdown of marriage and associated bourgeois institutions and virtues in what sociologist Brad Wilcox calls “the solid middle”—those Americans, representing 58 percent of the adult population, who have graduated from high school but don’t have a four-year college degree. Illegitimacy started its corrosive march from the bottom decades ago, but it has steadily crept up the income scale. Among those without a high school degree, the rate is 54 percent; among the solid middle, it’s 44 percent. Marriage and traditional sexual mores have made their last stand among the highly educated (people with a four-year degree or more), reversing everything we thought we knew about the supposed decadence of the elite. Their illegitimacy rate is only 6 percent, and they are less likely to divorce or commit adultery. The solid middle is becoming de-institutionalized. Its members are less likely to go to church or get involved in civic institutions than they were 30 years ago. The middle is thus losing crucial stores of social capital just as—in an interrelated trend—the economy offers fewer ready opportunities, especially for its men. We are witnessing a slow-moving social catastrophe that is mostly ignored, especially on the right.
It has become a mantra among conservatives—echoing a point originally made by Charles Krauthammer—that decline is a choice. But this social decline is not. Even those sounding the alarm about these trends offer few plausible answers for how to check them. How do you recover a culture of marriage once it’s been lost? How do you counteract the baleful side effects of globalization and automation? We seem to be heading inexorably in a direction that threatens our identity as a mass middle-class society. We’ll become more stratified and less mobile, with long-term political consequences that are impossible to predict, except that they can’t be good. William Dean Howells said that Americans love a tragedy so long as it has a happy ending. This is a tragedy that won’t end well.
Rich Lowry is editor of National Review.
Heather Mac Donald:
In Seoul, South Korea, thousands of people sequester themselves for months and years at a time in “Exam Village” to study for grueling professional tests. In China, tiger parents push their children relentlessly to succeed. American teens, meanwhile, are definitely good at socializing.
As waves of Asian engineers and computer scientists lap at our shores, it’s hard not to despair at the educational apathy of many American students. Placing all the blame on schools for our listless academic performance ignores some unpleasant truths. Yes, the reign of progressive pedagogy means that American students spend much of their time in dopey “group learning,” allegedly creating their own knowledge (translation: talking about last weekend’s parties), rather than interacting with a teacher who demands attention and conveys hard facts. Yes, America’s fear of not being “inclusive” has redirected focus away from high achievers to the bottom rung. But if you dropped a Chinese student into a mediocre American classroom, my guess is that he would still learn, and he would certainly outlearn his peers, at least until he succumbed to the anti-intellectual student culture.
One of the reasons why educational effort is so fierce in the Far East and Southeast Asia, however, is that economic opportunities are more constricted there. Corruption and crippling red tape in many exam-driven cultures make it far harder to start a business, resulting in bottlenecks of talent. Americans take for granted the absence of endemic corruption in our political system, but it represents one of the great triumphs of Western civilization. However oppressive it can seem to comply with the Clean Water Act or the California Coastal Commission, at least an entrepreneur usually doesn’t have to pay off his local environmental inspector and other parasites to get a building permit. And while the thousands of regulations that pour out of federal agencies every year absorb senseless amounts of a businessman’s time, they are miracles of efficiency and minimalism compared with the Indian bureaucracy.
So for the moment, let’s be optimistic—provided the United States can expand its deep-seated advantages of the rule of law and a culture of entrepreneurship. In the long run, however, if the rising economies in the East can reform their corrupt and backward governments, the discipline of their populations in the fanatical pursuit of knowledge could well leave the United States as a pop-culture-addicted also-ran. It’s time to junk the communitarian agenda of progressive education and to embrace competition and grouping by ability in schools. Vocational training should be rehabilitated from its unjustified ignominy, and the idea that everyone is capable of and should pursue a college degree should be recognized as the fantastical pipe dream that it is. Most important, however, we should acknowledge that learning requires focused, disciplined work to master a body of knowledge that exists independently of a student’s overrated need for self-actualization.
Heather Mac Donald is a fellow at the Manhattan Institute and a contributing editor to City Journal.
On the whole, I am optimistic about America’s future. But I do not take “optimistic” to mean that things are bound to get better, or even that they have a tendency to do so. Rather than try to predict, it is better to understand things as open to prudent improvement and thus be opportunistically hopeful of America’s prospects.
A big choice lies ahead for America, in which the entitlements we have voted for ourselves now threaten us as if they were our unchosen fate. They are called entitlements because they were supposed to have been chosen for good, past recall, and thus put “beyond politics.” What you are entitled to will no longer be subject to dispute. Now it appears we cannot pay for them, and not just arguably but indisputably. Democrats, who first proposed them, are beginning to agree on this point with Republicans, who at first opposed them. Very few want to abolish entitlements; most Republicans want only to change their terms so as to make them affordable. Still, to change them at all robs them of their character as entitlements and sets a precedent for future changes that might restrict them further. They become mere benefits without the security of special protection in the sanctuary of nondiscretionary payments.
Democrats established entitlements to provide “social security” against the risk that people would not save enough voluntarily to provide for their retirement. This was security against our citizens’ lack of the virtue of thrift. Yet if you did save enough, your savings might be lost or reduced through the uncontrollable action of the market, “market failure.” Recourse to government is the cure for risk arising from personal or impersonal forces that people feel impotent to control. But government has transformed itself from an instrument of control into an uncontrollable force of its own, unwieldy, with its own inertia and mindless direction. Its public servants serve themselves first, setting the example for the rest of us: their security comes ahead of the country’s. A mountain of debt testifies to the inability of government to control itself. People have lost confidence in their instrument and therefore in themselves. Self-government looks like it doesn’t work.
The need to recover control, most evident in domestic matters, is paramount. In foreign affairs America has been moderately successful, due in good part to its military prowess, whether employed with gusto by Republicans or apologetically by Democrats. The entitlements are the problem. The mentality they produce is just what President Kennedy decried in the line “ask not what your country can do for you.” A controllable government needs to be both limited and energetic: limited to benefits that do not make dependents of our people and energetic when it must act. With this goal we can reasonably look to America’s future with hope.
Harvey Mansfield, a recipient of a 2011 Bradley Prize, is professor of government at Harvard University and senior fellow at the Hoover Institution.
Wilfred M. McClay:
The answer can’t be arrived at by reason alone, for there are always as many reasons for pessimism as for optimism. In addition, history is full of surprises and unexpected potentialities, like a board game in which new pieces are constantly introduced. What gives one heart, though, is the recognition that challenging events have in the past called forth unsuspected strengths in the American people, a lesson that both Imperial Japan and Osama bin Laden learned to their dismay.
In the end, everything depends on the character of the American people. About that, one simply can’t be sure, although one’s faith is better placed in the people than the elites. One can be encouraged by the astounding upsurge of sheer dogged resistance from ordinary Americans to the statist agenda of the Obama administration. No one would ever have predicted such a thing in 2007 or 2008. Yet one also can be appalled by the foolishness and gullibility of the American people in electing such an unknown and ill-prepared man to the presidency, and investing such preposterous hopes in him. Which qualities of character will predominate at the polls when the question is something difficult but essential, such as the dramatic reform of Medicare and other entitlements?
There is one tectonic shift, however, that may make a positive response more likely than anyone might have thought possible even five years ago. For most of my lifetime, advanced minds have tended to advertise themselves in the United States by delivering national chastisements beginning with the following words: “The United States is the only major nation in the developed industrialized world that does not have…” The relevant noun was always some feature of the all-embracing social-welfare state that so many European nations adopted over the course of the 20th century. Universal health care, generous paid maternity leave, indexed pensions: the list went on and on. The blood and treasure of the United States may have saved European democracy, but, as the tut-tutting admonishment was meant to remind us, Americans remained seriously backward in comparison with the European social-democratic model.
Well, given the highly visible and accelerating problems of what has now proven to be an unsustainable model, there is a new way to complete the sentence. The United States is now the only major nation in the developed industrialized world that does not face inevitable decline, caused by the enormous and insuperable barriers to growth and prosperity imposed by an impossibly expensive social-welfare apparatus saddled on feeble economies supported by ever-diminishing populations. America has a chance to avoid this fate. And if we do survive and thrive, it will be because of our resistance to the very advanced ideas that have condemned our cousins in the United Kingdom and elsewhere to a future of steady, grinding diminishment.
Obama was educated by those who have lived according to the motto “America is the only developed industrialized country that does not have…” and he was their dream candidate. With his ascent to the presidency, and his muscling-through of cardinal pieces of state-aggrandizing legislation, it seemed that America was finally on the verge of shedding its unwanted exceptionalism. Yet world-historical timing has worked against Obama, and his moment has already passed. It has never been clearer, thanks to the inescapable empirical examples across the Atlantic, that America should not go that route. Far from being the heralded prophet of a new America, Obama represents yesterday’s visionary of tomorrow, the last gasp of dated and fatally flawed ideas.
Wilfred M. McClay is the SunTrust Chair of Excellence in Humanities at the University of Tennessee, Chattanooga.
Optimists foresee a future that brings Americans better options, while pessimists insist we will use those options to make worse choices.
Hope merchants assume that the relentless pace of technological advancement and globalization will inexorably foster more opportunities for entertainment, education, and employment, while gloom peddlers worry that the new possibilities will paralyze the populace or else appeal to destructive instincts that send society toward a downward death spiral.
Consider, for example, recent developments in the elemental area of fast-food cuisine: a generation ago, greasy hamburgers and watery milkshakes were the only options; now shopping-center food courts provide a constellation of exotic offerings, including Thai, Indian, Mediterranean, Cajun, and aromatic coffee from multiple sources. Awash in these appetizing alternatives, consumers show an unfailing preference for unhealthy food, fueling an “obesity epidemic” that alarms public-health authorities.
Or review trends in electronic entertainment, where the iron tyranny of the three broadcast networks gave way to a dazzling array of enriching selections on cable, the Internet, and in educational video games. A disproportionate segment of the audience nonetheless spends leisure time in regular communion with The Jersey Shore or Dancing with the Stars. Meanwhile, young adults confront a titillating menu of intimate arrangements, including blended families, same-sex marriage, single parenting, and premarital, postmarital, or extramarital cohabitation. In response to these novel choices, at least one-third of American children grow up in unstable living arrangements with predictably bad consequences for the kids and society. As the great philosopher Janis Joplin once warbled, “freedom’s just another word for nothin’ left to lose.”
In the long run, however, good news will overwhelm the bad because new opportunities inevitably influence everyone, while destructive or beneficial choices vary according to the segment of society or moment in history. My son, for instance, recently made a sound selection for his first car: an inexpensive, fuel-efficient, safely engineered marvel from Hyundai Motors of South Korea. The very idea of top-flight automotive production in Korea remains an amazement, considering the utter devastation in that formerly underdeveloped country after Japanese occupation and an unspeakably bloody civil war.
And the Korean miracle, like most other positive developments of the last hundred years, stemmed from American sacrifice (39,000 of our finest young men) and imported American ideals to such an extent that skeptics now see countries we once rescued as outdoing us in virtues traditionally associated with the United States: entrepreneurial energy, social mobility, technological and cultural innovation.
This rise of formerly blighted societies in Asia and Latin America may indeed produce new competitors and a far more multipolar world (especially in comparison with the near universal devastation that surrounded us after World War II), but there’s no evidence of a looming replacement for America’s role as international leader and the planet’s single indispensable power. Visions of Chinese dominance ignore inherent instabilities in Beijing’s authoritarian government and contradictions within its economic model. Fifty years ago, Americans worried about being displaced by Khrushchev’s “We will bury you!” Soviet Union, and 30 years ago prophets of doom anticipated the global supremacy of “Rising Sun” Japan. More recently, serious observers saw united Europe as the coming global superpower, but European unity today looks like not only a dubious blessing but also a questionable reality.
For all our problems, America retains more sources of national resilience than any potential rivals do: a growing population, continuing attraction for immigrants, natural resources, a durable sense of mission, and robust political institutions. Even the much-derided gridlock in Washington provides an example of American vigor rather than decadence: the emphatic push-back against Barack Obama’s desired transition toward a European-style welfare state shows our system operating in the way our founders intended, and avoiding sudden, wrenching change in either a leftward or rightward direction.
The best news about America over the past decade involves what didn’t happen, rather than what did. In the decade following the September 11 attacks, we experienced neither a major terror assault nor a meaningful loss of civil liberties. The theocratic takeover by the Christian right, so widely feared, never materialized; nor did the collapse of religious faith that secularists ardently desired. The United States defies conventional logic by remaining both the most religiously engaged society of the West (2011 figures suggest 40 percent still attend services weekly) and the most accepting of even novel and exotic forms of faith. Most notably, surveys show that ordinary citizens maintain a hearty sense of American exceptionalism and cherish their country’s distinctive blessings and positive role, despite several decades of political correctness meant to foster national guilt.
The case for American optimism remains unshakable, because worldwide multiplication of personal possibilities remains unstoppable. Yes, many people will elect to abuse new chances by making foolish choices, but freedom and opportunity represent important values in and of themselves, and Americans will almost certainly continue making better choices than most.
Michael Medved hosts a nationally syndicated radio talk show and is the author, most recently, of The 5 Big Lies About American Business (Crown Forum).
It has become common to contrast the sunny optimism of Ronald Reagan, the 40th president of the United States, with the pessimism of our current president, Barack Obama. “Morning in America” versus disbelief in American exceptionalism. This is too simple a contrast, of course, and I do not, in fact, believe that pessimism about America is Obama’s problem. His problem is the condescension and arrogance with which he too often approaches his fellow citizens. In any case, I want to approach the optimism/pessimism contrast from three different angles.
First, and answering the question most directly, I am optimistic about America as a political community but rather pessimistic about America as a cultural community. Contrary to the constant calls that we hear for an end to partisanship, partisan politics serves us well. Disagreement and argument are essential to the health of a free people, and, unfortunately, many of those most given to regarding diversity as an undoubted good are the least willing to tolerate disagreement. But as long as we remain free to argue about our political aims and policies, I suspect we will not go too far wrong. Nevertheless, it does take a certain kind of citizen to engage in American politics, and too many of our children are growing up in a culture of failed marriages and broken homes. Such cultural disintegration does not produce the trust or trustworthiness that democratic politics requires. How the political and the cultural interact will in large measure shape our future.
Second, claiming a measure of agnosticism seems to me the right way to respond to this question. America’s future is finally in the providence of God, not in our hands. In the greatest political speech ever given in our country’s history, Lincoln—while fondly hoping and fervently praying that the bloody Civil War might cease—left the question of its duration up to the true and righteous judgments of the Lord. That seems right to me. What we need is not so much optimism or pessimism, but a willingness to carry out the public and private tasks set before us with care and devotion: “firmness in the right, as God gives us to see the right.” The results can be left in the hands of those more knowledgeable than us.
Finally, we can grant that there are plenty of political reasons for pessimism: an economy in which many people may be permanently unable to find work, the racial divide that has burdened our entire history and still does, the threat of Islamism around the world but especially in the Middle East, an aging population that is setting us up for a clash of generations. What we need in the face of such difficulties is not optimism but hope, and they are not the same. As G.K. Chesterton noted, external conditions can never—in good times or bad—give sufficient reason for hope. We need the virtue of hope precisely when circumstances seem to offer no grounds for optimism. “For practical purposes it is at the hopeless moment that we require the hopeful man, and the virtue either does not exist at all or begins to exist at that moment. Exactly when hope ceases to be reasonable, it begins to be useful.” Which means that the question that most needs our reflection is: How does one elicit, nourish, and sustain the virtue of hope?
Gilbert Meilaender is the Duesenberg Professor in Christian Ethics at Valparaiso University.
Polls show widespread pessimism about America’s prospects. Such moods reflect the slow growth and fiscal problems that followed the 2008 financial crisis, but they are not historically unprecedented. After Sputnik, Americans thought the Soviets were 10 feet tall; in the 1980s, it was the Japanese. Now it is the Chinese.
The United States has very real problems, but the American economy remains highly productive. America remains first in total research-and-development expenditures, first in university rankings, first in Nobel prizes, first on indices of entrepreneurship, and fourth in the World Economic Forum’s list of the world’s most competitive economies (China ranks 27th). America, moreover, remains at the forefront of such cutting-edge technologies as biotech and nanotechnology. This is hardly a picture of absolute economic decline.
Some observers worry that America will become sclerotic like Britain, at the peak of its power a century ago. But American culture is far more entrepreneurial and decentralized than was that of Britain, where the sons of industrial entrepreneurs sought aristocratic titles and honors in London. And despite recurrent historical bouts of concern, immigration helps keep America flexible. In 2005, foreign-born immigrants had participated in one of every four technology start-ups in the previous decade. As Singapore’s former Prime Minister Lee Kuan Yew once told me, China can draw on a talent pool of 1.3 billion people, but the United States can draw on a talent pool of 7 billion and recombine them in a diverse culture that enhances creativity in a way that ethnic Han nationalism cannot.
Many commentators worry about the inefficient American political system. It is true that the Founding Fathers created a system of checks and balances to preserve liberties at the price of efficiency. America, moreover, is now going through a period in which party politics have become very polarized, but nasty politics is nothing new and goes all the way back to the Founders. American government and politics have always had problems, and, though it is hard to remember in light of the current melodramas, they were sometimes worse than today’s.
The United States faces serious problems regarding debt, secondary education, and political gridlock, but one should remember that they are only part of the picture. In principle, and over a longer term, there are solutions to current American problems. Of course, such solutions may forever remain out of reach. But it is worth distinguishing problems for which there are no solutions from those that could, in theory, be solved.
Whether Americans seize the available solutions is uncertain, but Lee Kuan Yew is probably correct when he says China “will give the U.S. a run for its money” but not pass it in overall power in the first half of this century. If so, the gloomy views reported in the latest polls will turn out to be as misleading as those in decades past.
Joseph Nye is a professor at Harvard and the author of The Future of Power (Public Affairs).
When America’s future looks grim—and it’s seldom looked grimmer—I take no comfort in the splenetic pronouncements of talk-show hosts or the equivocations of pundits, all of whom reinforce a stubborn sense of despair. It takes bloody-mindedness to be an optimist. Optimism is a bit like religious belief—a faith in things unseen. But such faith is meaningless if it doesn’t take a hard look at things seen. No harder look has ever been cast on our republic than Walt Whitman’s in the years following the Civil War.
Whitman believed fervently in American spiritual energy, in that astonishing capacity we possess for ceaseless reinvention of ourselves. He had no rosy illusions. America, he warned, could yet prove to be “the most tremendous failure of time.” He wrote in Democratic Vistas, that scathing prophecy of 1871:
Never was there, perhaps, more hollowness at heart than at present, and here in the United States. Genuine belief seems to have left us. The underlying principles of the States are not honestly believed in, (for all this hectic glow, and these melodramatic screamings) nor is humanity itself believed in. What penetrating eye does not everywhere see through the mask? The spectacle is appalling. We live in an atmosphere of hypocrisy throughout. The men believe not in the women, nor the women in the men. A scornful superciliousness rules in literature. The aim of all the littérateurs is to find something to make fun of. A lot of churches, sects, etc., the most dismal phantoms I know, usurp the name of religion. Conversation is a mass of badinage. From deceit in the spirit, the mother of all false deeds, the offspring is already incalculable.
I’ve lived abroad now for some 25 years, and my perspective on America may be skewed. But it isn’t the obvious dangers that America faces—terrorist attack, fiscal collapse—that most get me down but something humbler, less catastrophic, and yet more insidious. I think of it as the death of discourse. Nowadays, even among friends, a dissenting opinion is met not with rebuttal or debate but with stony silence or Whitman’s “melodramatic screamings.” Conversation on any serious topic is no longer even a “mass of badinage” but an occasion for sniffing out “deviant” views and affixing labels.
I grew up in the South in the bad old times. During Sunday dinners, my family, all Atlanta-born, refought the Civil War, sometimes bitterly. My mother and brother and I displayed disagreeable “Yankee” tendencies: we proclaimed segregation evil. When I went so far as to praise William Tecumseh Sherman, a mighty rumpus ensued. Still, we voiced our beliefs, we raged and we wrangled, and in the end we were reconciled in mutual affection. What has happened in America that no common ground—the simple assumption of good faith, if not of affection—seems open for civil discourse?
If I remain optimistic about the future of America, even against the odds, it’s because I share Walt Whitman’s belief that we still provide “full play for human nature to expand itself in numberless and even conflicting directions.” But for that to occur, we need to learn how to listen to one another once again.
Eric Ormsby is a writer in London whose most recent books include Fine Incisions: Essays on Poetry and Place (Porcupine’s Quill) and The Baboons of Hada: Selected Poems (Carcanet).
Americans remain intoxicated by the possibilities of the future, untrammeled by economic convulsion, and undeterred by the persistence of enemies. While indicators have declined during this corrosive recession, only 31 percent of Americans polled by Pew last year were pessimistic about the next 40 years. Consider that the same poll also found that most Americans believed we would face another world war in the next 40 years and that there would be a major terrorist attack on the United States involving nuclear weapons by 2050. In other words, despite bets on a probable world war or nuclear terrorist attack on our nation, most Americans think life for their family, our country, and the U.S. economy will be better. A little odd, no?
But it isn’t really odd. Too many think of the nation’s founding as a desperate escape from the onerous bonds of a greedy monarch; for the Founders, however, the notion of America was much more. Indeed, the description of America as a “shining city on a hill” did not begin with Ronald Reagan but with John Winthrop, at the founding of the Massachusetts colony. Americans have long regarded themselves as being in the vanguard of human history, destined for greatness. The Virginia colonists immodestly set their western border at the Pacific Ocean. When the White House extols the virtues of “leading from behind,” it swims against four centuries of the American tide. And most Americans still hold a firm conviction that their country is something special; that their children’s lives will be better than their own; that come what may, the country will explore new frontiers and expand what Thomas Jefferson called the “empire for liberty.”
If there is any nation that can resist the siren song of retreat and decline, it is this one. A country that continues to believe that life will be better after a nuclear attack is a country that believes in its own future. That belief remains the foundation of America’s power. To be sure, the edifice needs a little work. Our government now wastes “the labors of the people under the pretense of taking care of them,” as Jefferson warned it might. A political culture of exceptionalism and individual rights has given way to one of apology and group grievance. We seem embarrassed still to be the “sole superpower” and impatient for the “rise of the rest.”
Such lassitude will not last. Americans have always found within themselves the strength to rise from adversity, to take, as with Lincoln at Gettysburg, “increased devotion” to the “great task remaining before us.” America is forever “an unfinished work,” one “nobly advanced,” but with greater nobility ahead.
Danielle Pletka is the vice president of foreign and defense policy studies at the American Enterprise Institute.
I am both optimistic and pessimistic regarding America’s future. Here are my reasons for pessimism: first, the unique American values system, what I call the American Trinity, is under assault. These three values are announced on every American coin: Liberty, E Pluribus Unum, In God We Trust. The left has declared war on all three. It seeks to replace Liberty with equality (of result), E Pluribus Unum with multiculturalism, and In God We Trust with secularism. America is being transformed—candidate Barack Obama’s favorite word for what he sought to do to America—into another Western European country, the left’s model of a great society.
Second, the primary purpose of high schools and colleges—and increasingly, even elementary schools—is to turn the students into secular leftists. Many of these graduates know what the climate will be like in 2080 but don’t know who Stalin was, let alone who Cain and Abel were. They are proficient at using condoms and recycling, but little else. They have been taught nothing of American exceptionalism and would likely find the term incomprehensible, if not repulsive. They would save their dog before a human they didn’t know because morality is a matter of feelings, and they feel more for their dog.
Third, the expansion of the state has produced a new American. This American believes in rights more than in obligations and that the state should take care of him, his parents, his children, and his neighbor.
Fourth, the melting pot of Americans has been replaced by a patchwork quilt of Latinos, African Americans, and other identity groups, all of whom are victims of an oppressive sexist, racist, intolerant, Islamophobic, xenophobic society.
Fifth, half or more of the Jews and Christians who attend synagogue or church are led by a priest, minister, or rabbi who preaches not about their sins but about America’s.
Sixth, civilization’s single most important institution, marriage, is increasingly regarded as pointless and is being redefined for the first time in history to include members of the same sex. Why? Because the notions that marriage is sacred and that men and women are intrinsically different—a difference that carries unique significance—are depicted as patriarchal, anachronistic, and sexist.
And seventh, most American Jews are on the wrong side of this American divide. They do not even understand that an America that abandons her unique values will join most of the rest of the world in abandoning Israel. And many, incredibly, do not even care.
Now, my reasons for optimism:
Many Americans have finally awakened to the threat posed by leftism. They understand that the bigger the government, the smaller the citizen; that the death of God leads to the death of objective moral standards; and that the Marine Corps, not the Peace Corps, is the greatest force for world peace. And they are fighting to reassert small government, Judeo-Christian values, American exceptionalism, and a strong military, and to undo the Balkanization of America.
If these Americans win the next presidential election, I will be optimistic…about America. But the world is another matter.
Dennis Prager is a nationally syndicated radio talk-show host and columnist. His next book, Still the Last Best Hope (HarperCollins), will be published in 2012. His latest project is the Internet-based Prager University (prageru.com).
I sit down to ponder whether there’s cause for optimism about America’s future on a day that brings proofs, as the days regularly do, that a significant number of Americans—Americans of all ages—live lives bereft of any sense of identification with the nation, with all that it has been and all that it is. This is not a cause for optimism. True, that sense of identification still lies entrenched in the hearts of most Americans, but there is no missing the fact that it would not have been necessary to make grateful note of that point, say, 40 years ago. Decades of revisionist history taught as revealed truth in high schools and universities have taken their toll, decades in which students have learned, at the hands of politically progressive instructors—there are precious few of any other kind in most institutions of higher learning—to view their country as a rapacious exploiter of the poor and the oppressed, and fierce enemy of justice and truth.
But we know all this. We’ve known it for a long time—books galore have been published and conferences held on the transformation of our campuses into centers of indoctrination and thought reform. What we’ve not quite grasped are the insidious consequences of decades of this learning, which has sent countless graduates into the world armed with a degree and an education shaped by poisonously distorted views of their nation, its history, and its values. These are the graduates who now people our media, of course. It was from the political swamplands of such learning that the current president of the United States came as well. No need to ask why the members of our media took so easily to candidate Barack Obama: they had gone to the same schools and shared the same assumptions.
The more dramatic impact of this learning comes in the form of views that most Americans, fortunately, still look upon as aberrations. Few people take the 9/11 Truthers seriously, and rightly so—but that their view has taken even as much hold as it has is altogether telling. These middle-aged and older Americans have found in this deeply held faith—that American leaders arranged to have 3,000 American citizens slaughtered—an outlet for their fixed idea that the U.S. government is a source of evil and an enemy to fear. To encounter any Truther in standard mode is, of course, to witness psychological disturbance of a familiar kind—a kind not far different from the sort found in people who believe that the CIA has implanted radio transmitters in their teeth to control them.
What is significant about the Truthers is the reach of their views: all sorts of Americans can now be found entertaining the possibility that America could well have been responsible for 9/11. Academics, entertainers—too many are now drawn to a view that would have been limited to the clinically deranged 50 years ago. The belief that the United States planned the 9/11 attacks testifies to an unparalleled hostility toward the nation, not just its government. And such belief can now be pronounced aloud and considered an acceptable—indeed distinguished—viewpoint.
So it happened that I could hear, the week I write this, Tony Bennett’s views, offered on Howard Stern’s radio show. Bennett said that the United States, which had bombed other countries, had “caused” 9/11. He let it be known, as well, that we—not the people who flew the planes into the buildings—were the terrorists. We hear voices like this regularly these days. They’re worth noting because of what they represent. So, too, if we’re looking for a bright side, are the splendid tides of outrage with which Americans respond to this preening pathology.
Dorothy Rabinowitz writes on politics and culture for the Wall Street Journal.
We live at the end of an era—at a time when the old order can no longer be sustained and a new set of arrangements has yet to emerge. It is a time fraught with discomfort, distress, and anxiety. Millions of Americans are looking for work; millions more have given up the search; and further millions are underemployed. All of them are having trouble making ends meet, and those fortunate enough to have steady work fear that a market collapse, rampant inflation, or a government desperate for revenues will deprive them of their savings.
This is also, however, a time of unparalleled opportunity. It helps that Americans are no longer in denial. They now know that there is no such thing as a free lunch, and that the entitlements regime begun under Franklin Delano Roosevelt’s New Deal, vastly expanded under Lyndon Baines Johnson’s Great Society, and expanded again, at least in prospect, under Barack Obama’s New Foundation is unsustainable. It is now possible for a presidential candidate to describe Social Security as a gigantic Ponzi scheme without ruining his prospects, because everyone understands that the money in the so-called trust fund was spent by Congress long ago, and hardly anyone under 50 seriously expects to get Social Security upon retirement in his mid-60s. Everyone is aware, moreover, that Medicare is insolvent, that we cannot pay for Medicaid, and that the cost of health care is soaring; and most Americans recognize that Obama’s attempt to expand the sphere of public provision will, if not repealed, make matters considerably worse.
The presumptions that sustained the administrative state have also been exposed as lies. The experts on the President’s Council of Economic Advisers and those in charge at the Department of the Treasury and the Federal Reserve Board have repeatedly been proven wrong. When the president of the United States consulted his advisers and claimed in early September that his “jobs bill” would reduce unemployment and that it could easily be paid for, hardly anyone, even in his own party, believed a word. Distrust in the federal government is at an all-time high.
All of this is a blessing in disguise. As a people we were far worse off when we were prey to the illusion that we would be better off if we outsourced provision for our welfare to an administrative elite empowered to manage every detail of our lives. Our liberation from this illusion means that we can begin to dismantle the administrative entitlements regime; that we can return to the states and the localities the functions that are properly theirs; that we can refocus the federal government on the limited but vitally important tasks that the Constitution reserves for it; and that we can restore to individuals and families the obligations, responsibilities, and liberties that are properly theirs. The transition will be painful, but prosperity and low unemployment will return if we limit the burdens that public provision and administrative regulation place on private initiative and if we create a legal regime favorable to entrepreneurship—and morally, in taking responsibility for our own well-being and that of our families, we will be much better off.
Paul A. Rahe is a professor of history at Hillsdale College and the author, most recently, of Soft Despotism, Democracy’s Drift (Yale).
Mark me down as an American optimist. True, we face many challenges: the fiscal crisis of the modern welfare state, the end of American military super-hegemony, an elite culture bent on dismantling the Judeo-Christian moral consensus. Add our present economic woes, which seem intractable, and only a naïf could fail to conclude that we face real problems posing real threats. Nonetheless, I remain convinced that America will remain a vital, attractive, and immensely powerful nation in the coming decades.
The overwhelming majority of Americans—elite, middle class, and working class—are visceral patriots. We’re critical, we find fault, we anguish over our racist past, but the Declaration of Independence continues to express what we believe. This fact about America—the fundamental, deep, and rock-solid legitimacy not only of our system of government but also and more important of our common myths and civil religion—gives us an incalculable strength over and against any of our competitors on the global stage.
The American myth, moreover, has a remarkable—an unprecedented—absorptive power. It reabsorbed a defeated South after the Civil War. It absorbed and still absorbs waves of immigrants, even the children of ex-slaves, whose suffering and humiliation should have made them eternal enemies. A decade ago at my church, one of the elderly black members wept as he watched a documentary about the Tuskegee Airmen, black pilots in World War II who had to endure Jim Crow while training in the South. “How,” he said to me afterward, “could our country have been so unjust to those men?”
Our country! I defy anyone who understands the anguish of that man (who had himself grown up under Jim Crow!) to be anything other than an American optimist. Deficits, unemployment, new international threats, the fraying moral fabric of society—has any generation, any nation not faced these or similar challenges? A country doesn’t “solve” these sorts of problems but rather meets, ameliorates, and endures them. In these times of threat (and we certainly live in such a time), a nation is only as strong as its common culture, and ours is very strong, very strong indeed.
It’s easy to miss the forest for the trees. My elderly friend at church is a rock-ribbed Democrat, and I have little doubt that he disagrees with me about how to solve our present fiscal woes. Other friends think me a religious fanatic in my opposition to same-sex marriage, easy divorce, and abortion on demand. Still others have dreamy ideas about global conflict, the United Nations, and international law. They take the Rodney King approach to national defense: “Why can’t we all just get along?”
Their views and those of others on the left are wrongheaded, and if they control our national future we’ll suffer accordingly. But that a nation will at times be hobbled by its own stupidity is almost inevitable. What makes us great is the fact that underneath our political and moral debates we have a healthy, robust common culture, a backstop, a bottom line.
Osama bin Laden was stupid enough to imagine that America’s all too real and obvious corruptions—our wanton hedonism, our empty materialism, our reality-TV political culture, our supine, bleating efforts to placate enemies with our vast treasure rather than meet them with military resolve—constitute our national essence. He was very wrong. As we face and fight these corruptions, let’s not make the same mistake.
R.R. Reno is editor of First Things.
Readers of Commentary surely need few reminders that pessimism about America’s future is as old as the republic. “We shall soon see the country rushing into the extremes of confusion and violence,” wrote historian and playwright Mercy Otis Warren—in 1788. Forecasts of decline and fall have been a recurring staple of our political discourse ever since. They have always been wrong. They are wrong again today.
What is it about the present moment that inspires so much gloom? Previous generations of Americans have endured deeper recessions, waged costlier wars, suffered worse social maladies, incurred larger debts (at least as a percentage of GDP), faced tougher foreign competitors, and made graver policy mistakes. And elected worse presidents: nothing Barack Obama has done in his 33 months in office quite matches the malfeasance of James Buchanan or the obtuseness of Herbert Hoover or Jimmy Carter. And like those presidents, Obama looks increasingly like a one-termer—assuming, that is, that he has a competent opponent next fall.
Americans might also take comfort in the fact that Obama’s record as president so far amounts to a remarkable mix of defeats, retreats, and Pyrrhic victories. His bid to impose a cap-and-trade carbon-emissions scheme went nowhere, as did his union-friendly card-check legislation, as did the public-option piece of his health-care plan. He abandoned his efforts to close Guantánamo and try terrorists in civilian court. He gave up on trying to woo Iran and bully Israel. He agreed to an extension of his predecessor’s tax cuts. He made stimulus a dirty word. ObamaCare is the most unpopular legislation in memory and may soon be overturned by the Supreme Court. He led Congressional Democrats to a historic midterm defeat.
None of this has done more than contain the damage Obama’s presidency might otherwise have wrought. But it tells us important things about America. It turns out that the cult-of-personality style of politics that served Obama well as a candidate quickly lost its charm once he was in office. It turns out that the pride we felt in electing a black president didn’t translate into guilt when it came to criticizing his policies. It turns out that a political moment that supposedly heralded the death of conservatism was nothing of the sort. It turns out that Americans have an innate suspicion of loose monetary policy, intrusive government regulation, bullying unions, socialized medicine, and runaway deficit spending.
In short, America’s political culture remains in excellent health, free and frank and largely unencumbered by the shibboleths and taboos that paralyze Europe and Japan. And a healthy political culture is what, after the inevitable fits and starts, will ensure that we return to a growth economy, contain the entitlement state, loosen the death grip of public-sector unions, fund a military adequate for our strategic purposes, assimilate immigrants, and so on.
Now, if we can just bomb Iran’s nuclear sites….
Bret Stephens is deputy editorial page editor of the Wall Street Journal and the paper’s columnist on foreign affairs.
In 1993 I helped William J. Bennett assemble The Index of Leading Cultural Indicators, an empirical assessment of the social condition of American society: a comprehensive statistical portrait of behavioral trends over the previous 30 years. The results were alarming: a 500 percent increase in violent crime; more than a 400 percent increase in out-of-wedlock births; a tripling of the percentage of children living in single-parent homes; a doubling in the divorce rate; and a drop of almost 75 points in SAT scores.
I believed at the time that these exploding social pathologies might lead to the decline and even the collapse of our republic.
It was right about that time that the United States, as if all at once, began to turn things around. And within a decade and a half, significant improvements were visible in the vast majority of social indicators, with progress in some areas, such as crime and welfare, taking on the dimensions of a sea change.
It was a stunning, encouraging, and wholly unexpected recovery. And I learned my lesson: do not underestimate the recuperative and regenerative powers of America.
This does not mean that success is preordained or that optimism is always warranted. And we shouldn’t for a moment downplay the challenges we face, which include reforming public institutions that were designed for the needs of the mid-20th century. Our health-care and entitlement system, tax code, schools, infrastructure, immigration policies, and regulatory regime are outdated, worn down, and insanely out of touch with the needs of our time. This has impeded economic growth, impaired the creation of human capital, and put us on the path toward an unprecedented fiscal crisis. Each of these public institutions needs to be improved and modernized, requiring structural reforms on a scale that right now seems nearly impossible to achieve.
It’s not. The necessary first step toward reform and renewal is a massive ballot-box repudiation of President Obama, his progressive agenda, and those who have supported it. That needs to be followed by the emergence of political leaders with concrete plans to replace the liberal welfare state and who possess the skill to rally the public to their cause. “Public sentiment is everything,” Abraham Lincoln said. “With public sentiment, nothing can fail; without it, nothing can succeed.”
This is no easy task. Fundamental reforms, especially when it comes to entitlement programs, will require (carefully) changing settled ways and settled assumptions. On top of that, right now Americans are anxious, unnerved, and unusually pessimistic. A recession and a failed presidency will do that to a nation. But we also continue to possess enormous strengths, economic as well as military, and great resiliency. We can take some comfort in the fact that at every important moment in American history—our founding, the Civil War, the Great Depression and World War II, the civil-rights struggle, the wreckage of the Carter years—America has produced political leaders who were up to the challenge. I’m betting it shall again.
Peter Wehner is a senior fellow at the Ethics and Public Policy Center and a managing director of e21.
The year I was born, the nonviolence champion Martin Luther King Jr. was slain by an assassin’s bullet, touching off race riots in more than 100 American cities that left 46 people dead and a trail of physical destruction still visible to the naked eye. It was the deadliest year for the United States in the Vietnam War, with more than twice as many servicemen dying as have succumbed, combined, in every U.S. military action since. Soviet tanks crushed the Prague Spring, Americans elected a future crook as president, and most right-thinking people were convinced by Paul Ehrlich’s book, The Population Bomb, that “hundreds of millions of people” would soon “starve to death,” particularly in India.
The year I turned 21, elite anxieties had moved on to Japan’s imminent takeover of the U.S. economy. Entire American cities (including New York City) had been given up as lost causes, Nelson Mandela was still a prisoner in apartheid South Africa, and then all at once the world as we thought we knew it fell on its head. As predicted by no one, imperial Communism collapsed largely without a shot, proxy superpower wars all over the globe gave way to fragile but lasting peace, and a decade of unparalleled prosperity and freedom tumbled happily forth.
The year I write this may prove to be the most momentous for human freedom since that annus mirabilis of 1989, with one authoritarian regime after another in the Islamic world coming under intense pressure from decentralized protesters demanding more liberalized lives. Even before the Arab Spring, we had already seen the number of “free” countries, as rated by Freedom House, rise from 29 percent in 1972 to 45 percent in 2010 (and “partly free” countries rise from 25 to 31 percent) and 44 new sovereignties enter or reenter the family of nations. Former mass-starvation candidates India and China are now producing yet another wave of American neuroses over competing with Asiatic foreigners, even though U.S. per-capita income, adjusted for inflation, has doubled since 1968.
It requires a surplus of myopic self-regard to gaze upon this undeniable and thrilling human advancement and proclaim a wasteland of impending decline, but we Americans have always had a difficult time distinguishing between our market share of global responsibility and the overall health of the world.
The apparently uncomfortable truth is that people everywhere are, on balance, seeking more and more freedom, and they don’t necessarily need or even want heavy American involvement in that quest. Which is fortunate for us, because we can no longer afford to take care of ourselves, let alone the rest of the world.
Like it or not, the near future will be marked by a relaxation of American geopolitical control and a resurgence in local and regional responsibilities assumed by the people who actually live there. For those of us who truly believe in the virtues of responsibility and competition, and who have an enduring faith in the irresistible lure of freedom, it is the very best of times to be alive.
Matt Welch is editor-in-chief of Reason and co-author, with Nick Gillespie, of The Declaration of Independents (Public Affairs).
Many years ago, I confidently published an essay in which I made a prediction. It was hopelessly, embarrassingly wrong. Since then I have embraced the view that social scientists should never predict; leave that job to pundits. If you doubt me, make a list of the economists who predicted the 2008 recession, political scientists who predicted the Arab Spring, or criminologists who said that this recession would be accompanied by falling crime rates. A few names may make the list, but very few.
Historians may do a better job than other scholars in making generalizations, but that is because the good ones never predict; they generalize from past experiences. Those experiences suggest that this country has been extraordinarily lucky, and they hint at some reasons for that good fortune: an adaptable government, an optimistic national character—and extraordinary good fortune (we won the Revolutionary War against a superior enemy, defeated the Confederacy despite a series of terrible northern generals, overcame the Great Depression because the Second World War increased the demand for goods and services, sent transports to confront Germany just at the time when the Nazi code had been broken, confronted an armed Japan that made every conceivable tactical mistake, and defeated Saddam Hussein by discovering that he was an incompetent military leader). We had some bad luck as well (racism and Vietnam, for example), but the good outweighed it.
It is easy to understand why Commentary would ask whether one is optimistic or pessimistic. We remain in the depths of a major recession, the nation’s debt grew by more than $4 trillion in the first three years of the current administration, our military faces unjustified cuts in its budget, many people who want to vote against President Obama feel they lack a suitable Republican alternative, the federal government (except for the military) lacks any public confidence, and most Americans think the country is on the wrong track.
It would be easy to be grumpy, but it also would not be hard to be optimistic. We face serious problems, but this recession, like all before it, will end, something will probably be done to reduce the growth in the deficit, international reality will require the maintenance of a serious military force, and somebody will run against Obama and may well defeat him. Dislike of government institutions will no doubt persist (but without any reduction in American patriotism), and the meaning of answers to the poll question about whether the country is on the right track will remain, as it is now, obscure. Take your pick.
James Q. Wilson teaches at Pepperdine University and is the coauthor of American Government: Institutions and Policies (Wadsworth).
Optimism is the very lodestar of the American experiment. We are a nation of immigrants who left behind everyone and everything we knew to take a chance for a better future. Pessimists stayed home in Europe or Asia, pulled by a history of thousands of years of living in one place as one people. Those who became Americans leapt toward a dynamic society that rewards individual talent and hard work—not social class, religion, racial differences, or proximity to government power.
We as Americans have optimism programmed into our DNA. Where others might see cause for doubt, we see opportunity. Even as the economy remains mired in recession, entrepreneurs continue to conjure forth inventions that bring the knowledge of the Library of Congress to our fingertips, cure once deadly diseases, and deliver almost any product to our doorstep in days. Even as our elected leaders overreacted to the downturn with massive spending programs and the nationalization of financial firms, car companies, and the health-care sector, a great political movement rose up to shake the establishment with demands for a return to frugality and modesty. Even as our armed forces have encountered stiff resistance in Iraq and Afghanistan, we have killed off the leadership of al Qaeda (including Osama bin Laden), midwifed an Arab democracy in the center of the Middle East, and hastened the overthrow of despots in Tunisia, Egypt, and Libya. Despite the rise of China and the return of Russia, the United States protects the peace among the great powers, keeps the channels of global commerce open, and spreads the freedom to think and worship to distant lands.
It is harder still not to be an optimist during this, the 150th anniversary of the Civil War. When president-elect Abraham Lincoln left his home of Springfield, Illinois, for Washington, D.C., seven Southern states had already seceded. Acknowledging that he “had a task before [him] greater than that which rested upon Washington,” Lincoln still declared that with the “assistance [of God], I can not fail,” and he called upon a thousand well-wishers to “let us confidently hope that all will yet be well.” Four years later, after a bloody civil war that cost 600,000 American lives, Lincoln was still an optimist. At his second inaugural, Lincoln could report his “high hope for the future,” though he would venture “no prediction” on the war’s final outcome. Still, he finished with an optimistic vision of the nation’s character:
With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds, to care for him who shall have borne the battle and for his widow and his orphan, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations.
After the most devastating war in our nation’s history, Lincoln could foresee the national greatness that lay just beyond the horizon. With this example before us, we the living can overcome temporary setbacks to continue the American experiment.
John Yoo is a professor of law at the University of California, Berkeley, and a visiting scholar at the American Enterprise Institute. He is coeditor of Confronting Terror (Encounter).
The Case for Pessimism
By Mark Steyn
In September 2009, Barack Obama and Muammar Qaddafi both addressed the United Nations. It is a pitiful reflection upon the Republic in twilight that, when it comes to the transnational mush drooled by the leader of the free world or the conspiracist ramblings of a pseudo-Bedouin terrorist drag queen presiding over a one-man psycho-cult basket case, it’s more or less a toss-up as to which of them was the more unreal.
Qaddafi spoke for 90 minutes, and in the midst of his torrent of words, his translator actually broke down and cried out, “I can’t take it anymore.” The colonel gravely informed the world body that the swine flu was a virus that had been created in a government laboratory, and he called for a UN inquiry into the Kennedy assassination on the grounds that Jack Ruby was an Israeli who killed Lee Harvey Oswald to stop the truth coming out about Kennedy being killed to prevent an investigation into the Zionist nuclear facility at Dimona.
On the other hand:
“I have been in office for just nine months, though some days it seems a lot longer,” President Obama mused. “I am well aware of the expectations that accompany my presidency around the world. These expectations are not about me. Rather, they are rooted, I believe, in a discontent with the status quo that has allowed us to be increasingly defined by our differences.”
Now, forget the first part, which was just Obama’s usual narcissistic “but enough about me; let’s talk about what the world thinks about me” shtick. It was the second part of Obama’s remarks that reveals the danger we find ourselves in, two years later, even with Qaddafi toppled and in hiding and Jack Ruby’s Israeli roots as unexamined as ever.
The thing is, for better or worse, we are defined by our differences, and if Barack Obama didn’t understand that when he was at a podium addressing a room filled with representatives of Iran, Sudan, Saudi Arabia, North Korea, Venezuela, and the whole gang of evil, the rest of the world certainly did as soon as Qaddafi appeared. Obama and Qaddafi may both have been the heads of state of sovereign nations, but if you’re on an Indian Ocean island when the next tsunami hits, try calling Libya instead of the United States for help and see where it gets you.
The global reach that enables America and a handful of other nations to get to a devastated backwater on the other side of the planet and save lives and restore the water supply in a matter of days isn’t a happy accident or a quirk of fate. It is something that derives explicitly from our political system, our economic liberty, our traditions of scientific and cultural innovation, and a general understanding that societies advance when their citizens are able to fulfill their potential in freedom.
In other words, America and Libya are defined by nothing but their differences, even though the very thought of “differences” seemed to pain the president on that day. “No nation,” he announced to the assembled warmongers and genociders, both actual and would-be, “can or should try to dominate another nation.”
As far as I’m aware, neither Qaddafi’s translator nor anyone else screamed “I can’t take this anymore” and fled the room. But someone should have. Whether or not any nation should try to dominate another, they certainly can. And they have. Nations have sought to dominate others and have succeeded at it with ease all over the planet and throughout human history.
So who’s next? According to the International Monetary Fund, China will become the planet’s leading economy in the year 2016.
If the IMF is right, in five years’ time, the preeminent economic power on the planet will be a one-party state with a Communist Politburo and a largely peasant population, no genuine market, no human rights, no property rights, no rule of law, no freedom of speech, no freedom of the press, no freedom of association. It will mark the end of a two-century Anglophone dominance, and—even more civilizationally startling—for the first time in a half millennium the leading economic power will be a country that doesn’t even use the Roman alphabet.
Whether or not this preeminent China should dominate other nations, it certainly can. And it certainly will.
If you think like President Obama and believe nations are not defined by their differences, then China’s great leap forward is not that big a deal. But if you think, like someone who has given it a moment’s thought, that nations are defined by their differences, it is a very big deal. Most immediately, it means that the fellow elected next November will be the last president of the United States to preside over the world’s leading economy. This should be a source of shame to every American. It is not. Not yet. Instead, we battle over trivialities.
Washington spent most of the summer of 2011 gripped by the debt-ceiling showdown. Cable-news correspondents stood outside the White House and the Capitol all day long, reporting the comings and goings of the movers and shakers. Everyone was agog as to whether Congressional Republicans and the administration would reach a deal before the clock chimed midnight on August 2, whereupon the president’s lavishly weaponized Canadian-manufactured black coach in which he toured Iowa would turn back into a pumpkin.
Now, just to put this so-called debt-ceiling battle, in which the Republicans were supposedly battling to secure budget cuts that would destroy the social safety net, in perspective: there was a dispute between Speaker of the House John Boehner and the Congressional Budget Office about the so-called scoring of the plan that eventually passed and was signed by the president. Boehner said the plan called for $7 billion in cuts for the 2012 budget. The CBO said the plan only reduced the 2012 deficit by $1 billion.
Which of these numbers is correct?
The United States government currently spends one-fifth of a billion dollars that it doesn’t have every hour, every day, seven days a week, 365 days a year including Thanksgiving, Christmas, and Ramadan. A fifth of a billion dollars every single hour—so the $7 billion that John Boehner calls “a real enforceable cut for financial year 2012” represents what the government of the United States currently borrows every 37 hours. In the time between the Friday announcement of the plan and the Sunday morning talk shows’ discussion of it, the government borrowed back every dime of those painstakingly negotiated savings.
On the other hand, if the CBO’s scoring is correct, and it reduces the 2012 deficit by just $1 billion, then the cut represents what the United States borrows every five hours and 20 minutes. Don’t bother waiting for the Sunday talk shows, because the savings will all be borrowed back in the time it would take you to read this issue of Commentary. But let’s give John Boehner the benefit of the doubt and concede that for a month of shuttling back and forth between the Capitol and the White House, he got a “real enforceable cut of $7 billion.”
In September, the president swanned into Congress for a nationally televised address on jobs and proposed, off the top of his head, another $447 billion in spending—a half trillion dollars we don’t have, that the world has no desire to lend us, and the majority of which will be “electronically created” by the United States Treasury selling its debt to the Federal Reserve under the policy called “quantitative easing.”
The politico-media class of this country seems to think it entirely normal that we should spend two months in tense, difficult, painstaking negotiations over how to go seven billion steps forward—and then breezily spend 20 minutes going 447 billion steps backwards. The inconsistency between the bottomless pit that supposedly awaited us on August 2 and the airy coverage of September 8 tells us a great deal about the unlikelihood of meaningful course correction in this country.
The other day a friend of mine watched the film The People vs. Larry Flynt, which tells in part of the battles between the title pornographer and a conservative activist named Charles Keating, who ran a savings and loan. The film’s final card portentously informs us that “Charles Keating was part of the Savings and Loan scandal that cost American taxpayers $2 billion.” The People vs. Larry Flynt came out in 1996. That was a mere 15 years ago. And yet, just as we find it hard to comprehend that the average peasant in medieval England had to get by on six pennies a day, we now find it difficult to imagine an age lost in the myths of antiquity when there were scandals that cost American taxpayers a mere two billion dollars.
What a primitive society that must have been, barely advanced out of subsistence agriculture! Today, the government of the United States borrows $2 billion every 11 hours. We could have 220 Savings and Loan scandals for the cost of the Obama jobs bill. We could have 500 Savings and Loan scandals for the cost of one Obama stimulus package. We could have 850 Savings and Loan scandals for the cost of this year’s budget deficit. We could have vast armies of Charles Keating clones rampaging across the fruited plain, and they would barely make a dent in America’s finances.
Here’s another example of the kinds of dollars that are being thrown around now. The Obama administration’s $38.6 billion clean-technology program was supposed to “create or save 65,000 jobs.” Half the money has been spent, $17.2 billion, and we have 3,545 jobs to show for it. That works out to an impressive $4,851,904.09 per green job created. A world record! People say America can’t be number one anymore, but mister, we’re number one at this. The previous world record was held by Spanish taxpayers who subsidized every job on a solar panel assembly line to the tune of $800,000 per post. I’ll bet Spain thought that record was safe for a couple of years. Not so fast, amigos. The American taxpayers took it and sextupled it—not $800,000 per green job, but $4,800,000 per green job. I’d like to see those cheeseparing Spaniards reclaim that record any time soon!
Nobody spends like this. Nobody except us.
Nobody uses the T word—trillion—except us. It’s easy to look at debt-to-GDP ratios and conclude there’s nothing to worry about, but when you’re squandering $4.8 million per artificial non-job, it’s not the comparative numbers that will kill you. It’s the sheer dollar sums.
There were three great citadels of Western civilization: Rome, Athens, and Jerusalem. It took a fourth, London, Washington’s immediate predecessor as the dominant power, to disseminate the ideas of Athenian democracy and Roman law and the Hebrew Bible to the farthest corners of the earth. America has signs of decline that follow the examples of all four.
Rome once built aqueducts, and then it stopped building aqueducts, and then the aqueducts it had built started to decay. At the dawn of big government, in the 1930s, we built the Hoover Dam. Then we stopped building dams. In September, in the town of Port Angeles in the state of Washington, there commenced the destruction of two century-old dams in order to “liberate” the Elwha River. So now we’re dismantling dams.
You can see this at work—or rather, not at work—every time you’re on the isle of Manhattan. The Empire State Building was put up in one year and 45 days in the middle of a depression. Ground Zero is still a building site after a decade. 9/11 is something America’s enemies did to us. The 10-year hole in the ground is something we did to ourselves.
Now consider the people who went rampaging through the streets this summer in London. These are the children of dependency, people who have been marinated in stimulus within an inch of their lives, and they’re good for nothing but lobbing concrete through store windows so they can steal the latest models of electronic toys. They tore apart a city that, within living memory, governed a fifth of the earth’s surface and a quarter of its population. When you’re imperialist on that scale, you make a lot of mistakes. But nothing the British did to any of their subject peoples in far-flung corners of the globe compares with what they did post-imperially to their own population.
These are the great-grandchildren of a tiny island that stood alone against the Germans during the Blitz in that terrible year after the fall of France. If those Britons of mid-century were to come back, they would assume they had landed in some bizarro alternative universe—until, like Charlton Heston rounding the corner and seeing the shattered Statue of Liberty poking up out of the sands, they realize that the Planet of the Apes is their own. The evil of big government is not that it is a waste of money, but that it lays waste to people.
In Israel in the mid-1990s, an idea called normaliut seized hold of its populace. What it meant was that Israel wanted to live like any normal Western society. That was the real attraction of the 1993 Oslo peace accords. In a sense, it offered not merely a treaty negotiated in Oslo but the possibility to be Oslo, the chance for Israelis to live as Norwegians, to live as any other advanced Western nation. Instead, Israelis are on the military call-up list until 55—or about the age a Greek hairdresser gets to retire on full salary. Israel’s example suggests that if you think you’re an advanced Western democracy but don’t get to live like one, the conflict between what you are and what is required of you simply to avoid being obliterated from existence wears you down over time.
Israel implemented the terms of the Oslo accords, and in return Israelis got an Arafatist terror squat on their Eastern flank, suicide bombers on their buses, Iranian proxies to their north and west—and, in the wider world, isolation, demonization and delegitimization accompanied by a resurgent and ever more respectable anti-Semitism. The dream of normaliut didn’t work.
In 2008, the U.S. electorate also voted for normaliut. Americans voted to repudiate the previous years, dominated by terror attacks and Code Orange alerts and anthrax scares, and thankless semicolonial soldiering in corners of the map no one cared about. They were under the sway of a desperate hope that wars can simply come to an end when one side decides it’s all a bit of a bore. In reality-TV terms, the Great Satan wanted to vote itself off the island.
But as Israel understands by now, sometimes who you are is more important than anything you do. And sometimes who you are is an offense to those indifferent to anything you might or might not do. America will discover, as Israel did, that a one-way urge for normaliut will lead to a more dangerous world.
When you have government on the scale Europe enjoys and America has moved toward, there are hard choices to be made: as postwar Britain came to understand, you can have Scandinavian-style entitlements or a military of global reach, but you can’t have both. The current “supercommittee” or the next will find it easier to cut military commitments for which the public has little appetite than to shrink in any meaningful sense an ever more deeply ingrained transgenerational dependency culture.
And without a military of global reach, we will find the spaces in the Pax Americana left unoccupied, like an underwater house in a Nevada real-estate development, quickly filled by anti-American menaces. Last year, Die Welt reported that on a recent visit to Tehran, Hugo Chavez had signed an agreement to place Iranian missiles at a jointly operated military base in his satrapy, Venezuela. That’s how it begins. In the years ahead, distant enemies of this country will seed new proxies in Latin America as Iran did to Israel with Hamas and Hezbollah.
It starts with the money, but it doesn’t stop there: as all dominant nations learn, when money drains, power drains.
Nowhere can we see the effects of that truth better than in East Asia. China is already the world’s biggest manufacturer. It is already the world’s biggest exporter. It is the postcolonial patron of resource-rich Africa. It is the post-downturn patron of cash-strapped Mediterranean Europe. It is the biggest trading partner of India, Brazil, and other emerging powers. We should not be surprised that in such a world, getting on with America will matter less and less.
There have been moments, without question, when this has proved to be unexpectedly good news for us. Washington and its geriatric EU allies wanted the Copenhagen climate change deal in 2009, the biggest exercise in punitive liberalism ever mounted, an embryo exercise in global government. Brazil and India joined with China to block it. It’s a mark of the perversity of the age that it takes the Politburo to save global capitalism.
Sometimes, though, it’s not so good. In 2010, the Royal Australian Navy participated in its first naval exercises with Beijing. A few weeks later, Britain and Germany declined to support the United States in its efforts to get China to increase the value of its currency. Why would they? Even for America’s closest allies, the dominance of both the Pentagon and the almighty dollar has become conditional.
We will not like this post-American world, which will not even bring us normaliut. America will discover, as Britain has in twilight, that, long after imperial grandeur has faded, imperial resentments linger. We will not be left alone to fade into second-rate status. We will be taunted and humiliated and haunted and chased on the way down.
And yet, even in my deepest and most pessimistic vision, I can see a different future for the United States. For as the past few years have taught us, the great thing about the United States is that it is not Europe. When the economy headed south in 2008 and 2009, everywhere around the planet, people besieged their parliaments, asking them, “Why didn’t you, the government, do more for us?” They did it in Iceland. They did it in Bulgaria. They did it in Lithuania. They did it in Greece. They did it in the United Kingdom. They did it in France.
The United States is the only country in the world where a mass movement took to the streets in 2009 to say we could do just fine if you, the government, stayed the hell out of our pockets and the hell out of our lives. That fact, that populist refusal to be Europeanized, represents the best hope for this country. Those now-caricatured, much-maligned Tea Partiers moved the meter of public discourse significantly back in the direction of sanity. And that includes Barack Obama.
In 1975, Milton Friedman said this: “I do not believe that the solution to our problem is simply to elect the right people. The important thing is to establish a political climate of opinion which will make it politically profitable for the wrong people to do the right thing. Unless it is politically profitable for the wrong people to do the right thing, the right people will not do the right thing either, or if they try, they will shortly be out of office.”
Just so. Every time Barack Obama stands at his teleprompter and is forced to pretend that he’s interested in deficit reduction, we have taken a step toward that Milton Friedman reality. You have to create the conditions, as the Tea Party and the town hall meetings did, whereby the wrong people are forced to do the right things.
One cannot wait for the great leader to descend from the heavens to do the work for us. Every glamour boy, from Barack Obama to Mitt Romney to Rick Perry, proves to have feet of clay. It’s more important that tens of millions of ordinary citizens move the meter on public discourse and force the wrong people to do the right things.
But we don’t have much time to force them. If we don’t turn this thing around by mid-decade, if we let China become the dominant economic power in a world where the Iranians are nuclearizing and where Russia is making whatever mischief it can, we will see something new in world history. Something terrifying. This will not be like the transition from Britain to America, from a crucible of liberty to its greatest exponent. This will be the greatest step backwards for the civilization that built the modern world and spread its blessings across the map. There will be no new world order. There will be no world order.
The only way to prevent it is to act, and act quickly. Otherwise, it’s over. In 1969, in a poem about the end of the British empire called “Homage to a Government,” Philip Larkin wrote: “Next year we are to bring all the soldiers home/For lack of money…/We want the money for ourselves at home/Instead of working.” The narrator keeps saying that “this is all right,” but he concludes with this: “The statues will be standing in the same/Tree-muffled squares, and look nearly the same./Our children will not know it’s a different country./All we can hope to leave them now is money.”
We Americans can’t even hope that. And our children will know their reduced America was not the America that should have been theirs by right.
The Case for Optimism
By John Podhoretz
There is a growing propensity to place the blame for the disastrous fiscal and economic condition of the United States on the supposedly damaged spiritual condition of the American people. President Obama himself, inclined these days to blame the nation’s economic woes on his predecessor and on millionaires and billionaires, stepped on his own storyline recently when he told a Florida TV reporter that the American people had “gotten a little soft.” By saying this, he was echoing the view that something had gone wrong inside the body politic over the past decade or longer. The American people wanted benefits they didn’t want to pay for; they borrowed money they didn’t have; they refused to make tough choices. “The richest society the world has ever seen has grown rich by devising better and better ways to give people what they want,” Michael Lewis, the most influential financial journalist in America, writes in his new book Boomerang. “The boom in trading activity in individual stock portfolios; the spread of legalized gambling; the rise of drug and alcohol addiction—it is all of a piece.”
This secular-Calvinist argument has achieved standing because it seems to take seriously the most nagging aspect of the past 10 years: the role we should assign to personal responsibility when we attempt to understand what happened, how to keep it from happening again, and how to deal with the pressing matters ahead of us. It is also alluring because it spreads the blame far and wide, which seems appropriate for a cascading series of events that developed over decades and then all came crashing into each other.
No other theory of wrongdoing draws a straight line from the expansion of the Community Reinvestment Act in 1995, which led to the growth in subprime lending that helped create a new market in derivative products from that lending, to the seemingly unrelated pension and medical-care crises afflicting state and local governments now and that will soon overwhelm the federal government if the spending trajectory isn’t altered. The fault lies not in Democrats, nor in Republicans, not in unions or cosseted banks; the fault, dear Brutus, lies in ourselves. We are the constant: the overindulged, overindulgent, overweight American people, wanting things heedlessly, getting things hedonistically, and ruining things wantonly. We are $14 trillion in debt because we ate the debt.
It is a powerful argument. But it is wrong. And by understanding the ways in which it is wrong, we can see the contours of the case for optimism about the American future taking shape. Americans made entirely rational choices in the years leading up to the crisis in 2008; they responded properly to a series of incentives created over the preceding decades by politicians who meant well but were satisfying the interests not of the public as a whole but of constituent groups that stood to benefit far more than the ordinary voter from the creation of those incentives. Just as Americans responded to the realities of the time before the crisis, they will respond to the realities of the United States in which we now live. And the nation will come out the stronger for it.
When you are living in the heyday of a bubble—and we’ve been through two in the past 15 years, one involving the Internet and the other real estate—you are presented with two opposing realities. The first is that something miraculous is going on around you, something so transformative it seems almost magical. And you know it is real, because the miracle workers are everywhere you look, peering at you from the covers of magazines, confident and smiling and looking like a billion bucks. To become like them, you need to take the steps they took; and because they took those steps and benefited, following in their footsteps doesn’t really seem risky at all.
The former CEO of Citigroup, Chuck Prince, notoriously said, even as he saw the housing collapse coming: “As long as the music is playing, you’ve got to get up and dance.” That remark has been taken as proof of his bank’s malfeasance and that of all Wall Street firms, and there is some of that in it; but it also describes perfectly the psychological condition of almost everyone during bubble time. If you’re not in, you’re out. Better to be in than out. At least you have skin in the game.
The other, contradictory reality is this: you know (because how can you not) that what you are seeing is not real, that something akin to a violation of the elementary laws of physics is happening before your eyes. When a piece of property that seemed overvalued at $250,000 costs $500,000 three years later, but nothing else has changed much—the economy isn’t growing all that quickly, you’re not all that much better off than you were, and your friends aren’t either—the cognitive dissonance should be overwhelming.
You know all this, but the anesthetic effect of the bubble’s music means that you don’t feel it. And when a mortgage broker tells you that you can afford a $500,000 mortgage on a salary of $52,000, you know for sure that someone is getting screwed as part of the deal, since that’s what happens when deals are too good to be true. And you know, what’s more, that it might be you who will be getting screwed; but what was true for Citi is true for you as well. The music is playing. You’ve got to get up and dance.
The point is that ordinary people didn’t just get up and dance because it was fashionable. They were presented with powerful motivations to do so, mostly in the form of lowered interest rates that not only made borrowing cheap but also allowed them to cash out the equity they had invested in their own homes without having to sell. People drained their own future wealth by spending it in short-term ways, but given the fact that housing prices were rising, it appeared they would make up for the lost equity in increased value. People didn’t believe falsely, or greedily, or hungrily, that money was free. Money was free. And the incentive to participate in this free-money game was general. America’s politicians have recently found it convenient to rage at the mortgage brokers and banks that were handing out subprime loans so cavalierly, but they too—and those who borrowed from them—were also acting in accord with incentives created by the Federal Reserve and federal government policy.
Lending money, borrowing money, creating derivatives from the mortgages—these were all entirely understandable acts based on the realities of the time. The only true failure was believing the notion that somehow there was little or no risk involved. There is always risk in any financial transaction. But the anesthetic quality of Chuck Prince’s music dulled the anxiety that should accompany any kind of risk-taking—the very anxiety that functions as a counterweight to the thrill, the still small voice that warns against doing something that poses a long-term danger.
The music ended, and now we are in the fourth year of life in the crushing silence that followed. And the odd thing is this: the emotional psychology of the silence is very similar to the emotional psychology of the music. Almost no one is up on the dance floor, and in part for the same reasons that everyone was up on his feet as long as the tinny piano was playing. It is part of human nature to extrapolate from the condition of the present moment to the limitless future; just as we could not feel that there would be an end to the bubble, we cannot feel that there will come a time when we will rise from the mire of the Slough of Despond.
The image of the “slough of despond” comes from John Bunyan’s 17th-century allegory, The Pilgrim’s Progress. As Bunyan’s hero, Christian, travels toward his redemption, along the way he is trapped in a bog where “scum and filth that attends conviction for sin doth continually run.” Getting mired in it is an element in Christian’s redemption, because “as the sinner is awakened about his lost condition, there ariseth in his soul many fears, and doubts, and discouraging apprehensions, which all of them get together, and settle in this place.”
Christian’s “many fears, and doubts, and discouraging apprehensions” are mirrored in the way we think about the problems facing the United States. We fear we cannot make our way back, we doubt the resilience of our political system, and we have apprehensions about a future in which health-care entitlements will swallow our economy whole unless we change course. And when we think about what it will mean to change course, we are all discouraged. It can’t be done.
Of course it can.
The evidence that a change in trajectory is more than possible can be found in the American political system over the past few years. The electorate has demonstrated a remarkable, almost unprecedented taste for shifting direction. Control of the House of Representatives, held for 40 uninterrupted years by the Democrats and then for 12 uninterrupted years by the Republicans, has switched hands twice since 2006. Democrats won 32 seats in a landslide in 2006 that George W. Bush called a “thumpin’”; Republicans won 63 seats in a landslide in 2010 that Barack Obama called a “shellacking.”
Republicans won control of the Senate in 2002, lost it in 2006, went some ways to winning it back in 2010 and will probably do so in 2012. At the presidential level, the conservative Republican won 51 percent of the vote in 2004, and in 2008 the liberal Democrat won 53 percent. Independent voters, obviously the most likely to bounce between the parties, preferred Obama over John McCain by 17 points—and then, in 2010, preferred the Republicans to the Democrats by 8 points, a 25-point shift in only two years.
Voters were not being flighty or silly or stupid. These dramatic shifts were substantive, the result of inarguable policy failures. Bush’s failure to win in Iraq and to handle Hurricane Katrina competently caused the 2006 Congressional thumpin’; the Republican party’s failure to manage the financial meltdown competently led to Obama’s easy victory in 2008; Obama’s failure to generate the recovery he had promised with his stimulus and his swelling of government caused the 2010 shellacking. Voters took a chance that Obama could bring about the change he had promised; the bet didn’t pay off, to put it mildly; and they tore up their tickets. If the 2012 election follows the same form, and at this moment there is no reason to think the dynamic will be different from what it has been since 2006, it will not go well for him.
Somehow, we still think of the United States as a young country, and in comparison with the other great nations of the Earth it is; but its political and social system is now among the world’s oldest. Indeed, the amazing durability of the American system over 235 years is the primary reason for optimism about the American future. The glory of the United States does not reside in the untold wonders of its people—that is politician-speak—but rather in the flexibility of the American system. The nation has weathered crises far worse than the present crisis and come out the better for them eventually because the spine of the American system is at once sufficiently ironclad and sufficiently flexible to bend, but not break—the exception, of course, being the Civil War, when that spine was fractured and, at enormous cost, put into traction and forced back into alignment.
That system, the direct outgrowth of the Declaration of Independence and the Constitution, extends beyond the country’s political structures to an idea that courses through all its public and private institutions—the primacy of the individual. The centrality of the individual over the collective in the American system has not been cost-free for this nation and its people. Taken to extremes, it can destroy communities and induce a hunger for the material and a taste for the superficial that can corrode the character of the nation’s citizenry. Still, the American system has functioned because its revolutionary acknowledgement of the primacy of the individual also confers on the individual a sense of responsibility for himself, his loved ones, and his community that is unique in history.
Finding the balance between liberty and license has been a national challenge for centuries. So has finding the balance between the freedom of the individual and the common needs of the larger society. Everyone, from right to left, seems to feel that the nation’s equilibrium has been lost in the past few years, that we are out of balance politically, socially, fiscally, and culturally. This is what undergirds Michael Lewis’s contention that Americans are fat, greedy, sloppy addicts who got themselves into all kinds of trouble knowingly and without forethought.
But that impressionistic sense is not borne out by the realities of life in the United States. There are surprisingly few signs of social instability even as the financial crisis enters its fifth year, and even when, as one census report suggests, household incomes have fallen dramatically throughout the country. Crime continues to decline; divorce rates are not rising; dropout rates are not rising; hospitals are not reporting an increase in domestic violence.
The American people do not seem unhealthy (though they could stand to lose a few, as could I). The political system does. But not because debates are ugly, and not because it is too partisan, and not because some fools call Obama a Kenyan or because Joe Biden, also a fool, dubs the Tea Party “terrorists.” These are all transitory unpleasantnesses, and they have their parallels in every era. The political system is uniquely unhealthy at the present moment because of twin temptations to which politicians at every level and in both parties have succumbed—temptations whose consequences were not all that visible during the boom times but have been cast in stark relief by the bum times.
The first temptation has been to direct the behavior of the citizenry through the manipulation of the tax code, which (over time) creates a system of perverse incentives. It may seem, for example, that the mortgage-interest deduction is a vital tax break, but it is an accident of history, a holdover from a time before modern levels of federal taxation when all interest payments were deductible. Its continued existence has undeniably had an inflationary effect; the result of its disappearance would be a revaluing of all property downward in equal proportion. The transition would be complicated and confusing and would require careful management, but the end result would be a more honest valuation. The real benefit of the home-mortgage deduction over time has been to the industries that compose the real-estate sector, because having the government favor ownership over renting has created greater demand for home construction and home flipping than would otherwise be the case.
The moral argument for favoring home ownership is that owners are better citizens than renters, and therefore that it supports a greater common good. But we have now seen the damage that can be done by driving people into home ownership who had no business making—and might even have had little desire to make—that kind of long-term commitment. If ownership is a good in itself, people will pursue it without the incentive of the tax break. Indeed, even as the value of the deduction grew in the post–World War II period while income tax rates rose and more brackets were created, the level of home ownership remained startlingly constant, just over 60 percent of households. It was not until the push to broaden the numbers of borrowers began in the mid-1990s that the rate began to jump to nearly 70 percent.
The second temptation is to secure long-term control over public office by creating a constituency among public-sector workers through contracts that have, over time, made those in the employ of the government or those receiving retirement benefits from the government twice as wealthy as the people who are employing them. We are told, by Michael Lewis and others, that these problems are due to the fact that people want big government but do not want to pay for it. But what actual evidence, other than big government’s failure to shrink in size, is there for this contention? States and localities are beginning to go bankrupt due to pension obligations and absurdly generous deals with public-sector unions. When a firefighter in Vallejo, California (Lewis’s example), can join the ranks at 45 and retire at 50 with a full pension on the public dime—a case that sounds extreme but is replicated in many localities in many states—what benefit does the taxpayer get?
Of course, the most popular benefits are national ones—Social Security and Medicare. Medicare is far more dangerous to the public weal, especially with the baby boomers beginning to retire. And certainly the case for controlling the costs of Medicare (and to a lesser degree, Social Security) is vastly tougher than the case against the public-sector workforce. But the unjust transfer of wealth from the young to the old—something that has been an impossible subject to raise in political life over the past several decades—will be an inescapable reality in very short order. If it is not halted or redirected, it will, as Yuval Levin has put it simply, “leave us with a national debt larger than our economy in just a decade and twice as large in the 2030s.”
If the prospect of being hanged in a fortnight wonderfully concentrates the mind, as Dr. Johnson said, the fortnight is about to begin. And for the first time, in 2011, politicians have begun to address the crisis seriously. House Republicans passed Rep. Paul Ryan’s revolutionary budget outline, which eliminates the Medicare entitlement in favor of a voucher system. And even Barack Obama is using the term “tax reform,” though he surely doesn’t mean by it what it really means—a radical simplification of the tax code that largely reverses the long trend toward using it as a means of designing a social order in keeping with the wants and interests of politicians.
The American people are already witness to one possible future now playing itself out in the implosion of Europe. That ongoing nightmare is providing hard evidence to anyone with eyes to see that the United States must take a different path in relation to government spending and conduct before it is too late. That is true not only of the entitlements but also the incentives that dominate the tax code, including the home-mortgage deduction; right and left are finding surprising common ground in the notion that these incentives are dangerous distortions, little more than corporate welfare that supports banks and energy producers and home builders as well. Reducing or eliminating them is the work of the next decade—complicated and grueling work that will require a complete restructuring of the tax code and an alteration in the very notion of a government “benefit,” how it is received, and how it is paid out.
The battles over all this will, to some extent, dominate our politics henceforward. We got a glimpse of the nature of the fight over the debt ceiling in July, and the 2012 election will pivot on it. I say “to some extent” because unexpected events, probably in the realm of foreign policy, will surely come along to complicate the picture. But when it comes to matters of their own fiscal health and the country’s, we can be confident in this: the American people have made rational choices in the past, and there is no reason to believe they will cease making rational choices in the future. And you don’t have to be all that much of an optimist to see that the choice between national suicide and national salvation isn’t really all that difficult.
It can be said that the Book of Samuel launched the American Revolution. Though antagonistic to traditional faith, Thomas Paine understood that it was not Montesquieu, or Locke, who was inscribed on the hearts of his fellow Americans. Paine’s pamphlet Common Sense is a biblical argument against British monarchy, drawing largely on the text of Samuel.
Today, of course, universal biblical literacy no longer exists in America, and sophisticated arguments from Scripture are all too rare. It is therefore all the more distressing when public intellectuals, academics, or religious leaders engage in clumsy acts of exegesis and political argumentation by comparing characters in the Book of Samuel to modern political leaders. The most common victim of this tendency has been the central character in the Book of Samuel: King David.
Most recently, this tendency was made manifest in the writings of Dennis Prager. In a recent defense of his own praise of President Trump, Prager wrote that “as a religious Jew, I learned from the Bible that God himself chose morally compromised individuals to accomplish some greater good. Think of King David, who had a man killed in order to cover up the adultery he committed with the man’s wife.” Prager similarly argued that those who refuse to vote for a politician whose positions are correct but whose personal life is immoral “must think God was pretty flawed in voting for King David.”
Prager’s invocation of King David was presaged on the left two decades ago. The records of the Clinton Presidential Library reveal that at the height of the Lewinsky scandal, an email from Dartmouth professor Susannah Heschel made its way into the inbox of an administration policy adviser with a similar comparison: “From the perspective of Jewish history, we have to ask how Jews can condemn President Clinton’s behavior as immoral, when we exalt King David? King David had Batsheva’s husband, Uriah, murdered. While David was condemned and punished, he was never thrown off the throne of Israel. On the contrary, he is exalted in our Jewish memory as the unifier of Israel.”
One can make the case for supporting politicians who have significant moral flaws. Indeed, America’s political system is founded on an awareness of the profound tendency to sinfulness not only of its citizens but also of its statesmen. “If men were angels, no government would be necessary,” James Madison informs us in the Federalist. At the same time, anyone who compares King David to the flawed leaders of our own age reveals a profound misunderstanding of the essential nature of David’s greatness. David was not chosen by God despite his moral failings; rather, David’s failings are the lens that reveal his true greatness. It is in the wake of his sins that David emerges as the paradigmatic penitent, whose quest for atonement is utterly unlike that of any other character in the Bible, and perhaps in the history of the world.
While the precise nature of David’s sins is debated in the Talmud, there is no question that they are profound. Yet it is in comparing David to other faltering figures—in the Bible or today—that the comparison falls flat. This point is stressed by the very Jewish tradition in whose name Prager claimed to speak.
It is the rabbis who note that David’s predecessor, Saul, lost the kingship when he failed to fulfill God’s command to destroy the egregiously evil nation of Amalek, whereas David commits more severe sins and yet remains king. The answer, the rabbis suggest, lies not in the sin itself but in the response. Saul, when confronted by the prophet Samuel, offers obfuscations and defensiveness. David, meanwhile, is similarly confronted by the prophet Nathan: “Thou hast killed Uriah the Hittite with the sword, and hast taken his wife to be thy wife, and hast slain him with the sword of the children of Ammon.” David’s immediate response is clear and complete contrition: “I have sinned against the Lord.” David’s penitence, Jewish tradition suggests, sets him apart from Saul. Soon after, David gave voice to what was in his heart at the moment, and gave the world one of the most stirring of the Psalms:
Have mercy upon me, O God, according to thy lovingkindness: according unto the multitude of thy tender mercies blot out my transgressions.
Wash me thoroughly from mine iniquity, and cleanse me from my sin. For I acknowledge my transgressions: and my sin is ever before me.
. . . Deliver me from bloodguiltiness, O God, thou God of my salvation: and my tongue shall sing aloud of thy righteousness.
O Lord, open thou my lips; and my mouth shall shew forth thy praise.
For thou desirest not sacrifice; else would I give it: thou delightest not in burnt offering.
The sacrifices of God are a broken spirit: a broken and a contrite heart, O God, thou wilt not despise.
The tendency to link David to our current age lies in the fact that we know more about David than any other biblical figure. The author Thomas Cahill has noted that in a certain literary sense, David is the only biblical figure that is like us at all. Prior to the humanist autobiographies of the Renaissance, he notes, “we can count only a few isolated instances of this use of ‘I’ to mean the interior self. But David’s psalms are full of I’s.” In David’s Psalms, Cahill writes, we “find a unique early roadmap to the inner spirit—previously mute—of ancient humanity.”
At the same time, a study of the Book of Samuel and of the Psalms reveals how utterly incomparable David is to anyone alive today. Haym Soloveitchik has noted that even the most observant of Jews today fail to feel a constant intimacy with God that the simplest Jew of the premodern age might have felt, that “while there are always those whose spirituality is one apart from that of their time, nevertheless I think it safe to say that the perception of God as a daily, natural force is no longer present to a significant degree in any sector of modern Jewry, even the most religious.” Yet for David, such intimacy with the divine was central to his existence, and the Book of Samuel and the Psalms are an eternal testament to this fact. This is why simple comparisons between David and ourselves, as tempting as they are, must be resisted. David Wolpe, in his book about David, attempts to make the case as to why King David’s life speaks to us today: “So versatile and enduring is David in our culture that rare is the week that passes without some public allusion to his life…We need to understand David better because we use his life to comprehend our own.”
The truth may be the opposite. We need to understand David better because we can use his life to comprehend what we are missing, and how utterly unlike his own our lives are. For even the most religious among us have lost the profound faith and intimacy with God that David had. It is therefore incorrect to assume that because of David’s flaws it would have been, as Amos Oz has written, “fitting for him to reign in Tel Aviv.” The modern State of Israel was blessed with brilliant leaders, but to which of its modern warriors or statesmen should David be compared? To Ben-Gurion, who stripped any explicit invocation of the Divine from Israel’s Declaration of Independence? To Moshe Dayan, who oversaw the reconquest of Jerusalem, and then immediately handed back the Temple Mount, the locus of King David’s dreams and desires, to the administration of the enemies of Israel? David’s complex humanity inspires comparison to modern figures, but his faith, contrition, and repentance—which lie at the heart of his story and success—defy any such engagement.
And so, to those who seek comparisons to modern leaders from the Bible, the best rule may be: Leave King David out of it.
Three attacks in Britain highlight the West’s inability to see the threat clearly
This lack of seriousness manifests itself in several ways. It’s perhaps most obvious in the failure to reform Britain’s chaotic immigration and dysfunctional asylum systems. But it’s also abundantly clear from the grotesque underfunding and under-resourcing of domestic intelligence. In MI5, Britain has an internal security service that is simply too small to do its job effectively, even if it were not handicapped by an institutional culture that can seem willfully blind to the ideological roots of the current terrorism problem.
In 2009, Jonathan Evans, then head of MI5, confessed at a parliamentary hearing about the London bus and subway attacks of 2005 that his organization only had sufficient resources to “hit the crocodiles close to the boat.” It was an extraordinary metaphor to use, not least because of the impression of relative impotence that it conveys. MI5 had by then doubled in size since 2001, but it still boasted a staff of only 3,500. Today it’s said to employ between 4,000 and 5,000, an astonishingly, even laughably, small number given a UK population of 65 million and the scale of the security challenges Britain now faces. (To be fair, the major British police forces all have intelligence units devoted to terrorism, and the UK government’s overall counterterrorism strategy involves a great many people, including social workers and schoolteachers.)
You can also see that unseriousness at work in the abject failure to coerce Britain’s often remarkably sedentary police officers out of their cars and stations and back onto the streets. Most of Britain’s big-city police forces have adopted a reactive model of policing (consciously rejecting both the New York Compstat model and British “bobby on the beat” traditions) that cripples intelligence-gathering and frustrates good community relations.
If that weren’t bad enough, Britain’s judiciary is led by jurists who came of age in the 1960s, and who have been inclined since 2001 to treat terrorism as an ordinary criminal problem being exploited by malign officials and politicians to make assaults on individual rights and to take part in “illegal” foreign wars. It has long been almost impossible to extradite ISIS or al-Qaeda–linked Islamists from the UK. This is partly because today’s English judges believe that few if any foreign countries—apart from perhaps Sweden and Norway—are likely to give terrorist suspects a fair trial, or able to guarantee that such suspects will be spared torture and abuse.
We have a progressive metropolitan media elite whose primary, reflexive response to every terrorist attack, even before the blood on the pavement is dry, is to express worry about an imminent violent anti-Muslim “backlash” on the part of a presumptively bigoted and ignorant indigenous working class. Never mind that no such “backlash” has yet occurred, not even when the young off-duty soldier Lee Rigby was hacked to death in broad daylight on a South London street in 2013.
Another sign of this lack of seriousness is the choice by successive British governments to deal with the problem of internal terrorism with marketing and “branding.” You can see this in the catchy consultant-created acronyms and pseudo-strategies that are deployed in place of considered thought and action. After every atrocity, the prime minister calls a meeting of the COBRA unit—an acronym that merely stands for Cabinet Office Briefing Room A but sounds like a secret organization of government superheroes. The government’s counterterrorism strategy is called CONTEST, which has four “work streams”: “Prevent,” “Pursue,” “Protect,” and “Prepare.”
Perhaps the ultimate sign of unseriousness is the fact that police, politicians, and government officials have all displayed more fear of being seen as “Islamophobic” than of any carnage that actual terror attacks might cause. Few are aware that this short-term, cowardly, and trivial tendency may ultimately foment genuine, dangerous popular Islamophobia, especially if attacks continue.
Recently, three murderous Islamist terror attacks in the UK took place in less than a month. The first and third were relatively primitive improvised attacks using vehicles and/or knives. The second was a suicide bombing that probably required relatively sophisticated planning, technological know-how, and the assistance of a terrorist infrastructure. As they were the first such attacks in the UK, the vehicle and knife killings came as a particular shock to the British press, public, and political class, despite the fact that non-explosive and non-firearm terror attacks have become common in Europe and are almost routine in Israel.
The success of all three plots indicates troubling problems in British law-enforcement practice and culture, quite apart from any other failings on the parts of the state in charge of intelligence, border control, and the prevention of radicalization. At the time of writing, the British media have been full of encomia to police courage and skill, not least because it took “only” eight minutes for an armed Metropolitan Police team to respond to and confront the bloody mayhem being wrought by the three Islamist terrorists (who had ploughed their rented van into people on London Bridge before jumping out to attack passersby with knives). But the difficult truth is that all three attacks would be much harder to pull off in Manhattan, not just because all NYPD cops are armed, but also because there are always police officers visibly on patrol at the New York equivalents of London’s Borough Market on a Saturday night. By contrast, London’s Metropolitan police is a largely vehicle-borne, reactive force; rather than use a physical presence to deter crime and terrorism, it chooses to monitor closed-circuit street cameras and social-media postings.
Since the attacks in London and Manchester, we have learned that several of the perpetrators were “known” to the police and security agencies that are tasked with monitoring potential terror threats. That these individuals were nevertheless able to carry out their atrocities is evidence that the monitoring regime is insufficient.
It also seems clear that there were failures on the part of those institutions that come under the leadership of the Home Office and are supposed to be in charge of the UK’s border, migration, and asylum systems. Journalists and think tanks like Policy Exchange and Migration Watch have for years pointed out that these systems are “unfit for purpose,” but successive governments have done little to take responsible control of Britain’s borders. When she was home secretary, Prime Minister Theresa May did little more than jazz up the name, logo, and uniforms of what is now called the “Border Force,” and she notably failed to put in place long-promised passport checks for people flying out of the country. This dereliction means that it is impossible for the British authorities to know who has overstayed a visa or whether individuals who have been denied asylum have actually left the country.
It seems astonishing that Youssef Zaghba, one of the three London Bridge attackers, was allowed back into the country. The Moroccan-born Italian citizen (his mother is Italian) had been arrested by Italian police in Bologna, apparently on his way to Syria via Istanbul to join ISIS. When questioned by the Italians about the ISIS decapitation videos on his mobile phone, he declared that he was “going to be a terrorist.” The Italians lacked sufficient evidence to charge him with a crime but put him under 24-hour surveillance, and when he traveled to London, they passed on information about him to MI5. Nevertheless, he was not stopped or questioned on arrival and had not become one of the 3,000 official terrorism “subjects of interest” for MI5 or the police when he carried out his attack. One reason Zaghba was not questioned on arrival may have been that he used one of the new self-service passport machines installed in UK airports in place of human staff after May’s cuts to the border force. Apparently, the machines are not yet linked to any government watch lists, thanks to the general chaos and ineptitude of the Home Office’s efforts to use information technology.
The presence in the country of Zaghba’s accomplice Rachid Redouane is also an indictment of the incompetence and disorganization of the UK’s border and migration authorities. He had been refused asylum in 2009, but as is so often the case, Britain’s Home Office never got around to removing him. Three years later, he married a British woman and was therefore able to stay in the UK.
But it is the failure of the authorities to monitor ringleader Khuram Butt that is the most baffling. He was a known and open associate of Anjem Choudary, Britain’s most notorious terrorist supporter, ideologue, and recruiter (he was finally imprisoned in 2016 after 15 years of campaigning on behalf of al-Qaeda and ISIS). Butt even appeared in a 2016 TV documentary about ISIS supporters called The Jihadist Next Door. In the same year, he assaulted a moderate imam at a public festival, after calling him a “murtad” or apostate. The imam reported the incident to the police—who took six months to track Butt down and then let him off with a caution. It is not clear if Butt was one of the 3,000 “subjects of interest” or the additional 20,000 former subjects of interest who continue to be the subject of limited monitoring. If he was not, it raises the question of what a person has to do to get British security services to take him seriously as a terrorist threat; if he was in fact on the list of “subjects of interest,” one has to wonder if being so designated is any barrier at all to carrying out terrorist atrocities. It’s worth remembering, as few do here in the UK, that terrorists who carried out previous attacks were also known to the police and security services and nevertheless enjoyed sufficient liberty to go at it again.
But the most important reason for the British state’s ineffectiveness in monitoring terror threats, which May addressed immediately after the London Bridge attack, is a deeply rooted institutional refusal to deal with or accept the key role played by Islamist ideology. For more than 15 years, the security services and police have chosen to take note only of people and bodies that explicitly espouse terrorist violence or have contacts with known terrorist groups. The fact that a person, school, imam, or mosque endorses the establishment of a caliphate, the stoning of adulterers, or the murder of apostates has not been considered a reason to monitor them.
This seems to be why Salman Abedi, the Manchester Arena suicide bomber, was not being watched by the authorities as a terror risk, even though he had punched a girl in the face for wearing a short skirt while at university, had attended the Muslim Brotherhood-controlled Didsbury Mosque, was the son of a Libyan man whose militia is banned in the UK, had himself fought against the Qaddafi regime in Libya, had adopted the Islamist clothing style (trousers worn above the ankle, beard but no moustache), was part of a druggy gang subculture that often feeds individuals into Islamist terrorism, and had been banned from a mosque after confronting an imam who had criticized ISIS.
It was telling that the day after the Manchester Arena suicide-bomb attack, you could hear a security official informing radio and TV audiences of the BBC’s flagship morning-radio news show that it’s almost impossible to predict and stop such attacks because the perpetrators “don’t care who they kill.” They just want to kill as many people as possible, he said.
Surely, anyone with even a basic familiarity with Islamist terror attacks over the last 15 or so years and a nodding acquaintance with Islamist ideology could see that the terrorist hadn’t just chosen the Ariana Grande concert in Manchester Arena because a lot of random people would be crowded into a conveniently small area. Since the Bali bombings of 2002, nightclubs, discotheques, and pop concerts attended by shameless unveiled women and girls have been routinely targeted by fundamentalist terrorists, including in Britain. Among the worrying things about the opinion offered on the radio show was that it suggests that even in the wake of the horrific Bataclan attack in Paris during a November 2015 concert, British authorities may not have been keeping an appropriately protective eye on music venues and other places where our young people hang out in their decadent Western way. Such dereliction would make perfect sense given the resistance on the part of the British security establishment to examining, confronting, or extrapolating from Islamist ideology.
The same phenomenon may explain why authorities did not follow up on community complaints about Abedi. All too often when people living in Britain’s many and diverse Muslim communities want to report suspicious behavior, they have to do so through offices and organizations set up and paid for by the authorities as part of the overall “Prevent” strategy. Although criticized by the left as “Islamophobic” and inherently stigmatizing, Prevent has often brought the government into cooperative relationships with organizations even further to the Islamic right than the Muslim Brotherhood. This means that if you are a relatively secular Libyan émigré who wants to report an Abedi and you go to your local police station, you are likely to find yourself speaking to a bearded Islamist.
From its outset in 2003, the Prevent strategy was flawed. Its practitioners, in their zeal to find and fund key allies in “the Muslim community” (as if there were just one), routinely made alliances with self-appointed community leaders who represented the most extreme and intolerant tendencies in British Islam. Both the Home Office and MI5 seemed to believe that only radical Muslims were “authentic” and would therefore be able to influence young potential terrorists. Moderate, modern, liberal Muslims who are arguably more representative of British Islam as a whole (not to mention sundry Shiites, Sufis, Ahmadis, and Ismailis) have too often found it hard to get a hearing.
Sunni organizations that openly supported suicide-bomb attacks in Israel and India and that justified attacks on British troops in Iraq and Afghanistan nevertheless received government subsidies as part of Prevent. The hope was that in return, they would alert the authorities if they knew of individuals planning attacks in the UK itself.
It was a gamble reminiscent of British colonial practice in India’s northwest frontier and elsewhere. Not only were there financial inducements in return for grudging cooperation; the British state offered other, symbolically powerful concessions. These included turning a blind eye to certain crimes and antisocial practices such as female genital mutilation (there have been no successful prosecutions relating to the practice, though thousands of cases are reported every year), forced marriage, child marriage, polygamy, the mass removal of girls from school soon after they reach puberty, and the epidemic of racially and religiously motivated “grooming” rapes in cities like Rotherham. (At the same time, foreign jihadists—including men wanted for crimes in Algeria and France—were allowed to remain in the UK as long as their plots did not include British targets.)
This approach, simultaneously cynical and naive, was never as successful as its proponents hoped. Again and again, Muslim chaplains approved to work in prisons and other institutions have turned out to be Islamist extremists whose words have inspired inmates to join terrorist organizations.
Much to his credit, former Prime Minister David Cameron fought hard to change this approach, even though it meant difficult confrontations with his home secretary (Theresa May), as well as police and the intelligence agencies. However, Cameron’s efforts had little effect on the permanent personnel carrying out the Prevent strategy, and cooperation with Islamist but currently nonviolent organizations remains the default setting within the institutions on which the United Kingdom depends for security.
The failure to understand the role of ideology is one of imagination as well as education. Very few of those who make government policy or write about home-grown terrorism seem able to escape the limitations of what used to be called “bourgeois” experience. They assume that anyone willing to become an Islamist terrorist must perforce be materially deprived, or traumatized by the experience of prejudice, or provoked to murderous fury by oppression abroad. They have no sense of the emotional and psychic benefits of joining a secret terror outfit: the excitement and glamor of becoming a kind of Islamic James Bond, bravely defying the forces of an entire modern state. They don’t get how satisfying or empowering the vengeful misogyny of ISIS-style fundamentalism might seem for geeky, frustrated young men. Nor can they appreciate the appeal to the adolescent mind of apocalyptic fantasies of power and sacrifice (mainstream British society does not have much room for warrior dreams, given that its tone is set by liberal pacifists). Finally, they have no sense of why the discipline and self-discipline of fundamentalist Islam might appeal so strongly to incarcerated lumpen youth who have never experienced boundaries or real belonging. Their understanding is an understanding only of themselves, not of the people who want to kill them.
Review of 'White Working Class' by Joan C. Williams
Williams is a prominent feminist legal scholar with degrees from Yale, MIT, and Harvard. Unbending Gender, her best-known book, is the sort of tract you’d expect to find at an intersectionality conference or a Portlandia bookstore. This is why her insightful, empathic book comes as such a surprise.
Books and essays on the topic have accumulated into a highly visible genre since Donald Trump came on the American political scene; J.D. Vance’s Hillbilly Elegy planted itself at the top of bestseller lists almost a year ago and still isn’t budging. As with Vance, Williams’s interest in the topic is personal. She fell “madly in love with” and eventually married a Harvard Law School graduate who had grown up in an Italian neighborhood in pre-gentrification Brooklyn. Williams, on the other hand, is a “silver-spoon girl.” Her father’s family was moneyed, and her maternal grandfather was a prominent Reform rabbi.
The author’s affection for her “class-migrant” spouse and respect for his family’s hardships—“My father-in-law grew up on blood soup,” she announces in her opening sentence—add considerable warmth to what is at bottom a political pamphlet. Williams believes that elite condescension and “cluelessness” played a big role in Trump’s unexpected and dreaded victory. Enlightening her fellow elites is essential to the task of returning Trump voters to the progressive fold where, she is sure, they rightfully belong.
Liberals were not always so dense about the working class, Williams observes. WPA murals and movies like On the Waterfront showed genuine fellow feeling for the proletariat. In the 1970s, however, the liberal mood changed. Educated boomers shifted their attention to “issues of peace, equal rights, and environmentalism.” Instead of feeling the pain of Arthur Miller and John Steinbeck characters, they began sneering at the less enlightened. These days, she notes, elite sympathies are limited to the poor, people of color (POC), and the LGBTQ population. Despite clear evidence of suffering—stagnant wages, disappearing manufacturing jobs, declining health and well-being—the working class gets only fly-over snobbery at best and, more often, outright loathing.
Williams divides her chapters into a series of explainers to questions she has heard from her clueless friends and colleagues: “Why Does the Working Class Resent the Poor?” “Why Does the Working Class Resent Professionals but Admire the Rich?” “Why Doesn’t the Working Class Just Move to Where the Jobs Are?” “Is the Working Class Just Racist?” She weaves her answers into a compelling picture of a way of life and worldview foreign to her targeted readers. Working-class Americans have had to struggle for whatever stability and comfort they have, she explains. Clocking in for midnight shifts year after year, enduring capricious bosses, plant closures, and layoffs, they’re reliant on tag-team parenting and stressed-out relatives for child care. The campus go-to word “privileged” seems exactly wrong.
Proud of their own self-sufficiency and success, however modest, they don’t begrudge the self-made rich. It’s snooty professionals and the dysfunctional poor who get their goat. From their vantage point, subsidizing the day care for a welfare mother when they themselves struggle to manage care on their own dime mocks both their hard work and their beliefs. And since, unlike most professors, they shop in the same stores as the dependent poor, they’ve seen that some of them game the system. Of course that stings.
White Working Class is especially good at evoking the alternate economic and mental universe experienced by Professional and Managerial Elites, or “PMEs.” PMEs see their non-judgment of the poor, especially those who are “POC,” as a mark of their mature understanding that we live in an unjust, racist system whose victims require compassion regardless of whether they have committed any crime. At any rate, their passions lie elsewhere. They define themselves through their jobs and professional achievements, hence their obsession with glass ceilings.
Williams tells the story of her husband’s faux pas at a high-school reunion. Forgetting his roots for a moment, the Ivy League–educated lawyer asked one of his Brooklyn classmates a question that is the go-to opener in elite social settings: “What do you do?” Angered by what must have seemed like deliberate humiliation by this prodigal son, the man hissed: “I sell toilets.”
Instead of stability and backyard barbecues with family and long-time neighbors and maybe the occasional Olive Garden celebration, PMEs are enamored of novelty: new foods, new restaurants, new friends, new experiences. The working class chooses to spend its leisure in comfortable familiarity; for the elite, social life is a lot like networking. Members of the professional class may view themselves as sophisticated or cosmopolitan, but, Williams shows, to the blue-collar worker their glad-handing is closer to phony social climbing and their abstract, knowledge-economy jobs more like self-important pencil-pushing.
White Working Class has a number of proposals for creating the progressive future Williams would like to see. She wants to get rid of college-for-all dogma and improve training for middle-skill jobs. She envisions a working-class coalition of all races and ethnicities bolstered by civics education with a “distinctly celebratory view of American institutions.” In a saner political environment, some of this would make sense; indeed, she echoes some of Marco Rubio’s 2016 campaign themes. It’s little wonder White Working Class has already gotten the stink eye from liberal reviewers for its purported sympathies for racists.
Alas, impressive as Williams’s insights are, they do not always allow her to transcend her own class loyalties. Unsurprisingly, her own PME biases mostly come to light in her chapters on race and gender. She reduces immigration concerns to “fear of brown people,” even as she notes elsewhere that a quarter of Latinos also favor a wall at the southern border. This contrasts startlingly with her succinct observation that “if you don’t want to drive working-class whites to be attracted to the likes of Limbaugh, stop insulting them.” In one particularly obtuse moment, she asserts: “Because I study social inequality, I know that even Malia and Sasha Obama will be disadvantaged by race, advantaged as they are by class.” She relies on dubious gender theories to explain why the majority of white women voted for Trump rather than for his unfairly maligned opponent. That Hillary Clinton epitomized every elite quality Williams has just spent more than a hundred pages explicating escapes her notice. Williams’s own reflexive retreat into identity politics is itself emblematic of our toxic divisions, but it does not invalidate the power of this astute book.
When music could not transcend evil
The story of European classical music under the Third Reich is one of the most squalid chapters in the annals of Western culture, a chronicle of collective complaisance that all but beggars belief. Without exception, all of the well-known musicians who left Germany and Austria in protest when Hitler came to power in 1933 were either Jewish or, like the violinist Adolf Busch, Rudolf Serkin’s father-in-law, had close family ties to Jews. Moreover, most of the small number of non-Jewish musicians who emigrated later on, such as Paul Hindemith and Lotte Lehmann, are now known to have done so not out of principle but because they were unable to make satisfactory accommodations with the Nazis. Everyone else—including Karl Böhm, Wilhelm Furtwängler, Walter Gieseking, Herbert von Karajan, and Richard Strauss—stayed behind and served the Reich.
The Berlin and Vienna Philharmonics, then as now Europe’s two greatest orchestras, were just as willing to do business with Hitler and his henchmen, firing their Jewish members and ceasing to perform the music of Jewish composers. Even after the war, the Vienna Philharmonic was notorious for being the most anti-Semitic orchestra in Europe, and it was well known in the music business (though never publicly discussed) that Helmut Wobisch, the orchestra’s principal trumpeter and its executive director from 1953 to 1968, had been both a member of the SS and a Gestapo spy.
The management of the Berlin Philharmonic made no attempt to cover up the orchestra’s close relationship with the Third Reich, no doubt because the Nazi ties of Karajan, who was its music director from 1956 until shortly before his death in 1989, were a matter of public record. Yet it was not until 2007 that a full-length study of its wartime activities, Misha Aster’s The Reich’s Orchestra: The Berlin Philharmonic 1933–1945, was finally published. As for the Vienna Philharmonic, its managers long sought to quash all discussion of the orchestra’s Nazi past, steadfastly refusing to open its institutional archives to scholars until 2008, when Fritz Trümpi, an Austrian scholar, was given access to its records. Five years later, the Viennese, belatedly following the precedent of the Berlin Philharmonic, added a lengthy section to their website called “The Vienna Philharmonic Under National Socialism (1938–1945),” in which the damning findings of Trümpi and two other independent scholars were made available to the public.
Now Trümpi has published The Political Orchestra: The Vienna and Berlin Philharmonics During the Third Reich, in which he tells how they came to terms with Nazism, supplying pre- and postwar historical context for their transgressions.1 Written in a stiff mixture of academic jargon and translatorese, The Political Orchestra is ungratifying to read. Even so, the tale that it tells is both compelling and disturbing, especially to anyone who clings to the belief that high art is ennobling to the spirit.
Unlike the Vienna Philharmonic, which has always doubled as the pit orchestra for the Vienna State Opera, the Berlin Philharmonic started life in 1882 as a fully independent, self-governing entity. Initially unsubsidized by the state, it kept itself afloat by playing a grueling schedule of performances, including “popular” non-subscription concerts for which modest ticket prices were levied. In addition, the orchestra made records and toured internationally at a time when neither was common.
These activities made it possible for the Berlin Philharmonic to develop into an internationally renowned ensemble whose fabled collective virtuosity was widely seen as a symbol of German musical distinction. Furtwängler, the orchestra’s principal conductor, declared in 1932 that the German music in which it specialized was “one of the very few things that actually contribute to elevating [German] prestige.” Hence, he explained, the need for state subsidy, which he saw as “a matter of [national] prestige, that is, to some extent a requirement of national prudence.” By then, though, the orchestra was already heavily subsidized by the city of Berlin, thus paving the way for its takeover by the Nazis.
The Vienna Philharmonic, by contrast, had always been subsidized. Founded in 1842 when the orchestra of what was then the Vienna Court Opera decided to give symphonic concerts on its own, it performed the Austro-German classics for an elite cadre of longtime subscribers. By restricting membership to local players and their pupils, the orchestra cultivated what Furtwängler, who spent as much time conducting in Vienna as in Berlin, described as a “homogeneous and distinct tone quality.” At once dark and sweet, it was as instantly identifiable—and as characteristically Viennese—as the strong, spicy bouquet of a Gewürztraminer wine.
Unlike the Berlin Philharmonic, which played for whoever would pay the tab and programmed new music as a matter of policy, the Vienna Philharmonic chose not to diversify either its haute-bourgeois audience or its conservative repertoire. Instead, it played Beethoven, Brahms, Haydn, Mozart, and Schubert (and, later, Bruckner and Richard Strauss) in Vienna for the Viennese. Starting in the ’20s, the orchestra’s recordings consolidated its reputation as one of the world’s foremost instrumental ensembles, but its internal culture remained proudly insular.
What the two orchestras had in common was a nationalistic ethos, a belief in the superiority of Austro-German musical culture that approached triumphalism. One of the darkest manifestations of this ethos was their shared reluctance to hire Jews. The Berlin Philharmonic employed only four Jewish players in 1933, while the Vienna Philharmonic contained only 11 Jews at the time of the Anschluss, none of whom was hired after 1920. To be sure, such popular Jewish conductors as Otto Klemperer and Bruno Walter continued to work in Vienna for as long as they could. Two months before the Anschluss, Walter led and recorded a performance of the Ninth Symphony of Gustav Mahler, his musical mentor and fellow Jew, who from 1897 to 1907 had been the director of the Vienna Court Opera and one of the Philharmonic’s most admired conductors. But many members of both orchestras were open supporters of fascism, and not a few were anti-Semites who ardently backed Hitler. By 1942, 62 of the 123 active members of the Vienna Philharmonic were Nazi party members.
The admiration that Austro-German classical musicians had for Hitler is not entirely surprising since he was a well-informed music lover who declared in 1938 that “Germany has become the guardian of European culture and civilization.” He made the support of German art, music very much included, a key part of his political program. Accordingly, the Berlin Philharmonic was placed under the direct supervision of Joseph Goebbels, who ensured the cooperation of its members by repeatedly raising their salaries, exempting them from military service, and guaranteeing their old-age pensions. But there had never been any serious question of protest, any more than there would be among the members of the Vienna Philharmonic when the Nazis gobbled up Austria. Save for the Jews and one or two non-Jewish players who were fired for reasons of internal politics, the musicians went along unhesitatingly with Hitler’s desires.
With what did they go along? Above all, they agreed to the scrubbing of Jewish music from their programs and the dismissal of their Jewish colleagues. Some Jewish players managed to escape with their lives, but seven of the Vienna Philharmonic’s 11 Jews were either murdered by the Nazis or died as a direct result of official persecution. In addition, both orchestras performed regularly at official government functions and made tours and other public appearances for propaganda purposes, and both were treated as gems in the diadem of Nazi culture.
As for Furtwängler, the most prominent of the Austro-German orchestral conductors who served the Reich, his relationship to Nazism continues to be debated to this day. He had initially resisted the firing of the Berlin Philharmonic’s Jewish members and protected them for as long as he could. But he was also a committed (if woolly-minded) nationalist who believed that German music had “a different meaning for us Germans than for other nations” and notoriously declared in an open letter to Goebbels that “we all welcome with great joy and gratitude . . . the restoration of our national honor.” Thereafter he cooperated with the Nazis, by all accounts uncomfortably but—it must be said—willingly. A monster of egotism, he saw himself as the greatest living exponent of German music and believed it to be his duty to stay behind and serve a cause higher than what he took to be mere party politics. “Human beings are free wherever Wagner and Beethoven are played, and if they are not free at first, they are freed while listening to these works,” he naively assured a horrified Arturo Toscanini in 1937. “Music transports them to regions where the Gestapo can do them no harm.”
Once the war was over, the U.S. occupation forces decided to enlist the Berlin Philharmonic in the service of a democratic, anti-Soviet Germany. Furtwängler and Herbert von Karajan, who succeeded him as principal conductor, were officially “de-Nazified” and their orchestra allowed to function largely undisturbed, though six Nazi Party members were fired. The Vienna Philharmonic received similarly privileged treatment.
Needless to say, there was more to this decision than Cold War politics. No one questioned the unique artistic stature of either orchestra. Moreover, the Vienna Philharmonic, precisely because of its insularity, was now seen as a living museum piece, a priceless repository of 19th-century musical tradition. Still, many musicians and listeners, Jews above all, looked askance at both orchestras for years to come, believing them to be tainted by Nazism.
Indeed they were, so much so that they treated many of their surviving Jewish ex-members in a way that can only be described as vicious. In the most blatant individual case, the violinist Szymon Goldberg, who had served as the Berlin Philharmonic’s concertmaster under Furtwängler, was not allowed to reassume his post in 1945 and was subsequently denied a pension. As for the Vienna Philharmonic, the fact that it made Helmut Wobisch its executive director says everything about its deep-seated unwillingness to face up to its collective sins.
Be that as it may, scarcely any prominent musicians chose to boycott either orchestra. Leonard Bernstein went so far as to affect a flippant attitude toward the morally equivocal conduct of the Austro-German artists whom he encountered in Europe after the war. Upon meeting Herbert von Karajan in 1954, he actually told his wife Felicia that he had become “real good friends with von Karajan, whom you would (and will) adore. My first Nazi.”
At the same time, though, Bernstein understood what he was choosing to overlook. When he conducted the Vienna Philharmonic for the first time in 1966, he wrote to his parents:
I am enjoying Vienna enormously—as much as a Jew can. There are so many sad memories here; one deals with so many ex-Nazis (and maybe still Nazis); and you never know if the public that is screaming bravo for you might contain someone who 25 years ago might have shot me dead. But it’s better to forgive, and if possible, forget. The city is so beautiful, and so full of tradition. Everyone here lives for music, especially opera, and I seem to be the new hero.
Did Bernstein sell his soul for the opportunity to work with so justly renowned an orchestra—and did he get his price by insisting that its members perform the symphonies of Mahler, with which he was by then closely identified? It is a fair question, one that does not lend itself to easy answers.
Even more revealing is the case of Bruno Walter, who never forgave Furtwängler for staying behind in Germany, informing him in an angry letter that “your art was used as a conspicuously effective means of propaganda for the regime of the Devil.” Yet Walter’s righteous anger did not stop him from conducting in Vienna after the war. Born in Berlin, he had come to identify with the Philharmonic so closely that it was impossible for him to seriously consider quitting its podium permanently. “Spiritually, I was a Viennese,” he wrote in Theme and Variations, his 1946 autobiography. In 1952, he made a second recording with the Vienna Philharmonic of Mahler’s Das Lied von der Erde, whose premiere he had conducted in 1911 and which he had recorded in Vienna 15 years earlier. One wonders what Walter, who had converted to Christianity but had been driven out of both his native lands for the crime of being Jewish, made of the text of the last movement: “My friend, / On this earth, fortune has not been kind to me! / Where do I go?”
As for the two great orchestras of the Third Reich, both have finally acknowledged their guilt and been forgiven, at least by those who know little of their past. It would occur to no one to decline on principle to perform with either group today. Such a gesture would surely be condemned as morally ostentatious, an exercise in what we now call virtue-signaling. Yet it is impossible to forget what Samuel Lipman wrote in 1993 in Commentary apropos the wartime conduct of Furtwängler: “The ultimate triumph of totalitarianism, I suppose it can be said, is that under its sway only a martyred death can be truly moral.” For the only martyrs of the Berlin and Vienna Philharmonics were their Jews. The orchestras themselves live on, tainted and beloved.
He knows what to reveal and what to conceal, understands the importance of keeping the semblance of distance between oneself and the story of the day, and comprehends the ins and outs of anonymous sourcing. Within days of his being fired by President Trump on May 9, for example, little green men and women, known only as his “associates,” began appearing in the pages of the New York Times and Washington Post to dispute key points of the president’s account of his dismissal and to promote Comey’s theory of the case.
“In a Private Dinner, Trump Demanded Loyalty,” the New York Times reported on May 11. “Comey Demurred.” The story was a straightforward narrative of events from Comey’s perspective, capped with an obligatory denial from the White House. The next day, the Washington Post reported, “Comey associates dispute Trump’s account of conversations.” The Post did not identify Comey’s associates, other than saying that they were “people who have worked with him.”
Maybe they were the same associates who had gabbed to the Times. Or maybe they were different ones. Who can tell? Regardless, the story these particular associates gave to the Post was readable and gripping. Comey, the Post reported, “was wary of private meetings and discussions with the president and did not offer the assurance, as Trump has claimed, that Trump was not under investigation as part of the probe into Russian interference in last year’s election.”
On May 16, Michael S. Schmidt of the Times published his scoop, “Comey Memo Says Trump Asked Him to End Flynn Investigation.” Schmidt didn’t see the memo for himself. Parts of it were read to him by—you guessed it—“one of Mr. Comey’s associates.” The following day, Robert Mueller was appointed special counsel to oversee the Russia investigation. On May 18, the Times, citing “two people briefed” on a call between Comey and the president, reported, “Comey, Unsettled by Trump, Is Said to Have Wanted Him Kept at a Distance.” And by the end of that week, Comey had agreed to testify before the Senate Intelligence Committee.
As his testimony approached, Comey’s people became more aggressive in their criticisms of the president. “Trump Should Be Scared, Comey Friend Says,” read the headline of a CNN interview with Brookings Institution fellow Benjamin Wittes. This “Comey friend” said he was “very shocked” when he learned that President Trump had asked Comey for loyalty. “I have no doubt that he regarded the group of people around the president as dishonorable,” Wittes said.
Comey, Wittes added, was so uncomfortable at the White House reception in January honoring law enforcement—the one where Comey lumbered across the room and Trump whispered something in his ear—that, as CNN paraphrased it, he “stood in a position so that his blue blazer would blend in with the room’s blue drapes in an effort for Trump to not notice him.” The integrity, the courage—can you feel it?
On June 6, the day before Comey’s prepared testimony was released, more “associates” told ABC that the director would “not corroborate Trump’s claim that on three separate occasions Comey told the president he was not under investigation.” And a “source with knowledge of Comey’s testimony” told CNN the same thing. In addition, ABC reported that, according to “a source familiar with Comey’s thinking,” the former director would say that Trump’s actions stopped short of obstruction of justice.
Maybe those sources weren’t as “familiar with Comey’s thinking” as they thought or hoped? To maximize the press coverage he already dominated, Comey had authorized the Senate Intelligence Committee to release his testimony ahead of his personal interview. That testimony told a different story than what had been reported by CNN and ABC (and by the Post on May 12). Comey had in fact told Trump the president was not under investigation—on January 6, January 27, and March 30. Moreover, the word “obstruction” did not appear at all in his written text. The senators asked Comey if he felt Trump obstructed justice. He declined to answer either way.
My guess is that Comey’s associates lacked Comey’s scalpel-like, almost Jesuitical ability to make distinctions, and therefore misunderstood what he was telling them to say to the press. Because it’s obvious Comey was the one behind the stories of Trump’s dishonesty and bad behavior. He admitted as much in front of the cameras in a remarkable exchange with Senator Susan Collins of Maine.
Comey said that, after Trump tweeted on May 12 that he’d better hope there aren’t “tapes” of their conversations, “I asked a friend of mine to share the content of the memo with a reporter. Didn’t do it myself, for a variety of reasons. But I asked him to, because I thought that might prompt the appointment of a special counsel. And so I asked a close friend of mine to do it.”
Collins asked whether that friend had been Wittes, known to cable news junkies as Comey’s bestie. Comey said no. The source for the New York Times article was “a good friend of mine who’s a professor at Columbia Law School,” Daniel Richman.
Every time I watch or read that exchange, I am amazed. Here is the former director of the FBI just flat-out admitting that, for months, he wrote down every interaction he had with the president of the United States because he wanted a written record in case the president ever fired him or lied about him. And when the president did fire him and lie about him, that director set in motion a series of public disclosures with the intent of not only embarrassing the president, but also forcing the appointment of a special counsel who might end up investigating the president for who knows what. And none of this would have happened if the president had not fired Comey or tweeted about him. He told the Senate that if Trump hadn’t dismissed him, he most likely would still be on the job.
Rarely, in my view, are high officials so transparent in describing how Washington works. Comey revealed to the world that he was keeping a file on his boss, that he used go-betweens to get his story into the press, that “investigative journalism” is often just powerful people handing documents to reporters to further their careers or agendas or even to get revenge. And as long as you maintain some distance from the fallout, stick to the absolute letter of the law, and have a small army of nightingales singing to reporters on your behalf, you will come out on top.
“It’s the end of the Comey era,” A.B. Stoddard said on Special Report with Bret Baier the other day. On the contrary: I have a feeling that, as the Russia investigation proceeds, we will be hearing much more from Comey. And from his “associates.” And his “friends.” And persons “familiar with his thinking.”
In April, COMMENTARY asked a wide variety of writers, thinkers, and broadcasters to respond to this question: Is free speech under threat in the United States? We received twenty-seven responses. We publish them here in alphabetical order.
Floyd Abrams
Free expression threatened? By Donald Trump? I guess you could say so.
When a president engages in daily denigration of the press, when he characterizes it as the enemy of the people, when he repeatedly says that the libel laws should be “loosened” so he can personally commence more litigation, when he says that journalists shouldn’t be allowed to use confidential sources, it is difficult even to suggest that he has not threatened free speech. And when he says to the head of the FBI (as former FBI director James Comey has said that he did) that Comey should consider “putting reporters in jail for publishing classified information,” it is difficult not to take those threats seriously.
The harder question, though, is this: How real are the threats? Or, as Michael Gerson put it in the Washington Post: Will Trump “go beyond mere Twitter abuse and move against institutions that limit his power?” Some of the president’s threats against the institution of the press, wittingly or not, have been simply preposterous. Surely someone has told him by now that neither he nor Congress can “loosen” libel laws; while each state has its own libel law, there is no federal libel law and thus nothing for him to loosen. What he obviously takes issue with is the impact that the Supreme Court’s 1964 First Amendment opinion in New York Times v. Sullivan has had on state libel laws. The case determined that public officials who sue for libel may not prevail unless they demonstrate that the statements made about them were false and were made with actual knowledge or suspicion of that falsity. So his objection to the rules governing libel law is to nothing less than the application of the First Amendment itself.
In other areas, however, the Trump administration has far more power to imperil free speech. We live under an Espionage Act, adopted a century ago, which is both broad in its language and uncommonly vague in its meaning. As such, it remains a half-open door through which an administration that is hostile to free speech might walk. Such an administration could initiate criminal proceedings against journalists who write about defense- or intelligence-related topics on the basis that classified information was leaked to them by present or former government employees. No such action has ever been commenced against a journalist. Press lawyers and civil-liberties advocates have strong arguments that the law may not be read so broadly and still be consistent with the First Amendment. But the scope of the Espionage Act and the impact of the First Amendment upon its interpretation remain unknown.
A related area in which the attitude of an administration toward the press may affect the latter’s ability to function as a check on government relates to the ability of journalists to protect the identity of their confidential sources. The Obama administration prosecuted more Espionage Act cases against sources of information to journalists than all prior administrations combined. After a good deal of deserved press criticism, it agreed to expand the internal guidelines of the Department of Justice designed to limit the circumstances under which such source revelation is demanded. But the guidelines are none too protective and are, after all, simply guidelines. A new administration is free to change or limit them or, in fact, abandon them altogether. In this area, as in so many others, it is too early to judge the ultimate treatment of free expression by the Trump administration. But the threats are real, and there is good reason to be wary.
Floyd Abrams is the author of The Soul of the First Amendment (Yale University Press, 2017).
Ayaan Hirsi Ali
Freedom of speech is being threatened in the United States by a nascent culture of hostility to different points of view. As political divisions in America have deepened, a conformist mentality of “right thinking” has spread across the country. Increasingly, American universities, where no intellectual doctrine ought to escape critical scrutiny, are some of the most restrictive domains when it comes to asking open-ended questions on subjects such as Islam.
Legally, speech in the United States is protected to a degree unmatched in almost any industrialized country. The U.S. has avoided unpredictable Canadian-style restrictions on speech, for example. I remain optimistic that as long as we have the First Amendment in the U.S., any attempt at formal legal censorship will be vigorously challenged.
Culturally, however, matters are very different in America. The regressive left is the forerunner threatening free speech on any issue that is important to progressives. The current pressure coming from those who call themselves “social-justice warriors” is unlikely to lead to successful legislation to curb the First Amendment. Instead, censorship is spreading in the cultural realm, particularly at institutions of higher learning.
The way activists of the regressive left achieve silence or censorship is by creating a taboo, and one of the most pernicious taboos in operation today is the word “Islamophobia.” Islamists are similarly motivated to rule any critical scrutiny of Islamic doctrine out of order. There is now a university center (funded by Saudi money) in the U.S. dedicated to monitoring and denouncing incidences of “Islamophobia.”
The term “Islamophobia” is used against critics of political Islam, but also against progressive reformers within Islam. The term implies an irrational fear that is tainted by hatred, and it has had a chilling effect on free speech. In fact, “Islamophobia” is a poorly defined term. Islam is not a race, and it is very often perfectly rational to fear some expressions of Islam. No set of ideas should be beyond critical scrutiny.
To push back in this cultural realm—in our universities, in public discourse—those favoring free speech should focus more on the message of dawa, the set of ideas that the Islamists want to promote. If the aims of dawa are sufficiently exposed, ordinary Americans and Muslim Americans will reject it. The Islamist message is a message of divisiveness, misogyny, and hatred. It’s anachronistic and wants people to live by tribal norms dating from the seventh century. The best antidote to Islamic extremism is the revelation of what its primary objective is: a society governed by Sharia. This is the opposite of censorship: It is documenting reality. What is life like in Saudi Arabia, Iran, the Northern Nigerian States? What is the true nature of Sharia law?
Islamists want to hide the true meaning of Sharia, Jihad, and the implications for women, gays, religious minorities, and infidels under the veil of “Islamophobia.” Islamists use “Islamophobia” to obfuscate their vision and imply that any scrutiny of political Islam is hatred and bigotry. The antidote to this is more exposure and more speech.
As pressure on freedom of speech increases from the regressive left, we must reject the notions that only Muslims can speak about Islam, and that any critical examination of Islamic doctrines is inherently “racist.”
Instead of contorting Western intellectual traditions so as not to offend our Muslim fellow citizens, we need to defend the Muslim dissidents who are risking their lives to promote the human rights we take for granted: equality for women, tolerance of all religions and orientations, our hard-won freedoms of speech and thought.
It is by nurturing and protecting such speech that progressive reforms can emerge within Islam. By accepting the increasingly narrow confines of acceptable discourse on issues such as Islam, we do dissidents and progressive reformers within Islam a grave disservice. For truly progressive reforms within Islam to be possible, full freedom of speech will be required.
Ayaan Hirsi Ali is a research fellow at the Hoover Institution, Stanford University, and the founder of the AHA Foundation.
Lee C. Bollinger
I know it is too much to expect that political discourse mimic the measured, self-questioning, rational, footnoting standards of the academy, but there is a difference between robust political debate and political debate infected with fear or panic. The latter introduces a state of mind that is visceral and irrational. In the realm of fear, we move beyond the reach of reason and a sense of proportionality. When we fear, we lose the capacity to listen and can become insensitive and mean.
Our Constitution is well aware of this fact about the human mind and of its negative political consequences. In the First Amendment jurisprudence established over the past century, we find many expressions of the problematic state of mind that is produced by fear. Among the most famous and potent is that of Justice Brandeis in Whitney v. California in 1927, one of the many cases involving aggravated fears of subversive threats from abroad. “It is the function of (free) speech,” he said, “to free men from the bondage of irrational fears.” “Men feared witches,” Brandeis continued, “and burned women.”
Today, our “witches” are terrorists, and Brandeis’s metaphorical “women” include the refugees (mostly children) and displaced persons, immigrants, and foreigners whose lives have been thrown into suspension and doubt by policies of exclusion.
The same fears of the foreign that take hold of a population inevitably infect our internal interactions and institutions, yielding suppression of unpopular and dissenting voices, victimization of vulnerable groups, attacks on the media, and the rise of demagoguery, with its disdain for facts, reason, expertise, and tolerance.
All of this places a very special obligation on those of us within universities. Not only must we make the case in every venue for the values that form the core of who we are and what we do, but we must also live up to our own principles of free inquiry and fearless engagement with all ideas. This is why recent incidents at a handful of college campuses, in which speakers were disrupted and effectively censored, are so alarming. Such acts not only betray a basic principle but also inflame a rising prejudice against the academic community, and they feed efforts to delegitimize our work at the very moment when it’s most needed.
I do not for a second support the view that this generation has an unhealthy aversion to engaging differences of opinion. That is a modern trope of polarization, as is the portrayal of universities as hypocritical about academic freedom and political correctness. But now, in this environment especially, universities must be at the forefront of defending the rights of all students and faculty to listen to controversial voices, to engage disagreeable viewpoints, and to make every effort to demonstrate our commitment to the sort of fearless and spirited debate that we are simultaneously asking of the larger society. Anyone with a voice can shout over a speaker; but being able to listen to and then effectively rebut those with whom we disagree—particularly those who themselves peddle intolerance—is one of the greatest skills our education can bestow. And it is something our democracy desperately needs more of. That is why, I say to you now, if speakers who are being denied access to other campuses come here, I will personally volunteer to introduce them, and listen to them, however much I may disagree with them. But I will also never hesitate to make clear why I disagree with them.
Lee C. Bollinger is the 19th president of Columbia University and the author of Uninhibited, Robust, and Wide-Open: A Free Press for a New Century. This piece has been excerpted from President Bollinger’s May 17 commencement address.
Richard A. Epstein
Today, the greatest threat to the constitutional protection of freedom of speech comes from campus rabble-rousers who invoke this very protection. In their book, the speech of people like Charles Murray and Heather Mac Donald constitutes a form of violence, bordering on genocide, that receives no First Amendment protection. Enlightened protestors are both bound and entitled to shout them down, by force or other disruptive actions, if their universities are so foolish as to extend them an invitation to speak. Any indignant minority may take the law into its own hands to eradicate the intellectual cancer before it spreads on their own campus.
By such tortured logic, a new generation of vigilantes distorts First Amendment doctrine: Speech becomes violence, and violence becomes a heroic act of self-defense. The standard First Amendment interpretation emphatically rejects that view. Of course, the First Amendment doesn’t let you say whatever you want whenever and wherever you want. Your freedom of speech is subject to the same limitations as your freedom of action. So you have no constitutional license to assault other people, to lie to them, or to form cartels to bilk them in the marketplace. But folks such as Murray, Mac Donald, and even Yiannopoulos do not come close to crossing into that forbidden territory. They are not using, for example, “fighting words,” rightly limited to words or actions calculated to provoke immediate aggression against a known target. Fighting words are worlds apart from speech that provokes a negative reaction in those who find your speech offensive solely because of the content of its message.
This distinction is central to the First Amendment. Fighting words have to be blocked by well-tailored criminal and civil sanctions lest some people gain license to intimidate others from speaking or peaceably assembling. The remedy for mere offense is to speak one’s mind in response. But taking offense never gives anyone the right to block the speech of others, lest everyone be able to unilaterally increase his sphere of action by getting really angry about the beliefs of others. No one has the right to silence others by working himself into a fit of rage.
Obviously, it is intolerable to let mutual animosity generate factional warfare, whereby everyone can use force to silence rivals. To avoid this war of all against all, each side claims that only its actions are privileged. These selective claims quickly degenerate into a form of viewpoint discrimination, which undermines one of the central protections that traditional First Amendment law erects: a wall against each and every group out to destroy the level playing field on which robust political debate rests. Every group should be at risk for having its message fall flat. The new campus radicals want to upend that understanding by shutting down their adversaries if their universities do not. Their aggression must be met, if necessary, by counterforce. Silence in the face of aggression is not an acceptable alternative.
Richard A. Epstein is the Laurence A. Tisch Professor of Law at the New York University School of Law.
David French
We’re living in the midst of a troubling paradox. At the exact same time that First Amendment jurisprudence has arguably never been stronger and more protective of free expression, millions of Americans feel they simply can’t speak freely. Indeed, talk to Americans living and working in the deep-blue confines of the academy, Hollywood, and the tech sector, and you’ll get a sense of palpable fear. They’ll explain that they can’t say what they think and keep their jobs, their friends, and sometimes even their families.
The government isn’t cracking down or censoring; instead, Americans are using free speech to destroy free speech. For example, a social-media shaming campaign is an act of free speech. So is an economic boycott. So is turning one’s back on a public speaker. So is a private corporation firing a dissenting employee for purely political reasons. Each of these actions is largely protected from government interference, and each one represents an expression of the speaker’s ideas and values.
The problem, however, is obvious. The goal of each of these kinds of actions isn’t to persuade; it’s to intimidate. The goal isn’t to foster dialogue but to coerce conformity. The result is a marketplace of ideas that has been emptied of all but the approved ideological vendors—at least in those communities that are dominated by online thugs and corporate bullies. Indeed, this mindset has become so prevalent that in places such as Portland, Berkeley, Middlebury, and elsewhere, the bullies and thugs have crossed the line from protected—albeit abusive—speech into outright shout-downs and mob violence.
But there’s something else going on, something that’s insidious in its own way. While politically correct shaming still has great power in deep-blue America, its effect in the rest of the country is to trigger a furious backlash, one characterized less by a desire for dialogue and discourse than by its own rage and scorn. So we’re moving toward two Americas—one that ruthlessly (and occasionally illegally) suppresses dissenting speech and the other that is dangerously close to believing that the opposite of political correctness isn’t a fearless expression of truth but rather the fearless expression of ideas best calculated to enrage your opponents.
The result is a partisan feedback loop where right-wing rage spurs left-wing censorship, which spurs even more right-wing rage. For one side, a true free-speech culture is a threat to feelings, sensitivities, and social justice. The other side waves high the banner of “free speech” to sometimes elevate the worst voices to the highest platforms—not so much to protect the First Amendment as to infuriate the hated “snowflakes” and trigger the most hysterical overreactions.
The culturally sustainable argument for free speech is something else entirely. It reminds the cultural left of its own debt to free speech while reminding the political right that a movement allegedly centered around constitutional values can’t abandon the concept of ordered liberty. The culture of free speech thrives when all sides remember their moral responsibilities—to both protect the right of dissent and to engage in ideological combat with a measure of grace and humility.
David French is a senior writer at National Review.
Pamela Geller
The real question isn’t whether free speech is under threat in the United States, but rather, whether it’s irretrievably lost. Can we get it back? Not without war, I suspect, as is evidenced by the violence at colleges whenever there’s the shamefully rare event of a conservative speaker on campus.
Free speech is the soul of our nation and the foundation of all our other freedoms. If we can’t speak out against injustice and evil, those forces will prevail. Freedom of speech is the foundation of a free society. Without it, a tyrant can wreak havoc unopposed, while his opponents are silenced.
With that principle in mind, I organized a free-speech event in Garland, Texas. The world had recently been rocked by the murder of the Charlie Hebdo cartoonists. My version of “Je Suis Charlie” was an event here in America to show that we can still speak freely and draw whatever we like in the Land of the Free. Yet even after jihadists attacked our event, I was blamed—by Donald Trump among others—for provoking Muslims. And if I tried to hold a similar event now, no arena in the country would allow me to do so—not just because of the security risk, but because of the moral cowardice of all intellectual appeasers.
Under what law is it wrong to depict Muhammad? Under Islamic law. But I am not a Muslim, and I don’t live under Sharia. America isn’t under Islamic law, yet for standing for free speech, I’ve been:
- Prevented from running our advertisements in every major city in this country. We have won free-speech lawsuits all over the country, which officials circumvent by prohibiting all political ads (while making exceptions for ads from Muslim advocacy groups);
- Shunned by the right, shut out of the Conservative Political Action Conference;
- Shunned by Jewish groups at the behest of terror-linked groups such as the Council on American-Islamic Relations;
- Blacklisted from speaking at universities;
- Prevented from publishing books, for security reasons and because publishers fear shaming from the left;
- Banned from Britain.
A Seattle court accused me of trying to shut down free speech after we merely tried to run an FBI poster on global terrorism, because authorities had banned all political ads in other cities to avoid running ours. Seattle blamed us for that, which was like blaming a woman for being raped because she was wearing a short skirt.
This kind of vilification and shunning is key to the left’s plan to shut down all dissent from its agenda—it makes legislation restricting speech unnecessary.
The same refusal to allow our point of view to be heard has manifested itself elsewhere. The foundation of my work is individual rights and equality for all before the law. These are the foundational principles of our constitutional republic. That is now considered controversial. Truth is the new hate speech. Truth is going to be criminalized.
The First Amendment doesn’t only protect ideas that are sanctioned by the cultural and political elites. If “hate speech” laws are enacted, who would decide what’s permissible and what’s forbidden? The government? The gunmen in Garland?
There has been an inversion of the founding premise of this nation. No longer is it the subordination of might to right, but right to might. History is repeatedly deformed with the bloody consequences of this transition.
Pamela Geller is the editor in chief of the Geller Report and president of the American Freedom Defense Initiative.
Jonah Goldberg
Of course free speech is under threat in America. Frankly, it’s always under threat in America because it’s always under threat everywhere. Ronald Reagan was right when he said in 1961, “Freedom is never more than one generation away from extinction. We didn’t pass it on to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same.”
This is more than political boilerplate. Reagan identified the source of the threat: human nature. God may have endowed us with a right to liberty, but he didn’t give us all a taste for it. As with most finer things, we must work to acquire a taste for it. That is what civilization—or at least our civilization—is supposed to do: cultivate attachments to certain ideals. “Cultivate” shares the same Latin root as “culture,” cultus, and properly understood they mean the same thing: to grow, nurture, and sustain through labor.
In the past, threats to free speech have taken many forms—nationalist passion, Comstockery (both good and bad), political suppression, etc.—but the threat to free speech today is different. It is less top-down and more bottom-up. We are cultivating a generation of young people to reject free speech as an important value.
One could mark the beginning of the self-esteem movement with Nathaniel Branden’s 1969 paper, “The Psychology of Self-Esteem,” which claimed that “feelings of self-esteem were the key to success in life.” This understandable idea ran amok in our schools and in our culture. When I was a kid, Saturday-morning cartoons were punctuated with public-service announcements telling kids: “The most important person in the whole wide world is you, and you hardly even know you!”
The self-esteem craze was just part of the cocktail of educational fads. Other ingredients included multiculturalism, the anti-bullying crusade, and, of course, that broad phenomenon known as “political correctness.” Combined, they’ve produced a generation that rejects the old adage “sticks and stones can break my bones but words can never harm me” in favor of the notion that “words hurt.” What we call political correctness has been on college campuses for decades. But it lacked a critical mass of young people who were sufficiently receptive to it to make it a fully successful ideology. The campus commissars welcomed the new “snowflakes” with open arms; truly, these are the ones we’ve been waiting for.
“Words hurt” is a fashionable concept in psychology today. (See Psychology Today: “Why Words Can Hurt at Least as Much as Sticks and Stones.”) But it’s actually a much older idea than the “sticks and stones” aphorism. For most of human history, it was a crime to say insulting or “injurious” things about aristocrats, rulers, the Church, etc. That tendency didn’t evaporate with the Divine Right of Kings. Jonathan Haidt has written at book length about our natural capacity to create zones of sanctity, immune from reason.
And that is the threat free speech faces today. Those who inveigh against “hate speech” are in reality fighting “heresy speech”—ideas that do “violence” to sacred notions of self-esteem, racial or gender equality, climate change, and so on. Put whatever label you want on it, contemporary “social justice” progressivism acts as a religion, and it has no patience for blasphemy.
When Napoleon’s forces converted churches into stables, the clergy did not object on the grounds that regulations regarding the proper care and feeding of animals had been violated. They complained of sacrilege and blasphemy. When Charles Murray or Christina Hoff Sommers visits college campuses, the protestors are behaving like the zealous acolytes of St. Jerome. Appeals to the First Amendment have as much power over the “antifa” fanatics as appeals to Odin did to champions of the New Faith.
That is the real threat to free speech today.
Jonah Goldberg is a senior editor at National Review and a fellow at the American Enterprise Institute.
KC Johnson
In early May, the Washington Post urged universities to make clear that “racist signs, symbols, and speech are off-limits.” Given the extraordinarily broad definition of what constitutes “racist” speech at most institutions of higher education, this demand would single out most right-of-center (and, in some cases, even centrist and liberal) discourse on issues of race or ethnicity. The editorial provided the highest-profile example of how hostility to free speech, once confined to the ideological fringe on campus, has migrated to the liberal mainstream.
The last few years have seen periodic college protests—featuring claims that significant amounts of political speech constitute “violence,” thereby justifying censorship—followed by even more troubling attempts to appease the protesters. After the mob scene that greeted Charles Murray upon his visit to Middlebury College, for instance, the student government criticized any punishment for the protesters, and several student leaders wanted to require that future speakers conform to the college’s “community standard” on issues of race, gender, and ethnicity. In the last few months, similar attempts to stifle the free exchange of ideas in the name of promoting diversity occurred at Wesleyan, Claremont McKenna, and Duke. Offering an extreme interpretation of this point of view, one CUNY professor recently dismissed dialogue as “inherently conservative,” since it reinforced the “relations of power that presently exist.”
It’s easy, of course, to dismiss campus hostility to free speech as affecting only a small segment of American public life—albeit one that trains the next generation of judges, legislators, and voters. But, as Jonathan Chait observed in 2015, denying “the legitimacy of political pluralism on issues of race and gender” has broad appeal on the left. It is only most apparent on campus because “the academy is one of the few bastions of American life where the political left can muster the strength to impose its political hegemony upon others.” During his time in office, Barack Obama generally urged fellow liberals to support open intellectual debate. But the current campus environment previews the position of free speech in a post-Obama Democratic Party, increasingly oriented around identity politics.
Waning support on one end of the ideological spectrum for this bedrock American principle should provide a political opening for the other side. The Trump administration, however, seems poorly suited to make the case. Throughout his public career, Trump has rarely supported free speech, even in the abstract, and has periodically embraced legal changes to facilitate libel lawsuits. Moreover, the right-wing populism that motivates Trump’s base has a long tradition of ideological hostility to civil liberties of all types. Even in campus contexts, conservatives have defended free speech inconsistently, as seen in recent calls that CUNY disinvite anti-Zionist fanatic Linda Sarsour as a commencement speaker.
In a sharply polarized political environment, awash in dubiously sourced information, free speech is all the more important. Yet this same environment has seen both sides, most blatantly elements of the left on campuses, demand restrictions on their ideological foes’ free speech in the name of promoting a greater good.
KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.
Laura Kipnis
I find myself with a strange-bedfellows problem lately. Here I am, a left-wing feminist professor invited onto the pages of Commentary—though I’d be thrilled if it were still 1959—while fielding speaking requests from right-wing think tanks and libertarians who oppose child-labor laws.
Somehow I’ve ended up in the middle of the free-speech-on-campus debate. My initial crime was publishing a somewhat contentious essay about campus sexual paranoia that put me on the receiving end of Title IX complaints. Apparently I’d created a “hostile environment” at my university. I was investigated (for 72 days). Then I wrote up what I’d learned about these campus inquisitions in a second essay. Then I wrote about it all some more, in a book exposing the kangaroo-court elements of the Title IX process—and the extra-legal gag orders imposed on everyone caught in its widening snare.
I can’t really comment on whether more charges have been filed against me over the book. I’ll just say that writing about being a Title IX respondent could easily become a life’s work. I learned, shortly after writing this piece, that I and my publisher were being sued for defamation, among other things.
Is free speech under threat on American campuses? Yes. We know all about student activists who wish to shut down talks by people with opposing views. I got smeared with a bit of that myself, after a speaking invitation at Wellesley—some students made a video protesting my visit before I arrived. The talk went fine, though a group of concerned faculty circulated an open letter afterward also protesting the invitation: My views on sexual politics were too heretical, and might have offended students.
I didn’t take any of this too seriously, even as right-wing pundits crowed, with Wellesley as their latest outrage bait. It was another opportunity to mock student activists, and the fact that I was myself a feminist, rather than a Charles Murray or a Milo Yiannopoulos, made them positively gleeful.
I do find myself wondering where all my new free-speech pals were when another left-wing professor, Steven Salaita, was fired (or, if you prefer euphemism, “his job offer was withdrawn”) from the University of Illinois after he tweeted criticism of Israel’s Gaza policy. Sure, the tweets were hyperbolic, but hyperbole and strong opinions are protected speech, too.
I guess free speech is easy to celebrate until it actually challenges something. Funny, I haven’t seen Milo around lately—so beloved by my new friends when he was bashing minorities and transgender kids. Then he mistakenly said something authentic (who knew he was capable of it!), reminiscing about an experience a lot of gay men have shared: teenage sex with older men. He tried walking it back—no, no, he’d been a victim, not a participant—but his fan base was shrieking about pedophilia and fleeing in droves. Gee, they were all so against “political correctness” a few minutes before.
It’s easy to be a free-speech fan when your feathers aren’t being ruffled. No doubt what makes me palatable to the anti-PC crowd is having thus far failed to ruffle them enough. I’m just going to have to work harder.
Laura Kipnis’s latest book is Unwanted Advances: Sexual Paranoia Comes to Campus.
Eugene Kontorovich
The free and open exchange of views—especially politically conservative or traditionally religious ones—is being challenged. This is taking place not just at college campuses but throughout our public spaces and cultural institutions. James Watson was fired from the lab he had led since 1968 and could not speak at New York University because of petty, censorious students who would not know DNA from LSD. Our nation’s founders and heroes are being “disappeared” from public commemoration, like Trotsky from a photograph of Soviet rulers.
These attacks on “free speech” are not the result of government action. They are not what the First Amendment protects against. The current methods—professional and social shaming, exclusion, and employment termination—are more inchoate, and their effects are multiplied by self-censorship. A young conservative legal scholar might find himself thinking: “If the late Justice Antonin Scalia can posthumously be deemed a ‘bigot’ by many academics, what chance have I?”
Ironically, artists and intellectuals have long prided themselves on being the first defenders of free speech. Today, it is the institutions of both popular and high culture that are the censors. Is there one poet in the country who would speak out for Ann Coulter?
The inhibition of speech at universities is part of a broader social phenomenon of making longstanding, traditional views and practices sinful overnight. Conservatives have not put up much resistance to this. To paraphrase Martin Niemöller’s famous dictum: “First they came for Robert E. Lee, and I said nothing, because Robert E. Lee meant nothing to me.”
The situation with respect to Israel and expressions of support for it deserves separate discussion. Even as university administrators give political power to favored ideologies by letting them create “safe spaces” (safe from opposing views), Jews find themselves and their state at the receiving end of claims of apartheid—modern-day blood libels. It is not surprising if Jewish students react by demanding that they get a safe space of their own. It is even less surprising if their parents, paying $65,000 a year, want their children to have a nicer time of it. One hears Jewish groups frequently express concern about Jewish students feeling increasingly isolated and uncomfortable on campus.
But demanding selective protection from the new ideological commissars is unlikely to bring the desired results. First, this new ideology, even if it can be harnessed momentarily to give respite to harassed Jews on campus, is ultimately illiberal and will be controlled by “progressive” forces. Second, it is not so terrible for Jews in the Diaspora to feel a bit uncomfortable. It has been the common condition of Jews throughout the millennia. The social awkwardness that Jews at liberal arts schools might feel in being associated with Israel is of course one of the primary justifications for the Jewish State. Facing the snowflakes incapable of hearing a dissonant view—but who nonetheless, in the grip of intersectional ecstasy, revile Jewish self-determination—Jewish students should toughen up.
Eugene Kontorovich teaches constitutional law at Northwestern University and heads the international law department of the Kohelet Policy Forum in Jerusalem.
Nicholas Lemann
There’s an old Tom Wolfe essay in which he describes being on a panel discussion at Princeton in 1965 and provoking the other panelists by announcing that America, rather than being in crisis, is in the middle of a “happiness explosion.” He was arguing that the mass effects of 20 years of post–World War II prosperity made for a larger phenomenon than the Vietnam War, the racial crisis, and the other primary concerns of intellectuals at the time.
In the same spirit, I’d say that we are in the middle of a free-speech explosion, because of 20-plus years of the Internet and 10-plus years of social media. If one understands speech as disseminated individual opinion, then surely we live in the free-speech-est society in the history of the world. Anybody with access to the unimpeded World Wide Web can say anything to a global audience, and anybody can hear anything, too. All threats to free speech should be understood in the context of this overwhelming reality.
It is a comforting fantasy that a genuine free-speech regime will empower mainly “good,” but previously repressed, speech. Conversely, repressive regimes that are candid enough to explain their anti-free-speech policies usually say that they’re not against free speech, just “bad” speech. We have to accept that more free speech probably means, in the aggregate, more bad speech, and also a weakening of the power, authority, and economic support for information professionals such as journalists. Welcome to the United States in 2017.
I am lucky enough to live and work on the campus of a university, Columbia, that has been blessedly free of successful attempts to repress free speech. Just in the last few weeks, Charles Murray and Dinesh D’Souza have spoken here without incident. But, yes, the evidently growing popularity of the idea that “hate speech” shouldn’t be permitted on campuses is a problem, especially, it seems, at small private liberal-arts colleges. We should all do our part, and I do, by frequently and publicly endorsing free-speech principles. Opposing the BDS movement falls squarely into that category.
It’s not just on campuses that free-speech vigilance is needed, though. The number-one threat to free speech, to my mind, is that the wide-open Web has been replaced by privately owned platforms such as Facebook and Google as the way most people experience the public life of the Internet. These companies are committed to banning “hate speech,” and they are eager to operate freely in countries, like China, that don’t permit free political speech. That makes for a far more consequential constraint on expression than any campus speech code.
Also, Donald Trump regularly engages in presidentially unprecedented rhetoric demonizing people who disagree with him. He seems to think this is all in good fun, but, as we have already seen at his rallies, not everybody hears it that way. The place where Trumpism will endanger free speech isn’t in the center—the White House press room—but at the periphery, for example in the way that local police handle bumptious protestors and the journalists covering them. This is already happening around the country. If Trump were as disciplined and knowledgeable as Vladimir Putin or Recep Tayyip Erdogan, which so far he seems not to be, then free speech could be in even more serious danger from government, which in most places is its usual main enemy.
Nicholas Lemann is a professor at Columbia Journalism School and a staff writer for the New Yorker.
Michael J. Lewis
Free speech is a right but it is also a habit, and where the habit shrivels so will the right. If free speech today is in headlong retreat—everywhere threatened by regulation, organized harassment, and even violence—it is in part because our political culture allowed the practice of persuasive oratory to atrophy. The process began in 1973, an unforeseen side effect of Roe v. Wade. Legislators were delighted to learn that by relegating this divisive matter of public policy to the Supreme Court and adopting a merely symbolic position, they could sit all the more safely in their safe seats.
Since then, one crucial question of public policy after another has been punted out of the political realm and into the judicial. Issues that might have been debated with all the rhetorical agility of a Lincoln and a Douglas, and then subjected to a process of negotiation, compromise, and voting, have instead been settled by decree: e.g., Chevron, Kelo, Obergefell. The consequences for speech have been pernicious. Since the time of Pericles, deliberative democracy has been predicated on the art of persuasion, which demands the forceful clarity of thought and expression without which no one has ever been persuaded. But a legislature that relegates its authority to judges and regulators will awaken to discover its oratorical culture has been stunted. When politicians, rather than seeking to convince and win over, prefer to project a studied and pleasant vagueness, debate withers into tedious defensive performance. It has been decades since any presidential debate has seen any sustained give and take over a matter of policy. If there is any suspense at all, it is only the possibility that a fatigued or peeved candidate might blurt out that tactless shard of truth known as a gaffe.
A generation accustomed to hearing platitudes smoothly dispensed from behind a teleprompter will find the speech of a fearless extemporaneous speaker to be startling, even disquieting; unfamiliar ideas always are. Unhappily, they have been taught to interpret that disquiet as an injury done to them, rather than as a premise offered to them to consider. All this would not have happened—certainly not to this extent—had not our deliberative democracy decided a generation ago that it preferred the security of incumbency to the risks of unshackled debate. The compulsory contraction of free speech on college campuses is but the logical extension of the voluntary contraction of free speech in our political culture.
Michael J. Lewis’s new book is City of Refuge: Separatists and Utopian Town Planning (Princeton University Press).
Heather Mac Donald
The answer to the symposium question depends on how powerful the transmission belt is between academia and the rest of the country. On college campuses, violence and brute force are silencing speakers who challenge left-wing campus orthodoxies. These totalitarian outbreaks have been met with listless denunciations by college presidents, followed by . . . virtually nothing. As of mid-May, the only discipline imposed for 2017’s mass attacks on free speech at UC Berkeley, Middlebury, and Claremont McKenna College was a letter of reprimand inserted—sometimes only temporarily—into the files of several dozen Middlebury students, accompanied by a brief period of probation. Previous outbreaks of narcissistic incivility, such as the screaming-girl fit at Yale and the assaults on attendees of Yale’s Buckley program, were discreetly ignored by college administrators.
Meanwhile, the professoriate unapologetically defends censorship and violence. After the February 1 riot in Berkeley to prevent Milo Yiannopoulos from speaking, Déborah Blocker, associate professor of French at UC Berkeley, praised the rioters. They were “very well-organized and very efficient,” Blocker reported admiringly to her fellow professors. “They attacked property but they attacked it very sparingly, destroying just enough University property to obtain the cancellation order for the MY event and making sure no one in the crowd got hurt” (emphasis in original). (In fact, perceived Milo and Donald Trump supporters were sucker-punched and maced; businesses downtown were torched and vandalized.) New York University’s vice provost for faculty, arts, humanities, and diversity, Ulrich Baer, displayed Orwellian logic by claiming in a New York Times op-ed that shutting down speech “should be understood as an attempt to ensure the conditions of free speech for a greater group of people.”
Will non-academic institutions take up this zeal for outright censorship? Other ideological products of the left-wing academy have been fully absorbed and operationalized. Racial victimology, which drives much of the campus censorship, is now standard in government and business. Corporate diversity trainers counsel that bias is responsible for any lack of proportional racial representation in the corporate ranks. Racial disparities in school discipline and incarceration are universally attributed to racism rather than to behavior. Public figures have lost jobs for violating politically correct taboos.
Yet Americans possess an instinctive commitment to the First Amendment. Federal judges, hardly an extension of the Federalist Society, have overwhelmingly struck down campus speech codes. It is hard to imagine that they would be any more tolerant of the hate-speech legislation so prevalent in Europe. So the question becomes: At what point does the pressure to conform to the elite worldview curtail freedom of thought and expression, even without explicit bans on speech?
Social stigma against conservative viewpoints is not the same as actual censorship. But the line can blur. The Obama administration used regulatory power to impose a behavioral conformity on public and private entities. School administrators may have technically still possessed the right to dissent from novel theories of gender, but they had to behave as if they were fully on board with the transgender revolution when it came to allowing boys to use girls’ bathrooms and locker rooms.
Had Hillary Clinton been elected president, the federal bureaucracy would have mimicked campus diversocrats with even greater zeal. That threat, at least, has been avoided. Heresies against left-wing dogma may still enter the public arena, if only by the back door. The mainstream media have lurched even further left in the Trump era, but the conservative media, however mocked and marginalized, are expanding (though Twitter and Facebook’s censorship of conservative speakers could be a harbinger of more official silencing).
Outside the academy, free speech is still legally protected, but its exercise requires ever greater determination.
Heather Mac Donald is a fellow at the Manhattan Institute and the author of The War on Cops.
John McWhorter
There is a certain mendacity, as Brick put it in Cat on a Hot Tin Roof, in our discussion of free speech on college campuses. Namely, none of us genuinely wish that absolutely all issues be aired in the name of education and open-mindedness. To insist so is to pretend that civilized humanity makes nothing we could call advancement in philosophical consensus.
I doubt we need “free speech” on issues such as whether slavery and genocide are okay, whether it has been a mistake to view women as men’s equals, or whether whites are a master race while other peoples represent a lower rung on the Darwinian scale—an idea we may safely banish as antique. With all due reverence for John Stuart Mill’s advocacy for the regular airing of even noxious views in order to reinforce clarity on why they were rejected, we are also human beings with limited time. A commitment to the Enlightenment justifiably will decree that certain views are, indeed, no longer in need of discussion.
However, our modern social-justice warriors are claiming that this no-fly zone of discussion is vaster than any conception of logic or morality justifies. We are being told that questions regarding the modern proposals about cultural appropriation, about whether even passing infelicitous statements constitute racism in the way that formalized segregation and racist disparagement did, or about whether social disparities can be due to cultural legacies rather than structural impediments, are as indisputably egregious, backwards, and abusive as the benighted views of the increasingly distant past.
That is, the new idea is not only that discrimination and inequality still exist, but that to even question the left’s utopian expectation on such matters justifies the same furious, sloganistic and even physically violent resistance that was once levelled against those designated heretics by a Christian hegemony.
Of course the protesters in question do not recognize themselves in a portrait as opponents of something called heresy. They suppose that Galileo’s opponents were clearly wrong but that they, today, are actually correct in a way that no intellectual or moral argument could coherently deny.
As such, we have students allowed to decree college campuses as “racist” when they are the least racist spaces on the planet—because they are, predictably given the imperfection of humans, not perfectly free of passingly unsavory interactions. Thinkers invited to talk for a portion of an hour from the right rather than the left and then have dinner with a few people and fly home are treated as if they were reanimated Hitlers. The student of color who hears a few white students venturing polite questions about the leftist orthodoxy is supported in fashioning these questions as “racist” rhetoric.
The people on college campuses who openly and aggressively spout this new version of Christian (or even Islamist) crusading—ironically justifying it as a barricade against “fascist” muzzling of freedom when the term applies ominously well to the regime they are fostering—are a minority. However, the sawmill spinning blade of their rhetoric has succeeded in rendering opposition as risky as espousing pedophilia, such that only those natively open to violent criticism dare speak out. The latter group is small. The campus consensus thereby becomes, if only at moralistic gunpoint à la the ISIS victim video, a strangled hard-leftism.
Hence freedom of speech is indeed threatened on today’s college campuses. I have lost count of how many of my students, despite being liberal Democrats (many of whom sobbed at Hillary Clinton’s loss last November), have told me that they are afraid to express their opinions about issues that matter, despite the fact that their opinions are ones that any liberal or even leftist person circa 1960 would have considered perfectly acceptable.
Something has shifted of late, and not in a direction we can legitimately consider forwards.
John McWhorter teaches linguistics, philosophy, and music history at Columbia University and is the author of The Language Hoax, Words on the Move, and Talking Back, Talking Black.
Kate Bachelder Odell
It’s 2021, and Harvard Square has devolved into riots: Some 120 people are injured in protests, and the carnage includes fire-consumed cop cars and smashed-in windows. The police discharge canisters of tear gas, and, after apprehending dozens of protesters, enforce a 1:45 A.M. curfew. Anyone roaming the streets after hours is subject to arrest. About 2,000 National Guardsmen are prepared to intervene. Such violence and disorder is also roiling Berkeley and other elite and educated areas.
Oh, that’s 1970. The details are from the Harvard Crimson’s account of “anti-war” riots that spring. The episode is instructive in considering whether free speech is under threat in the United States. Almost daily, there’s a new YouTube installment of students melting down over viewpoints of speakers invited to one campus or another. Even amid speech threats from government—for example, the IRS’s targeting of political opponents—nothing has captured the public’s attention like the end of free expression at America’s institutions of higher learning.
Yet disruption, confusion, and even violence are not new campus phenomena. And it’s hard to imagine that young adults who deployed brute force in the 1960s and ’70s were deeply committed to the open and peaceful exchange of ideas.
There may also be reason for optimism. The rough-and-tumble on campus in the 1960s and ’70s produced a more even-tempered ’80s and ’90s, and colleges are probably heading for another course correction. In covering the ruckuses at Yale, Missouri, and elsewhere, I’ve talked to professors and students who are figuring out how to respond to the illiberalism, even if the reaction is delayed. The University of Chicago put out a set of free-speech principles last year, and other schools such as Princeton and Purdue have endorsed them.
The NARPs—Non-Athletic Regular People, as they are sometimes known on campus—still outnumber the social-justice warriors, who appear to be overplaying their hand. Case in point is the University of Missouri, which experienced a precipitous drop in enrollment after instructor Melissa Click and her ilk stoked racial tensions last spring. The college has closed dorms and trimmed budgets. Which brings us to another silver lining: The economic model of higher education (exorbitant tuition to pay ever more administrators) may blow up traditional college before the fascists can.
Note also that the anti-speech movement is run by rich kids. A Brookings Institution analysis from earlier this year discovered that “the average enrollee at a college where students have attempted to restrict free speech comes from a family with an annual income $32,000 higher than that of the average student in America.” Few rank higher in average income than those at Middlebury College, where students evicted scholar Charles Murray in a particularly ugly scene. (The report notes that Murray was received respectfully at Saint Louis University, “where the median income of students’ families is half Middlebury’s.”) The impulses of over-adulated 20-year-olds may soon be tempered by the tyranny of having to show up for work on a daily basis.
None of this is to suggest that free speech is enjoying some renaissance either on campus or in America. But perhaps as the late Wall Street Journal editorial-page editor Robert Bartley put it in his valedictory address: “Things could be worse. Indeed, they have been worse.”
Kate Bachelder Odell is an editorial writer for the Wall Street Journal.
Jonathan Rauch
Is free speech under threat? The one-syllable answer is “yes.” The three-syllable answer is: “Yes, of course.” Free speech is always under threat, because it is not only the single most successful social idea in all of human history, it is also the single most counterintuitive. “You mean to say that speech that is offensive, untruthful, malicious, seditious, antisocial, blasphemous, heretical, misguided, or all of the above deserves government protection?” That seemingly bizarre proposition is defensible only on the grounds that the marketplace of ideas turns out to be the most powerful engine of knowledge, prosperity, liberty, social peace, and moral advancement that our species has had the good fortune to discover.
Every new generation of free-speech advocates will need to get up every morning and re-explain the case for free speech and open inquiry—today, tomorrow, and forever. That is our lot in life, and we just need to be cheerful about it. At discouraging moments, it is helpful to remember that the country has made great strides toward free speech since 1798, when the Adams administration arrested and jailed its political critics; and since the 1920s, when the U.S. government banned and burned James Joyce’s great novel Ulysses; and since 1954, when the government banned ONE, a pioneering gay journal. (The cover article was a critique of the government’s indecency censors, who censored it.) None of those things could happen today.
I suppose, then, the interesting question is: What kind of threat is free speech under today? In the present age, direct censorship by government bodies is rare. Instead, two more subtle challenges hold sway, especially, although not only, on college campuses. The first is a version of what I called, in my book Kindly Inquisitors, the humanitarian challenge: the idea that speech that is hateful or hurtful (in someone’s estimation) causes pain and thus violates others’ rights, much as physical violence does. The other is a version of what I called the egalitarian challenge: the idea that speech that denigrates minorities (again, in someone’s estimation) perpetuates social inequality and oppression and thus also is a rights violation. Both arguments call upon administrators and other bureaucrats to defend human rights by regulating speech rights.
Both doctrines are flawed to the core. Censorship harms minorities by enforcing conformity and entrenching majority power, and it no more ameliorates hatred and injustice than smashing thermometers ameliorates global warming. If unwelcome words are the equivalent of bludgeons or bullets, then the free exchange of criticism—science, in other words—is a crime. I could go on, but suffice it to say that the current challenges are new variations on ancient themes—and they will be followed, in decades and centuries to come, by many, many other variations. Memo to free-speech advocates: Our work is never done, but the really amazing thing, given the proposition we are tasked to defend, is how well we are doing.
Jonathan Rauch is a senior fellow at the Brookings Institution and the author of Kindly Inquisitors: The New Attacks on Free Thought.
Nicholas Quinn Rosenkranz
Speech is under threat on American campuses as never before. Censorship in various forms is on the rise. And this year, the threat to free speech on campus took an even darker turn, toward actual violence. The prospect of Milo Yiannopoulos speaking at Berkeley provoked riots that caused more than $100,000 worth of property damage on the campus. The prospect of Charles Murray speaking at Middlebury led to a riot that put a liberal professor in the hospital with a concussion. Ann Coulter’s speech at Berkeley was cancelled after the university determined that none of the appropriate venues could be protected from “known security threats” on the date in question.
The free-speech crisis on campus is caused, at least in part, by a more insidious campus pathology: the almost complete lack of intellectual diversity on elite university faculties. At Yale, for example, the number of registered Republicans in the economics department is zero; in the psychology department, there is one. Overall, there are 4,410 faculty members at Yale, and the total number of those who donated to a Republican candidate during the 2016 primaries was three.
So when today’s students purport to feel “unsafe” at the mere prospect of a conservative speaker on campus, it may be easy to mock them as “delicate snowflakes,” but in one sense, their reaction is understandable: If students are shocked at the prospect of a Republican behind a university podium, perhaps it is because many of them have never before laid eyes on one.
To see the connection between free speech and intellectual diversity, consider the recent commencement speech of Harvard President Drew Gilpin Faust:
Universities must be places open to the kind of debate that can change ideas…. Silencing ideas or basking in intellectual orthodoxy independent of facts and evidence impedes our access to new and better ideas, and it inhibits a full and considered rejection of bad ones…. We must work to ensure that universities do not become bubbles isolated from the concerns and discourse of the society that surrounds them. Universities must model a commitment to the notion that truth cannot simply be claimed, but must be established—established through reasoned argument, assessment, and even sometimes uncomfortable challenges that provide the foundation for truth.
Faust is exactly right. But, alas, her commencement audience might be forgiven a certain skepticism. After all, the number of registered Republicans in several departments at Harvard—e.g., history and psychology—is exactly zero. In those departments, the professors themselves may be “basking in intellectual orthodoxy” without ever facing “uncomfortable challenges.” This may help explain why some students will do everything in their power to keep conservative speakers off campus: They notice that faculty hiring committees seem to do exactly the same thing.
In short, it is a promising sign that true liberal academics like Faust have started speaking eloquently about the crucial importance of civil, reasoned disagreement. But they will be more convincing on this point when they hire a few colleagues with whom they actually disagree.
Nicholas Quinn Rosenkranz is a professor of law at Georgetown. He serves on the executive committee of Heterodox Academy, which he co-founded, on the board of directors of the Federalist Society, and on the board of directors of the Foundation for Individual Rights in Education (FIRE).
Ben Shapiro
In February, I spoke at California State University, Los Angeles. Before my arrival, professors informed students that a white supremacist would be descending on the school to preach hate; threats of violence soon prompted the administration to cancel the event. I vowed to show up anyway. One hour before the event, the administration backed down and promised to guarantee that the event could go forward, but police officers were told not to stop the 300 students, faculty, and outside protesters who blocked and assaulted those who attempted to attend the lecture. We ended up trapped in the auditorium, with the authorities telling students not to leave for fear of physical violence. I was rushed from campus under armed police guard.
Is free speech under assault?
Of course it is.
On campus, free speech is under assault thanks to a perverse ideology of intersectionality that claims victim identity is of primary value and that views are merely a secondary concern. As a corollary, if your views offend someone who outranks you on the intersectional hierarchy, your views are treated as violence—threats to identity itself. On campus, statements that offend an individual’s identity have been treated as “microaggressions”: actual aggressions against another, ostensibly worthy of violence. Words, students have been told, may not break bones, but they will prompt sticks and stones, and rightly so.
Thus, protesters around the country—leftists who see verbiage as violence—have, in turn, used violence in response to ideas they hate. Leftist local authorities then use the threat of violence as an excuse to ideologically discriminate against conservatives. This means public intellectuals like Charles Murray being run off campus and his leftist professorial cohort viciously assaulted; it means Ann Coulter being targeted for violence at Berkeley; it means universities preemptively banning me and Ayaan Hirsi Ali and Condoleezza Rice and even Jason Riley.
The campus attacks on free speech are merely the most extreme iteration of an ideology that spans from left to right: the notion that your right to free speech ends where my feelings begin. Even Democrats who say that Ann Coulter should be allowed to speak at Berkeley say that nobody should be allowed to contribute to a super PAC (unless they’re a union member, naturally).
Meanwhile, on the right, the president’s attacks on the press have convinced many Republicans that restrictions on the press wouldn’t be altogether bad. A Vanity Fair/60 Minutes poll in late April found that 36 percent of Americans thought freedom of the press “does more harm than good.” Undoubtedly, some of that is due to the media’s obvious bias. CNN’s Jeff Zucker has targeted the Trump administration for supposedly quashing journalism, but he was silent when the Obama administration’s Department of Justice cracked down on reporters from the Associated Press and Fox News, and when hacks like Deputy National Security Adviser Ben Rhodes openly sold lies regarding Iran. But for some on the right, the response to press falsities hasn’t been to call for truth, but to instead echo Trumpian falsehoods in the hopes of damaging the media. Free speech is only important when people seek the truth. Leftists traded truth for tribalism long ago; in response, many on the right seem willing to do the same. Until we return to a common standard under which facts matter, free speech will continue to rest on tenuous grounds.
Ben Shapiro is the editor in chief of The Daily Wire and the host of The Ben Shapiro Show.
Judith Shulevitz
It’s tempting to blame college and university administrators for the decline of free speech in America, and for years I did just that. If the guardians of higher education won’t inculcate the habits of mind required for serious thinking, I thought, who will? The unfettered but civil exchange of ideas is the basic operation of education, just as addition is the basic operation of arithmetic. And universities have to teach both the unfettered part and the civil part, because arguing in a respectful manner isn’t something anyone does instinctively.
So why change my mind now? Schools still cling to speech codes, and there still aren’t enough deans like the one at the University of Chicago who declared his school a safe-space-free zone. My alma mater just handed out prizes for “enhancing race and/or ethnic relations” to two students caught on video harassing the dean of their residential college, one screaming at him that he’d created “a space for violence to happen,” the other placing his face inches away from the dean’s and demanding, “Look at me.” All this because they deemed a thoughtful if ill-timed letter about Halloween costumes written by the dean’s wife to be an act of racist aggression. Yale should discipline students who behave like that, even if they’re right on the merits (I don’t think they were, but that’s not the point). They certainly don’t deserve awards. I can’t believe I had to write that sentence.
But in abdicating their responsibilities, the universities have enabled something even worse than an attack on free speech. They’ve unleashed an assault on themselves. There’s plenty of free speech around; we know that because so much bad speech—low-minded nonsense—tests our constitutional tolerance daily, and that’s holding up pretty well. (As Nicholas Lemann observes elsewhere in this symposium, Facebook and Google represent bigger threats to free speech than students and administrators.) What’s endangered is good speech.
Universities were setting themselves up to be used. Provocateurs exploit the atmosphere on campus to goad overwrought students, then gleefully trash the most important bastion of our crumbling civil society. Higher education and everything it stands for—logical argument, the scientific method, epistemological rigor—start to look illegitimate. Voters perceive tenure and research and higher education itself as hopelessly partisan and unworthy of taxpayers’ money.
The press is a secondary victim of this process of delegitimization. If serious inquiry can be waved off as ideology, then facts won’t be facts and reporting can’t be trusted. All journalism will be equal to all other journalism, and all journalists will be reduced to pests you can slam to the ground with near impunity. Politicians will be able to say anything and do just about anything and there will be no countervailing authority to challenge them. I’m pretty sure that that way lies Putinism and Erdoganism. And when we get to that point, I’m going to start worrying about free speech again.
Judith Shulevitz is a critic in New York.
Harvey Silverglate
Free speech is, and has always been, threatened. The title of Nat Hentoff’s 1993 book Free Speech for Me—But Not for Thee is no less true today than at any time, even as the Supreme Court has accorded free speech a more absolute degree of protection than in any previous era.
Since the 1980s, the high court has decided most major free-speech cases in favor of speech, with most of the major decisions being unanimous or nearly so.
Women’s-rights advocates were turned back by the high court in 1986 when they sought to ban the sale of printed materials that, because some deemed them pornographic, were alleged to promote violence against women. Censorship in the name of gender-based protection thus failed to gain traction.
Despite the demands of civil-rights activists, the Supreme Court in 1992 declared cross-burning to be a protected form of expression in R.A.V. v. City of St. Paul, a decision later refined to carve out a narrow exception for cross-burning that functions primarily as a physical threat rather than merely as an expression of hatred.
Other attempts at First Amendment circumvention have been met with equally decisive rebuff. When the Reverend Jerry Falwell sued Hustler magazine publisher Larry Flynt for defamation growing out of a parody depicting Falwell’s first sexual encounter as a drunken tryst with his mother in an outhouse, a unanimous Supreme Court lectured on the history of parody as a constitutionally protected, even if cruel, form of social and political criticism.
When the South Boston Allied War Veterans, sponsor of Boston’s Saint Patrick’s Day parade, sought to exclude a gay veterans’ group from marching under its own banner, the high court unanimously held that as a private entity, even though marching in public streets, the Veterans could exclude any group marching under a banner conflicting with the parade’s socially conservative message, notwithstanding public-accommodations laws. The gay group could have its own parade but could not rain on that of the conservatives.
Despite such legal clarity, today’s most potent attacks on speech are coming, ironically, from liberal-arts colleges. Ubiquitous “speech codes” limit speech that might insult, embarrass, or “harass,” in particular, members of “historically disadvantaged” groups. “Safe spaces” and “trigger warnings” protect purportedly vulnerable students from hearing words and ideas they might find upsetting. Student demonstrators and threats of violence have forced the cancellation of controversial speakers, left and right.
It remains unclear how much campus censorship results from politically correct faculty, control-obsessed student-life administrators, or students socialized and indoctrinated into intolerance. My experience suggests that the bureaucrats are primarily, although not entirely, to blame. When sued, colleges either lose or settle, pay a modest amount, and then return to their censorious ways.
This trend threatens the heart and soul of liberal education. Eventually it could infect the entire society as these students graduate and assume influential positions. Whether a resulting flood of censorship ultimately overcomes legal protections and weakens democracy remains to be seen.
Harvey Silverglate, a Boston-based lawyer and writer, is the co-author of The Shadow University: The Betrayal of Liberty on America’s Campuses (Free Press, 1998). He co-founded the Foundation for Individual Rights in Education in 1999 and is on FIRE’s board of directors. He spent some three decades on the board of the ACLU of Massachusetts, two of those years as chairman. Silverglate taught at Harvard Law School for a semester during a sabbatical he took in the mid-1980s.
Christina Hoff Sommers
When Heather Mac Donald’s “blue lives matter” talk was shut down by a mob at Claremont McKenna College, the president of neighboring Pomona College sent out an email defending free speech. Twenty-five students shot back a response: “Heather Mac Donald is a fascist, a white supremacist . . . classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live.”
Some blame the new campus intolerance on hypersensitive, over-trophied millennials. But the students who signed that letter don’t appear to be fragile. Nor do those who recently shut down lectures at Berkeley, Middlebury, DePaul, and Cal State LA. What they are is impassioned. And their passion is driven by a theory known as intersectionality.
Intersectionality is the source of the new preoccupation with microaggressions, cultural appropriation, and privilege-checking. It’s the reason more than 200 colleges and universities have set up Bias Response Teams. Students who overhear potentially “otherizing” comments or jokes are encouraged to make anonymous reports to their campus BRTs. A growing number of professors and administrators have built their careers around intersectionality. What is it exactly?
Intersectionality is a neo-Marxist doctrine that views racism, sexism, ableism, heterosexism, and all forms of “oppression” as interconnected and mutually reinforcing. Together these “isms” form a complex arrangement of advantages and burdens. A white woman is disadvantaged by her gender but advantaged by her race. A Latino is burdened by his ethnicity but privileged by his gender. According to intersectionality, American society is a “matrix of domination,” with affluent white males in control. Not only do they enjoy most of the advantages, they also determine what counts as “truth” and “knowledge.”
But marginalized identities are not without resources. According to one of intersectionality’s leading theorists, Patricia Hill Collins (former president of the American Sociological Association), disadvantaged groups have access to deeper, more liberating truths. To find their voice, and to enlighten others to the true nature of reality, they require a safe space—free of microaggressive put-downs and imperious cultural appropriations. Here they may speak openly about their “lived experience.” Lived experience, according to intersectional theory, is a better guide to the truth than self-serving Western and masculine styles of thinking. So don’t try to refute intersectionality with logic or evidence: That only proves that you are part of the problem it seeks to overcome.
How could comfortably ensconced college students be open to a convoluted theory that describes their world as a matrix of misery? Don’t they flinch when they hear intersectional scholars like bell hooks refer to the U.S. as an “imperialist, white-supremacist, capitalist patriarchy”? Most take it in stride because such views are now commonplace in high-school history and social studies texts. And the idea that knowledge comes from lived experience rather than painstaking study and argument is catnip to many undergrads.
Silencing speech and forbidding debate is not an unfortunate by-product of intersectionality—it is a primary goal. How else do you dismantle a lethal system of oppression? As the protesting students at Claremont McKenna explained in their letter: “Free speech . . . has given those who seek to perpetuate systems of domination a platform to project their bigotry.” To the student activists, thinkers like Heather Mac Donald and Charles Murray are agents of the dominant narrative, and their speech is “a form of violence.”
It is hard to know how our institutions of higher learning will find their way back to academic freedom, open inquiry, and mutual understanding. But as long as intersectional theory goes unchallenged, campus fanaticism will intensify.
Christina Hoff Sommers is a resident scholar at the American Enterprise Institute. She is the author of several books, including Who Stole Feminism? and The War Against Boys. She also hosts The Factual Feminist, a video blog. @Chsommers
John Stossel
Yes, some college students do insane things. Some called police when they saw “Trump 2016” chalked on sidewalks. The vandals at Berkeley and the thugs who assaulted Charles Murray are disgusting. But they are a minority. And these days people fight back.
Someone usually videotapes the craziness. Yale’s “Halloween costume incident” drove away two sensible instructors, but videos mocking Yale’s snowflakes, like “Silence U,” make such abuse less likely. Groups like Young America’s Foundation (YAF) publicize censorship, and the Foundation for Individual Rights in Education (FIRE) sues schools that restrict speech.
Consciousness has been raised. On campus, the worst is over. Free speech has always been fragile. I once took cameras to Seton Hall Law School right after a professor gave a lecture on free speech. Students seemed to get the concept. Sean, now a lawyer, said, “Protect freedom for thought we hate; otherwise you never have a society where ideas clash, and we come up with the best idea.” So I asked, “Should there be any limits?” Students listed “fighting words,” “shouting fire in a theater,” malicious libel, etc.—reasonable court-approved exceptions. But then they went further. Several wanted bans on “hate” speech. “No value comes out of hate speech,” said Javier. “It inevitably leads to violence.”
“No, it doesn’t,” I argued. “Also, doesn’t hate speech bring ideas into the open, so you can better argue about them, bringing you to the truth?”
“No,” replied Floyd. “With hate speech, more speech is just violence.”
So I pulled out a big copy of the First Amendment and wrote, “exception: hate speech.”
Two students wanted a ban on flag desecration “to respect those who died to protect it.”
One wanted bans on blasphemy:
“Look at the gravity of the harm versus the value in blasphemy—the harm outweighs the value.”
Several wanted a ban on political speech by corporations because of “the potential for large corporations to improperly influence politicians.”
Finally, Jillian, also now a lawyer, wanted hunting videos banned.
“It encourages harm down the road.”
I asked her, incredulously, “You’re comfortable locking up people who make a hunting film?”
“Oh, yeah,” she said. “It’s unnecessary cruelty to feeling and sentient beings.”
So, I picked up my copy of the Bill of Rights again. After “no law . . . abridging freedom of speech,” I added: “Except hate speech, flag burning, blasphemy, corporate political speech, depictions of hunting . . . ”
That embarrassed them. “We may have gone too far,” said Sean. Others agreed. One said, “Cross out the exceptions.” Free speech survived, but it was a close call. Respect for unpleasant speech will always be thin. Then-Senator Hillary Clinton wanted violent video games banned. John McCain and Russ Feingold tried to ban political speech. Donald Trump wants new libel laws, and if you burn a flag, he tweeted, consequences might be “loss of citizenship or a year in jail!” Courts or popular opinion killed those bad ideas.
Free speech will survive, assuming those of us who appreciate it use it to fight those who would smother it.
John Stossel is a FOX News/FOX Business Network Contributor.
Warren Treadgold
Even citizens of dictatorships are free to praise the regime and to talk about the weather. The only speech likely to be threatened anywhere is the sort that offends an important and intolerant group. What is new in America today is a leftist ideology that threatens speech precisely because it offends certain important and intolerant groups: feminists and supposedly oppressed minorities.
So far this new ideology is clearly dominant only in colleges and universities, where it has become so strong that most controversies concern outside speakers invited by students, not faculty speakers or speakers invited by administrators. Most academic administrators and professors are either leftists or have learned not to oppose leftism; otherwise they would probably never have been hired. Administrators treat even violent leftist protestors with respect and are ready to prevent conservative and moderate outsiders from speaking rather than provoke protests. Most professors who defend conservative or moderate speakers argue that the speakers’ views are indeed noxious but say that students should be exposed to them to learn how to refute them. This is very different from encouraging a free exchange of ideas.
Although the new ideology began on campuses in the ’60s, it gained authority outside them largely by means of several majority decisions of the Supreme Court, from Roe (1973) to Obergefell (2015). The Supreme Court decisions that endanger free speech are based on a presumed consensus of enlightened opinion that certain rights favored by activists have the same legitimacy as rights explicitly guaranteed by the Constitution—or even more legitimacy, because the rights favored by activists are assumed to be so fundamental that they need no grounding in specific constitutional language. The Court majorities found restricting abortion rights or homosexual marriage, as large numbers of Americans wish to do, to be constitutionally equivalent to restricting black voting rights or interracial marriage. Any denial of such equivalence therefore opposes fundamental constitutional rights and can be considered hate speech, advocating psychological and possibly physical harm to groups like women seeking abortions or homosexuals seeking approval. Such speech may still be constitutionally protected, but acting upon it is not.
This ideology of forbidding allegedly offensive speech has spread to most of the Democratic Party and the progressive movement. Rather than seeing themselves as taking one side in a free debate, progressives increasingly argue (for example) that opposing abortion is offensive to women and supporting the police is offensive to blacks. Some politicians object so strongly to such speech that despite their interest in winning votes, they attack voters who disagree with them as racists or sexists. Expressing views that allegedly discriminate against women, blacks, homosexuals, and various other minorities can now be grounds for a lawsuit.
Speech that supposedly offends women or minorities has already cost some people their careers, their businesses, and their opportunities to deliver or hear speeches. Such intimidation is the intended result of an ideology that threatens free speech.
Warren Treadgold is a professor of history at Saint Louis University.
Matt Welch
Like a sullen zoo elephant rocking back and forth from leg to leg, there is an oversized paradox we’d prefer not to see standing smack in the sightlines of most of our policy debates. Day by day, even minute by minute, America simultaneously gets less free in the laboratory, but more free in the field. Individuals are constantly expanding the limits and applications of their own autonomy, even as government transcends prior restraints on how far it can reach into our intimate business.
So it is that the Internal Revenue Service can charge foreign banks with collecting taxes on U.S. citizens (thereby causing global financial institutions to shun many of the estimated 6 million-plus Americans who live abroad), even while blockchain virtuosos make illegal transactions wholly undetectable to authorities. It has never been easier for Americans to travel abroad, and it’s never been harder to enter the U.S. without showing passports, fingerprints, retinal scans, and even social-media passwords.
What’s true for banking and tourism is doubly true for free speech. Social media has given everyone not just a platform but a megaphone (as unreadable as our Facebook timelines have all become since last November). At the same time, the federal government during this unhappy 21st century has continuously ratcheted up prosecutorial pressure against leakers, whistleblowers, investigative reporters, and technology companies.
A hopeful bulwark against government encroachment unique to the free-speech field is the Supreme Court’s very strong First Amendment jurisprudence in the past decade or two. Donald Trump, like Hillary Clinton before him, may prattle on about locking up flag-burners, but Antonin Scalia and the rest of SCOTUS protected such expression back in 1990. Barack Obama and John McCain (and Hillary Clinton—she’s as bad as any recent national politician on free speech) may lament the Citizens United decision, but it’s now firmly legal to broadcast unfriendly documentaries about politicians without fear of punishment, no matter the electoral calendar.
But in this very strength lies what might be the First Amendment’s most worrying vulnerability. Barry Friedman, in his 2009 book The Will of the People, made the persuasive argument that the Supreme Court typically ratifies, post facto, shifts that have already occurred in public opinion. Today’s culture of free speech could be tomorrow’s legal framework. If so, we’re in trouble.
For evidence of free-speech slippage, just read around you. When both major-party presidential nominees react to terrorist attacks by calling to shut down corners of the Internet, and when their respective supporters are actually debating the propriety of sucker punching protesters they disagree with, it’s hard to escape the conclusion that our increasingly shrill partisan sorting is turning the very foundation of post-1800 global prosperity into just another club to be swung in our national street fight.
In the eternal cat-and-mouse game between private initiative and government control, the former is always advantaged by the latter’s fundamental incompetence. But what if the public willingly hands government the power to muzzle? It may take a counter-cultural reformation to protect this most noble of American experiments.
Matt Welch is the editor at large of Reason.
Adam J. White
Free speech is indeed under threat on our university campuses, but the threat did not begin there and it will not end there. Rather, the campus free-speech crisis is a particularly visible symptom of a much more fundamental crisis in American culture.
The problem is not that some students, teachers, and administrators reject traditional American values and institutions, or even that they are willing to menace or censor others who defend those values and institutions. Such critics have always existed, and they can be expected to use the tools and weapons at their disposal. The problem is that our country seems to produce too few students, teachers, and administrators who are willing or able to respond to them.
American families produce children who arrive on campus unprepared for, or uninterested in, defending our values and institutions. For our students who are focused primarily on their career prospects (if on anything at all), “[c]ollege is just one step on the continual stairway of advancement,” as David Brooks observed 16 years ago. “They’re not trying to buck the system; they’re trying to climb it, and they are streamlined for ascent. Hence they are not a disputatious group.”
Meanwhile, parents bear incomprehensible financial burdens to get their kids through college, without a clear sense of precisely what their kids will get out of these institutions in terms of character formation or civic virtue. With so much money at stake, few can afford for their kids to pursue more than career prospects.
Those problems are not created on campus, but they are exacerbated there, as too few college professors and administrators see their institutions as cultivators of American culture and republicanism. Confronted with activists’ rage, they offer no competing vision of higher education—let alone a compelling one.
Ironically, we might borrow a solution from the Left. Where progressives would leverage state power in service of their health-care agenda, we could do the same for education. State legislatures and governors, recognizing the present crisis, should begin to reform and renegotiate the fundamental nature of state universities. By making state universities more affordable, more productive, and more reflective of mainstream American values, they will attract students—and create incentives for competing private universities to follow suit.
Let’s hope they do it soon, for what’s at stake is much more than just free speech on campus, or even free speech writ large. In our time, as in Tocqueville’s, “the instruction of the people powerfully contributes to the support of a democratic republic,” especially “where instruction which awakens the understanding is not separated from moral education which amends the heart.” We need our colleges to cultivate—not cut down—civic virtue and our capacity for self-government. “Republican government presupposes the existence of these qualities in a higher degree than any other form,” Madison wrote in Federalist 55. If “there is not sufficient virtue among men for self-government,” then “nothing less than the chains of despotism” can restrain us “from destroying and devouring one another.”
Adam J. White is a research fellow at the Hoover Institution.
Cathy Young
A writer gets expelled from the World Science Fiction Convention for criticizing the sci-fi community’s preoccupation with racial and gender “inclusivity” while moderating a panel. An assault on free speech, or an exercise of free association? How about when students demand the disinvitation of a speaker—or disrupt the speech? When a critic of feminism gets banned from a social-media platform for unspecified “abuse”?
Such questions are at the heart of many recent free-speech controversies. There is no censorship by government; but how concerned should we be when private actors effectively suppress unpopular speech? Even in the freest society, some speech will—and should—be considered odious and banished to unsavory fringes. No one weeps for ostracized Holocaust deniers or pedophilia apologists.
But shunned speech needs to remain a narrow exception—or acceptable speech will inexorably shrink. As current Federal Communications Commission chairman Ajit Pai cautioned last year, First Amendment protections will be hollowed out unless undergirded by cultural values that support a free marketplace of ideas.
Sometimes, attacks on speech come from the right. In 2003, an Iraq War critic, reporter Chris Hedges, was silenced at Rockford College in Illinois by hecklers who unplugged the microphone and rushed the stage; some conservative pundits defended this as robust protest. Yet the current climate on the left—in universities, on social media, in “progressive” journalism, in intellectual circles—is particularly hostile to free expression. The identity-politics left, fixated on subtle oppressions embedded in everyday attitudes and language, sees speech-policing as the solution.
Is hostility to free-speech values on the rise? New York magazine columnist Jesse Singal argues that support for restrictions on public speech offensive to minorities has remained steady, and fairly high, since the 1970s. Perhaps. But the range of what qualifies as offensive—and which groups are to be shielded—has expanded dramatically. In our time, a leading liberal magazine, the New Republic, can defend calls to destroy a painting of lynching victim Emmett Till because the artist is white and guilty of “cultural appropriation,” and a feminist academic journal can be bullied into apologizing for an article on transgender issues that dares to mention “male genitalia.”
There is also a distinct trend of “bad” speech being squelched by coercion, not just disapproval. That includes the incidents at Middlebury College in Vermont and at Claremont McKenna in California, where mobs not only prevented conservative speakers—Charles Murray and Heather Mac Donald—from addressing audiences but physically threatened them as well. It also includes the use of civil-rights legislation to enforce goodthink in the workplace: Businesses may face stiff fines if they don’t force employees to call a “non-binary” co-worker by the singular “they,” even when talking among themselves.
These trends make a mockery of liberalism and enable the kind of backlash we have seen with Donald Trump’s election. But the backlash can bring its own brand of authoritarianism. It’s time to start rebuilding the culture of free speech across political divisions—a project that demands, above all, genuine openness and intellectual consistency. Otherwise it will remain, as the late, great Nat Hentoff put it, a call for “free speech for me, but not for thee.”
Cathy Young is a contributing editor at Reason.
Robert J. Zimmer
Free speech is not a natural feature of human society. Many people are comfortable with free expression for views they agree with but would withhold this privilege from those whose views they deem offensive. People justify such restrictions by various means: the appeal to moral certainty, political agendas, demand for change, opposing change, retaining power, resisting authority, or, more recently, not wanting to feel uncomfortable. Moral certainty about one’s views or a willingness to indulge one’s emotions makes it easy to assert that others are doing true damage or creating unacceptable offense simply by presenting a fundamentally different perspective.
The resulting challenges to free expression may come in the form of laws, threats, pressure (whether societal, group, or organizational), or self-censorship in the face of a prevailing consensus. Specific forms of challenge may be more or less pronounced as circumstances vary. But the widespread temptation to consider the silencing of “objectionable” viewpoints as acceptable implies that the challenge to free expression is always present.
The United States today is no exception. We benefit from the First Amendment, which asserts that the government shall make no law abridging the freedom of speech. However, fostering a society supporting free expression involves matters far beyond the law. The ongoing and increasing demonization of one group by another creates a political and social environment conducive to suppressing speech. Even violent acts opposing speech can become acceptable or encouraged. Such behavior is evident at both political rallies and university events. Our greatest current threat to free expression is the emergence of a national culture that accepts the legitimacy of suppression of speech deemed objectionable by a segment of the population.
University and college campuses present a particularly vivid instance of this cultural shift. There have been many well-publicized episodes of speakers being disinvited or prevented from speaking because of their views. However, the problem is much deeper, as there is significant self-censorship on many campuses. Both faculty and students sometimes find themselves silenced by social and institutional pressures to conform to “acceptable” views. Ironically, the very mission of universities and colleges to provide a powerful and deeply enriching education for their students demands that they embrace and protect free expression and open discourse. Failing to do so significantly diminishes the quality of the education they provide.
My own institution, the University of Chicago, through the words and actions of its faculty and leaders since its founding, has asserted the importance of free expression and its essential role in embracing intellectual challenge. We continue to do so today as articulated by the Chicago Principles, which strongly affirm that “the University’s fundamental commitment is to the principle that debate or deliberation may not be suppressed because the ideas put forth are thought by some or even by most members of the University community to be offensive, unwise, immoral, or wrong-headed.” It is only in such an environment that universities can fulfill their own highest aspirations and provide leadership by demonstrating the value of free speech within society more broadly. A number of universities have joined us in reinforcing these values. But it remains to be seen whether the faculty and leaders of many institutions will truly stand up for these values, and in doing so provide a model for society as a whole.
Robert J. Zimmer is the president of the University of Chicago.