For the past several decades, public and private institutions in the United States have operated under a system according to which designated minority groups receive special advantages in employment and education. Although much has been written about this highly controversial policy—the first articles in COMMENTARY dealing with it go back over 25 years—astonishingly, it has never been debated in the United States Congress, and only lately, as signs of public opposition have grown more intense and found expression both in the political arena and in the courts, has it become a subject of discussion at the national level.
In an effort to clarify the issues in a debate which, though it has a long history, has in some sense only now begun, we asked a number of prominent intellectuals to address the following questions:
- President Clinton has defined affirmative action as nothing more than a “hand up” to “people who have had a hard time.” Does this comport with your own understanding of the nature, the history, and the practice of affirmative action?
- How, at this late date, would you weigh the costs and the advantages of this system, both for its intended beneficiaries and for American society?
- Some defenders of affirmative action, while acknowledging that it has taken “bad” forms (that is, quotas), wish now to preserve the system by “mending” it. Others concede the damage it has caused in settings like universities and corporations—or even, more broadly, the harm that has been done to our self-understanding as one nation—but still uphold the desirability of continued racial preferences in police departments and other urban agencies. What is your opinion of these matters?
- Is affirmative action really on the way out? If so, what if any public policy should replace it?
The responses, twenty in all, are printed below in alphabetical order.
This symposium is sponsored by the Edwin Morris Gale Memorial Fund.
William J. Bennett
To begin at the beginning: what this President says, on race, affirmative action, or virtually anything else, means almost nothing. The relevant issue is not his benign words regarding a “hand up” to “people who have had a hard time,” but his policy’s malignant real-world effects.
As has been documented by many others, in COMMENTARY and elsewhere, affirmative action as originally conceived was an attempt to cast a wide net and create greater opportunity for blacks. We know (courtesy of William Safire’s New Political Dictionary) that the phrase, conceived during the Eisenhower administration, was first used officially in President Kennedy’s Executive Order 10925 of March 1961: “The contractor will take affirmative action to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, creed, color, or national origin” (emphasis added).
Three decades later, our affirmative-action regime is, in all important respects, a malignant mutation. It is the antithesis of the original civil-rights movement in reasoning, rhetoric, and appeal to moral principle. The practice of affirmative action has been transmogrified to the point that race-based discrimination has become the centerpiece of the liberal civil-rights agenda, so that today a Jesse Jackson can, with a straight face, make the bizarre claim that “to ignore race and sex is racist and sexist.”
President Clinton’s rhetoric may not be as ludicrous as Reverend Jackson’s, but he marches under the same counting-by-race banner. He never tires of reminding us of “gaps,” “gulfs,” and “disparities.” But rather than pursuing constructive policies that would help us to overcome our divisions, the President’s efforts amplify and celebrate them. The Clinton administration defends itself by defending “diversity,” a famously ambiguous term that politely suggests the futility of the project shared by the American founders and by Martin Luther King, Jr.: the project of e pluribus unum.
Subverting the ideal of “out of many, one” has wrought tremendous social damage. Numerical equality at the expense of moral equality. Balkanization. The erosion of our national self-understanding. The subordination of individual rights to group rights. The disfigurement of the concept of equal justice. Fanning the flames of racial resentment. And placing on many blacks what the author Shelby Steele calls “the stigma of questionable competence.”
Whatever its original intentions, and whatever benefits have accrued to those who have been on the receiving end of racial preferences (and there are surely some), it should be clear for all to see that affirmative action has created far more injustice than it has dispelled. The most blatant forms of injustice are seen in cases like that of Jennifer Gratz, a young white woman who was denied admission to the University of Michigan. This young woman, the daughter of a police officer, did everything expected of her: she graduated from high school with a 3.765 grade-point average; she finished near the top of her class; and she was involved in extracurricular activities like tutoring. Here was a clearly qualified individual who was nevertheless denied admission to a university—that is, to a place of presumably reasoned reflection—on one ground: she was of the wrong skin color. That is exactly what was done to black applicants at the University of Alabama in the 1950’s. Today, Hollywood cannot make enough movies condemning the racially segregated South of yesteryear, even as the liberal intelligentsia defends the racist policies of Ann Arbor or the Ivy League.
It seems to me that in our democracy, we should not leave seventeen-year-olds like Miss Gratz alone to face the formidable and intimidating force of a prestigious university. There is a civic obligation to share her burden. We should not ask her to plead for justice by herself. We should join her.
The President insists that we need a “national conversation” on race. In fact, one thing we do not need is more talk about race. Our problem is racial obsession. It is hard to think of any issue—education, economics, welfare, the criminal-justice system, celebrity trials, you name it—where the conversation is not eventually reduced to race, whether by Bill Clinton, Al Gore, their surrogates, the civil-rights establishment, or the media. To take but one recent example: a few months ago, in a much-publicized incident, the National Basketball Association all-star guard Latrell Sprewell, a black, physically assaulted his coach, P. J. Carlesimo, a white. In response to questions, everyone directly involved in the incident said that race was not an issue. But they could have saved their breath. Much, probably most, of the media coverage of the Sprewell assault eventually turned on race. Sports Illustrated, for example, ran a cover story, the headline of which included the sentence, “The Sprewell incident raises other issues that could pose threats to the NBA’s future, issues of power and money and—most dangerous of all—race.”
In so many areas of America’s social and cultural life, conservatives have been hoping merely to implement a strategy of containment—to limit the damage of contemporary liberalism’s long march through America’s institutions. But on the issue of racial preferences the tide is beginning to flow, swiftly and strongly, in the opposite direction, and all conservatives have to do is abet it.
We glimpse the drift of sentiment in recent court decisions, in public-opinion polls, and most especially in state-sponsored ballot initiatives. Last year I traveled numerous times to California to speak on behalf of Proposition 209. In recent months I went to Washington state to campaign on behalf of a Proposition 209-like initiative that has now qualified for the 1998 ballot. And I will travel to other states that are planning to take similar steps to overturn the current racial-preference regime. The states are where the real action is, where rollback is occurring. Surely this will matter more than the muddy national conversation on race the President is leading.
Why are race-based preferences beginning to be beaten back? Part of the answer, I think, can be found in a lesser-known passage in Martin Luther King’s Letter from Birmingham City Jail:
Like a boil that can never be cured so long as it is covered up but must be opened with all its ugliness to the natural medicines of air and light, injustice must likewise be exposed, with all of the tension its exposing creates, to the light of human conscience and the air of national opinion before it can be cured.
For many years now, the policy of racial preferences has perforce been carried out in secret—behind closed doors, away from air and light. But through (among other things) the Freedom of Information Act, we are learning just how prevalent, and decisive, race has been in university admissions, federal contracting, and all the rest. Sunlight is the best disinfectant, Justice Brandeis once said, and the defenders of racial preferences will not be able to withstand scrutiny, empirical evidence, the truth.
Unfortunately, among congressional Republicans in particular, there is still great reluctance to do away with the current system of racial preferences until we have adopted a “replacement strategy.” I am all for a replacement strategy, in general. It should consist of, among other things, a thoroughgoing reform of American education in the direction of parental choice, accountability, national standards, alternative certification, and a solid core curriculum; Giuliani-like efforts to reclaim control of our urban streets; economic and social policies that support two-parent families.
But why wait to act against blatant injustice? In The Morality of Consent, Alexander M. Bickel wrote these words:
The lesson of the great decisions of the Supreme Court and the lesson of contemporary history have been the same at least for a generation: discrimination on the basis of race is illegal, immoral, unconstitutional, inherently wrong, and destructive of democratic society.
So it is. And so should we move—without delay or apology, with purpose and conviction—to lift the burden of the current regime of racial preferences from the shoulders of every man and woman, young and old.
The GOP leadership insists that it cannot overturn this system until it has an adequate replacement strategy in place. It might consider this time-honored strategy: equal justice under the law.
It is time to end government-mandated affirmative action, not to try to mend it. There is simply no way for the state to be in the business of picking winners and losers on the basis of skin color or sex without doing great violence to the principle of equal protection of the laws. For nearly 30 years, the federal government has coerced both the public and the private sectors to prefer some groups over others, all in the name of ending discrimination. We now have literally hundreds of federal, state, and local laws, regulations, and programs that purport to promote nondiscrimination but instead give preference to some individuals over others in hiring, promotion, contracting, and admission to higher education based on the individual’s race, ethnicity, or sex. It is a system that cannot be reformed and ought simply to be abandoned.
What began as an effort to recruit and train minorities and women has been transmuted into an elaborate system of preferences, double standards, and quotas. Nowhere is this system more entrenched than in higher education. For the last two years, the Center for Equal Opportunity has been gathering data on admission standards at public colleges and universities by race and ethnicity. We have now analyzed and published those data for 22 schools in three states: California, Colorado, and Michigan. They reveal a pervasive pattern of preferences in which black and Hispanic students are routinely admitted into college with significantly lower grade-point averages (GPA’s) and test scores than whites or Asians.
At the University of California at Berkeley, for example, the median SAT score of incoming black freshmen in 1995 was 340 points lower (on a scale of 1600) than that of both whites and Asians. At the University of Michigan at Ann Arbor, the median SAT score for black enrollees in 1995 was 230 points lower than for whites. And at both schools, the median high-school grade-point average of blacks trailed by about half a point (on a scale of four).
While the disparities in test scores and grades of blacks and whites were greatest at the most elite schools we studied, they were present at every school. Hispanic students fared somewhat better than blacks at all schools, but they too had lower test scores and GPA’s than their white and Asian peers at all but a few institutions. It bears stressing here that for years, college administrators denied that black and Hispanic students were being admitted with lower grades and test scores than whites, and refused to make their records public. Indeed, we have been able to obtain admissions data only after filing requests under state freedom-of-information laws.
Proponents argue that if we eliminate affirmative-action programs at colleges and universities, American public higher education will become resegregated. That is nonsense. In fact, our studies show that if race-neutral standards were used, black and Hispanic enrollment would decline significantly only at the most competitive schools, but average black and Hispanic students would still find ample opportunity at good schools with less demanding entry criteria. Given the variety and range of colleges and universities available in most states, virtually any student, no matter how poor his grades or test scores, can find some school that will take him.
More importantly, however, our studies suggest that eliminating racial and ethnic preferences in admissions might actually improve black and Hispanic college-graduation rates. In Colorado, where we were able to obtain comprehensive data for all state schools, we found a high correlation between preferential admission standards and low graduation rates for black students and a moderate correlation in the case of Hispanic students.
If we do eliminate racial and ethnic preferences in college admission, however, we should not ignore the underlying problems that leave many more black and Hispanic students ill-prepared to compete at our best universities. Instead of insisting on racial and ethnic double standards in college admissions, civil-rights groups ought to be more focused on what goes on in elementary and secondary schools across the country. The quality of education in our inner cities is appalling and shows little sign of improving. Yet the same groups that advocate preferential admission policies at universities and colleges oppose all efforts to provide poor black and Hispanic students the opportunity to attend private or religious schools through vouchers or tax credits.
Eliminating government-mandated affirmative action will not be easy. Even the Republican majority in Congress shows no enthusiasm for doing so. But opinion polls suggest the American people want to abolish preferences now, and it is only a matter of time until the elected leadership of the country catches up to public opinion.
What is affirmative action?
Affirmative action has long had many meanings. The ambiguity, sometimes deliberate, has muddled the discussion of the central issue, which is this: is there any justification for preference by race?
The Civil Rights Act of 1964 authorized courts to take “affirmative action” to uproot racially discriminatory practices. That objective was, and remains, morally right. But by that same statute, race preference was forbidden. Affirmative action and race preference are thus plainly distinguishable; the former (in its original sense) is right and lawful, the latter is neither.
Preference and affirmative action are widely confounded in the public mind because race preferences were introduced (beginning about 1970) in the honorable name of affirmative action. What was to have been eliminated was given, in a complete inversion, the name of what had been conceived to eliminate it. Most folks today, with unintended irony, mean by affirmative action that very preference by skin color that affirmative action was devised to eradicate. The result is doubly unfortunate: immoral practices fly the flag of justice; and policies that deserve support are tainted by association with what everyone sees intuitively to be unfair. Henceforth let us be clear: it is race preference (by whatever name it is called) that is to be condemned.
Race preference is not justifiable, in morals or in law. Compensation to individuals who have been damaged is sometimes a demand of justice, but that is redress for injury, not entitlement by color. Skin-color groups cannot be entitled to redress, because rights are possessed by individuals, not groups. Race preferences are never just because they inescapably reward some who deserve no reward, and penalize some who deserve no penalty. Race preferences are never benign because when goods are in short supply, to give to some by race is to take from others by race. The University of Texas, claiming to compensate minorities for wrongs earlier done to them by Texas schools, gave preference to affluent applicants who had never lived in Texas, and to foreigners who had never lived in the United States, because their skins were dark. For the race preferences commonly given today, the retrospective justification grounded in alleged compensation is almost invariably a fraud.
The prospective justification of race preference, based on an idealized redistribution of goods by race, is wholly without merit. It supposes that, absent racial oppression, attainments and advantages would be homogeneously distributed among all persons and all ethnic groups, a notion that is wildly false. On this view, the test for social justice is numerical proportionality among racial groups, impossible to sustain without perpetual social engineering. The ideal is profoundly wrong-headed, but it is also internally incoherent because ethnic groups cannot be sorted so as to give each a proportional share of social goods; there are too many ways to cut the pie.
The demand for racial balance imposes ugly costs. It entails some formal determination of the racial category to which each individual is assigned, and some formal division of the spoils by group. It makes racial quarrels inevitable, and rules (one drop of blood?) for resolving disputes over ethnic membership essential. Claims are made by race, arguments by blood. No redistributive system could be more unwholesome.
Who reaps the benefits and who bears the burdens of race preference?
The beneficiaries of race preference are a few members of the preferred groups, and the newly emerged corps of administrators whose livelihood is derived from the oversight and enforcement of preferences. The vast majority of the members of the minority groups in question—in whose interests preferences had purportedly been designed—receive no benefits whatsoever.
The burdens of preference, on the other hand, are borne by four large groups, for each of whom the costs are very great.
The cruelest and most damaging burdens are those imposed upon the members of the preferred minority group as a whole, who are inescapably undermined by racial preferences. When persons are appointed, or admitted, or promoted because of their racial group, it is inevitable that the members of that group will, in the institution giving such preference, perform less well on average. Membership in the minority group most certainly does not imply inferiority; that is a canard—but that stereotype is reinforced by preferences. Since the standards for the selection of minorities are inevitably lower when diluted by considerations of color, sex, or nationality, it is a certainty that, overall, the average performance of those in the preferred group will be weaker—not because of their ethnicity, of course, but because many among them were selected on grounds having no bearing on the work or study to be pursued. Preference thus creates a link between the minority preferred and inferior performance.
This burden is borne not only by those individuals preferred, but by every member of the minority group, including the many among them who genuinely excel. The general knowledge that persons with black or brown skins are given preference ensures lower expectations from all whose skins are of those colors. Every minority member is made suspect. No one (including the minorities themselves, of course) can know for sure that any given member of a preferred group has not been awarded special favor. Skin color, the most prominent of personal characteristics, is thus transformed by preference into permanent and public onus. If some demon had sought to concoct a scheme aimed at undermining the credentials of minority businessmen, professionals, and students, to stigmatize them permanently and to humiliate them publicly, there could have been no more ingenious plan devised than the preferences now so widely given in the name of affirmative action.
Unfair burdens are also imposed upon deserving white applicants and employees who, because of their skin color, do not win the places that would otherwise have been theirs. One often hears the claim that the burdens of preference are insignificant because they are widely shared by very many among the white majority. That is false; most among the majority bear no fraction of the burden. Those who do bear it are a small subset whose members are rarely identifiable by name. If a university gives admission preference to blacks, for example, some whites, who would have been admitted but for that favoritism, will not be admitted. The unfairness to those unidentifiable individuals who lose out because of their race is not reduced because we cannot learn their names. And every applicant with a pale skin who was not admitted or appointed may rightly wonder whether he was the one from whom the penalty had been exacted.
Institutions that give preference pay a heavy price as well. Inferior performance results in many inefficiencies and hidden costs. In academic institutions intellectual standards are lowered, explicitly or in secret; student performance is unavoidably lower, on average, than it would have been without the preferences, as are faculty productivity and satisfaction. The political need to profess equal treatment for all, while knowingly treating applicants and faculty members unequally because of their race, produces pervasive hypocrisy. Even great public institutions hide their policies, describe them deceptively, and sometimes lie about them. Part of the price of race preference is the loss of institutional integrity and public respect.
Finally, society at large suffers from the distrust and hostility that race preference engenders. Members of ethnic groups tussling for a larger slice of the preferential pie come to resent and despise their opposite numbers in competing minorities who always seem to get more than their “share” of the spoils. In schools, in playgrounds and parks, in commerce and sports, in industrial employment, even in legislatures and courts, the outcome is exacerbated tension, increasing self-segregation. More and more we come even to abandon the ideal of an America in which persons and not groups are the focus of penalty and reward. Preference ostensibly given to overcome the legacy of racism takes the form of racism, nurtures racism, embitters the national community, and infects every facet of public life with racial criteria whose counterproductivity is matched only by their immorality.
Is race preference justifiable in some contexts?
In the 1954 case of Brown v. Board of Education, Thurgood Marshall submitted the brief for the Legal Defense Fund of the NAACP, of which he was then executive director. He wrote there:
Distinctions by race are so evil, so arbitrary and invidious that a state bound to defend the equal protection of the laws must not invoke them in any public sphere.
I cheered when I read that in 1954, as I cheer again today. The truth of this principle does not change with the times. Let us respond justly and compassionately to injury, giving remedy where remedy is due, and credit where credit is due, without regard to race. But if we are ever to heal our racial wounds, it will be through a national determination, morally resolute and backed by law where that is appropriate, never again to give preference by race or color or sex.
The long-term success of a democratic polity requires a deep and widespread commitment to the principle that the laws protect all equally. In his controlling Bakke decision in 1978, Justice Lewis Powell wrote:
The guarantee of equal protection cannot mean one thing when applied to one individual and something else when applied to a person of another color. If both are not accorded the same protection, then it is not equal.
Is this so difficult to understand? We begin to transcend racism when we stop the practice of every form of it, by every public body, now.
Is race preference on the way out?
Citizens of the United States, black and white, in preponderant majority find skin-color preference morally objectionable. Ours is a reasonably healthy democracy. I conclude that our bodies politic will not tolerate public discrimination much longer. By court orders, by legislative acts, or by voter demand, race preference will go. It needs to be replaced by its absence.
No one has better illustrated the cost of affirmative action, to society as a whole and to its intended beneficiaries, than President Clinton himself. He did this when in the course of a meeting of his task force on the problem, he challenged Abigail Thernstrom to tell him how she could be opposed to a system that had produced Colin Powell.
How Colin Powell felt about being used as an example of the success of such measures as preferential quotas, lowered standards of admission, and race-norming—the leading techniques of affirmative action for blacks—he did not share with the public. But may I be permitted the presumption of saying that I can guess his reaction? Deeply insulted and furious would be my bet, as any high-achieving black must be at the possibility that his standing in the world is being chalked up to his having been given special consideration for the mere color of his skin. And this was Colin Powell the President was talking about, for God’s sake, someone who has on various occasions, and by people of various political stripes, been publicly urged to run for the presidency of the United States.
Well, perhaps Powell has his own sources of inner protection from humiliation. But imagine the experience of a distinguished black professor, say, who may with good reason suspect that he is regarded by the liberal racists in charge of virtually all university administrations as nothing more than a statistic to pride themselves on—indistinguishable, from their point of view, from any of the black charlatans trading on race who may have been hired to teach alongside him.
Those most entitled to sympathy, however, are not so much the truly distinguished individuals—although most of them at one time or another have probably been treated to the same kind of insult as Colin Powell was by the President—but the perfectly OK black graduates of law or medical or business or other professional schools. Thanks to the public atmosphere created by affirmative action, they will have to work night and day trying to be better than their white colleagues in order to overcome widely held (though hotly denied) suspicions about their qualification to practice their professions.
As for the black students who end up in schools they are not qualified to attend, all the race-norming in the world will be of no help to them. Why have blacks in universities introduced their own system of Jim Crow, with separate dining tables, separate classes, and, where they have succeeded in imposing their demands, separate dorms? However militantly they may talk, what they are doing is huddling against failure, and against the exposure of that failure. Anyone capable of reminding himself that these are just young kids afraid they may have found themselves somewhere they do not belong is also capable of understanding what their self-segregation is really about.
The authors of the system that has brought these kids to such a pass should be required by law to remove their political blinders and take a long, long human look at what they have wrought. If they ever did, their hearts might break. Instead, they merely look away and pat themselves on the back for being part of the solution. The solution to what? To the problem of how to operate a sustainably racist way of appearing to improve life for people who have had their faith in themselves kicked out of them.
As for what affirmative action has done to society in general, again the President has offered a pretty good example (though, to be sure, there are many better) of the kind of all-around cynicism it has bred. His appointment of a task force for the alleged purpose of studying this problem is itself an act of cynicism, part of a larger manipulative invitation to the nation to hold a “dialogue on race”—as if the nation had been discussing much else for the past four decades. Nor do I think it an exaggeration to say that his all-too-thoughtless allusion to Colin Powell grew out of a whole generation’s cynical bent of mind.
But, insist those with the bent of mind I am referring to, this nation has a problem with the maldistribution of achievement and award. Very well, then, let us make everybody happy and simply concede that there is no such thing as true achievement. From which there follow not only meretricious diplomas and unearned government contracts but, among other things, phony appraisals of art and literature; laughable prize-givings; a much-celebrated philosophy that says anything can mean anything or nothing, what difference does it make?; a growing attitude that hard work or special talent counts for nothing, while networking (or even just race) counts for everything; and the gerrymandering of voting districts in the name of the higher democracy.
Can anyone doubt that the unearned sophistication and world-weariness of the young people known as Generation X have arisen from their having picked up the message from the air around them that achievement does not count—that, in short, everything is a racket? And where does anyone suppose that message came from?
How we get out of this I do not know. But until we do, a heavy spiritual cloud will continue to darken our days, rich and poor, men and women, and blacks and whites together.
Affirmative action can mean giving a “hand up” to “people who have had a hard time.” But this hardly suffices as an accurate description, and it obscures why the policy has proved so controversial.
When it was begun in the early 60’s, affirmative action was an attempt to remedy the ill effects of past discrimination against blacks. In its earliest forms it emphasized the elimination of blatant barriers to employment as well as outreach, recruitment, and job training. But very quickly, new forms of affirmative action developed that favored blacks over equally- or (in the more common scenario) better-qualified whites. These racially-preferential policies were adopted not only in the workplace but also in higher education (both in admissions and in faculty and staff hiring) and, eventually, in government contracting. And often the preferential treatment was numerically articulated in percentage goals and even quotas.
By the early 70’s, blacks had ceased to be the only affirmative-action group as Hispanics, Asian-Americans, and American Indians (often lumped together with Eskimos and Aleuts as “Native Americans”) came, usually for political reasons, to be included. But there were obvious differences both among and within the groups. Blacks had endured worse and more enduring discrimination than Hispanics, for example, who were anyway not a racial group but an ethnic one. And, within the Hispanic category, Cubans had not experienced the harsh prejudice often faced by Mexican-Americans in the Southwest. Yet as a result of immigration triggered by reform legislation passed in 1965, the numbers of Hispanics—and of Asian-Americans—were beginning a rapid growth.
Clearly, the remedial rationale advanced in behalf of blacks was a weak frame for the expanding affirmative-action edifice. Other rationales were therefore introduced and since then have been widely utilized. The chief one is promoting “diversity.”
Affirmative action on this ground is different from the remedial variety. The latter implicitly envisions an eventual end to the program—once the medicine does its job, it is presumably discontinued. By contrast, affirmative action undertaken to achieve diversity must be never-ending, since the racial and ethnic mix of our society is constantly changing. Ironically, this form of affirmative action in higher education often discriminates against Asian-Americans, who long ago ceased to be included in admissions programs. It even threatens to discriminate against blacks, the original affirmative-action group. In 1994, President Clinton, commenting on the Justice Department’s position in the notorious Piscataway case, observed that it would be fine if a school board that found itself with a faculty containing “too many” blacks and “too few” whites “decided to keep the white teacher . . . to preserve racial diversity.” Clearly, we are here some distance removed from the idea of offering a “hand up” to “people who have had a hard time.”
In the 60’s and 70’s, preferential affirmative action, ordered by the courts in remedial contexts, probably did break down racist barriers to employment, particularly in the South. Such affirmative action may also have contributed to the growth of the black middle class, though the extent of the contribution and even whether it was necessary are debatable. But preferential policies have done nothing for the two to three million members of the so-called black underclass who lack the threshold skills even to be considered for jobs governed by affirmative action. And one should note this paradox: though preferential affirmative action no doubt has sometimes “worked” in the sense of putting its “beneficiaries” in places they otherwise could not have reached, it truly works only if a beneficiary eventually graduates from a particular program and labors under the same rules as everyone else.
The main costs of preferential affirmative action—so considerable in my view that they counsel in favor of ending it—are twofold. First, the policy is discriminatory: whoever would have been admitted to a school or won a job or a contract or been promoted, except for considerations of race or ethnicity, has suffered discrimination. Justice cannot abide a racial double standard.
Second, the policy has contributed substantially to the salience of race and ethnicity in American life, making it that much harder to overcome the very tendency the civil-rights movement once so rightly condemned—that of regarding and judging people in terms of their race and ethnicity. The fact is that wherever affirmative action governs, people (lacking particular knowledge about individuals) make generalizations about minority achievement. The main generalization is that “minority” plus “affirmative action” equals “lower standards,” and the inference drawn is that whatever success is achieved by members of a minority is traceable to the workings of that equation. Tragically, this inference dogs even those who make their way without the help of any preferential treatment.
The Clinton administration’s effort to “mend” federal affirmative-action programs has not entailed the replacement of their preferential elements with nonpreferential ones. Instead, it has meant tinkering around the edges of preferences and in some instances extending them. Moreover, it has meant defending even the least defensible pro-preference positions—such as that California’s Proposition 209 was unconstitutional—and aggressively arguing for a radical interpretation of federal civil-rights law so as to recognize diversity-based affirmative action. Rarely has the Clinton administration seen a preference it could not defend in court, though seldom, and never in the Supreme Court, have its views prevailed.
Even as Clinton has preserved and expanded preferences, he has sometimes sought to persuade the public otherwise. “I’ve done more,” he said during the San Diego presidential debate in 1996, “to eliminate programs—affirmative-action programs—I didn’t think were fair and to tighten others up than my predecessors have since affirmative action’s been around.” Actually, however, Clinton has eliminated only one preferential program—a decision forced on him by litigation.
Should we abandon preferences almost everywhere but in police departments serving racially mixed communities? The usual argument for this exception is that to perform effectively in such communities, the police must be not only white but also, in substantial numbers, black or brown or both, and that when the use of race-neutral employment criteria does not result in the necessary outcomes by race and ethnicity, preferences must be introduced.
The “operational-needs” rationale is, however, a dangerous one, and not simply because it aims to justify discrimination. Rejecting this rationale in a 1993 case, a federal court noted that it had once been used to confine blacks to certain jobs, and expressed the fear that “others could use this same rationale for a much less benign purpose.” The court added: “Such a result could promote racial polarization and the stereotypical view that only members of the same race can police themselves.”
To many, the argument for preferences in the police context seems benign (though clearly it is not benign toward those who are discriminated against). And certainly it aims to produce laudable ends. Moreover, it is possible to envision other contexts in which well-intentioned people could find apparently good reasons for drawing racial distinctions and even utilizing preferences.
Nevertheless, from the time of the abolitionists to the present day, the argument for colorblind law—law that does not countenance racial distinctions or discrimination of any kind—has always been that, precisely because race has proved so difficult and dangerous, it should be made off-limits, and that an immigrant nation committed by its founding charter to equal rights for all would ultimately be better served by such a rule. One might debate whether color-blind law should apply to the public sector only or to substantial parts of the private sector as well. But in my view there can be no serious quibble about its general advisability. Three-and-a-half decades ago, that was precisely the kind of law the civil-rights movement pushed for.
Despite the exertions against color-blind law by the Clinton administration, we may recover it yet. Supreme Court decisions in 1989 and 1995 renewed the tradition of such law, making it very difficult for governments to employ racial preferences. Whether this trend continues will turn on the composition of the Court, for it could backslide if even one of the jurists currently skeptical toward preferences is replaced by a Clinton appointee.
Meanwhile, despite public opinion still emphatically opposed to preferences, the Republican Congress is reluctant merely to enact a law imposing the color-blind rule upon the federal government, much less to pass a law making clear that the Civil Rights Act of 1964, which reaches into the private sector, means today what it originally meant: absolutely no discrimination on the basis of race.
The issue of preferences could emerge in the presidential race in 2000, but this was predicted the last time around, and it did not happen. Given the vagaries of the political arena, those who oppose preferences are probably doing as I am—holding my breath in hopes that no Justice leaves the Court before 2001. Where race is concerned, the absence of regress by the Court must be regarded as progress.
Before avoiding the editors’ carefully formulated questions, I should like to indulge two mildly digressive impulses. The first is to salute the unknown genius—I assume he is unknown, I know he is a genius—who coined the phrase “affirmative action.” It first appeared, I am told, in a 1961 executive order by President John F. Kennedy (ask not what you can do to obfuscate the language), and in my opinion it is a honey. As empty public-relations euphemisms go, it has everything: alliteration, a nice lilt, memorability, and just about as much falsity as one could hope to pack into seventeen letters of the English alphabet. It is up there with “pacification program” and “meaningful relationship” and “student unrest.” So here’s to you, fella, whoever and wherever you might be.
My second impulse is to wonder what, if affirmative action were to be ended, would happen to the numerous administrators—in government, corporations, and educational institutions—whose job it now is to see that the policy is enforced, to make “diversity” the norm, and to give instruction in ethnic sensitivity. The lot of these people, it occurs to me, would be not dissimilar to that of the 150,000-odd men and women who taught Marxism-Leninism in the Soviet Union before Communism collapsed. One could only hope they would not find enjoyable new work too quickly.
President Clinton will go down in history as the President of good intentions and no principles whatsoever; and the latter, as even liberals have learned to their chagrin, is hell on the former. I think he may be perfectly in earnest when he defines affirmative action as giving a “hand up” to “people who have had a hard time.” Not being a theologian or an engineer, I do not know for sure whether the road to hell is paved with good intentions, but the good intention that began life as affirmative action has worked for the most part to bring about slipping standards, mediocrity, cynicism, and resentment.
I know affirmative action chiefly in its academic context, in my case as a member of a university English department. Here it has tended to skew normal meritocratic procedures of hiring and job-getting. I have seen my own university—Northwestern—go to extraordinary lengths to hire young minority-group teachers (chiefly African-American) over young white males, providing them housing not generally available even to senior faculty and offering them salaries above the normal range in hopes of keeping those whom it hires. Usually, by the way, the ploy has not worked; if a better offer is made—and, in a slightly perverse twist on the free market, it always is—they are gone.
I cannot gauge how cynical a young African-American academic becomes when he knows that interest in him is not principally for his qualities as a scholar or a teacher but purely for his race. If it were I, I should be very cynical indeed, and make the bastards pay everything I could. But the cynicism of young African-Americans must be as nothing compared with the young white males who know that, for the sake of a “level playing field” (that charmless and misguided metaphor), they and their intellectual ambitions have been put permanently on the sidelines. (White men, it turns out, not only can’t jump; they can’t teach, either.) What a discouragement of scholarly passion, what a denial of merit, what a shuck and a sham, a hustle and a scam, affirmative action must seem to them!
Then there are the effects of affirmative action on undergraduates. I teach courses in Henry James, Joseph Conrad, and Willa Cather as well as in advanced prose composition and creative writing. Not all that many African-American students enroll in them, but when they do, I, like everyone else in the university, find myself pulling for them—wanting them to do well. Some have; but more, especially those who understand their wildly privileged position, know that for them the pressure to perform is not quite what it is for other students.
An attractive young man from the South who turned up in a composition course of mine handed in work at a consistent B level. I pushed him to improve himself in small ways: to be more meticulous and self-critical. But it was no-go—or at any rate, the distance was rather farther than he wished to travel. A year later, we met in my office and he told me he had been applying to graduate business schools, and with great success. His situation was rather like that of the conductor Herbert von Karajan, who, when asked by a cab driver in Paris where he wanted to go, is said to have replied: “It doesn’t matter—they want me everywhere.” And so did they want this student; clearly he did not need lessons from me in getting ahead in the world.
I recall, too, a young woman who showed up in my course on the writing of short stories. Short, her hair in dreadlocks, she wrote brilliantly, though much of what she turned in was obviously influenced by the prose of Zora Neale Hurston. But the larger problem was that she soon ceased attending class regularly, appearing at perhaps one out of every four sessions and handing in her stories well beyond deadline. I called her into my office for a little sermonette; when I finished, I asked her about herself. It turned out she was originally from Tennessee, but had gone to the same private preparatory school that Jackie Kennedy once attended. In brief, she had been given an affirmative-action ride for many years now. After our meeting she went on missing classes. I probably should have failed her. Instead, I gave her a C, which, in the grade-inflated environment of our day, is an act tantamount to intellectual harassment. But not to worry: I am certain my C did not stop her from getting into Harvard, Yale, Stanford, or any other graduate school she might have wished to enter.
On a larger scale, I cannot help noting the obvious sociological fact of contemporary campus life: a nearly complete, wholly voluntary, racial segregation, to a degree quite at odds with almost any other setting one can think of in the United States. Why should this be so, in the one institution where “diversity” has been more carefully cultivated than anywhere else? The fact is that both voluntary segregation and political correctness are powerful side-effects of the social drug called affirmative action, and it may be time to ask if the cure is worth the price—particularly since the drug itself seems thus far to have shown an alarmingly low rate of efficacy.
Affirmative action has revealed that human beings are perfectly capable of behaving quite as deviously—and maybe more so—when they think they are doing good as when they are out to do evil. To achieve the particular presumed good called affirmative action, we now know that much outright cheating has been going on. At the University of Texas law school, “radically different admissions standards” (in the words of an internal law-school memorandum) have been applied to enlarge minority enrollment. In a recent article in the Public Interest, David P. Bryden cites another law school that, implicitly deprecating grades and test results, has decided to assign an overall value of 40 percent to an applicant’s “interpersonal skills, career involvement, extra activities, career focus, motivation, and problem-solving skills.” The slipshod language says everything.
Such sophism—and we have been learning that it is widespread—suggests that affirmative action’s day is far from done. My hesitant prophecy is that we shall struggle along for several years with various compromises and never find the correct one. The reason for this is that we live in an intractably unjust world, and all artificial attempts to make it just seem chiefly to have the effect of creating new, subtler, but often no less real injustices—sometimes even, sadly, for those they set out to help.
Affirmative action has in part been a “hand up” to “people who have had a hard time,” but its original design was so faulty that it soon became much more than that.
The categories placed under the special terms of affirmative action—the four minority groups of African-Americans, Hispanics, Asians, and American Indians (“Native Americans”), plus women—are so widely various in their histories, in the prejudice and indignities they have faced, in the obstacles they meet today, and in their claims to special treatment, that inevitably a great number of the policy’s beneficiaries have not had a hard time, and have not needed a hand up. Two of these groups—Hispanics and Asians—are composed in large measure of immigrants and the children of immigrants who have come here since affirmative action began, and some are the beneficiaries of special programs for refugees. Have they needed the additional hand up? Asian immigrants in general so clearly did not need it that fifteen years ago colleges stopped recognizing their claim to beneficiary status and began discriminating against them—something that under the Civil Rights Act they were not allowed to do.
Did women need a hand up, beyond a general protection against discrimination? In some cases, perhaps. But in academia the preference for women, now abetted by the women’s movement, has become excessive, and benefits a class that on the whole meets no discrimination.
Even the category for which affirmative action was originally instituted, African-Americans, includes many today from middle-class backgrounds who have not faced any noteworthy discrimination in academia for decades. Unfortunately, because affirmative action covers so many areas—jobs at all levels, contracts with federal and other government agencies, admission to colleges, universities, and professional schools—we do not know in detail just who has benefited and on what grounds. How many of the black students who have come under the aegis of affirmative action, for example, are from well-to-do homes in which they have received the standard array of middle-class assistance? Many? Few? Very few? We do not know.
We have a similar problem in weighing the general social costs and advantages of affirmative action.
A great black middle class has been created. But is it because of affirmative action? Stephan and Abigail Thernstrom, following on the research of Finis Welch and others, argue that the economic improvement in the black situation preceded affirmative action and would have continued without it. This seems reasonable to me. Although affirmative action has probably contributed something, one can hardly doubt that a good part of the movement by blacks into public jobs of all sorts would have occurred independently, as discrimination declined and as blacks became dominant, demographically and then politically, in so many cities.
On the cost side, has the level of public services declined as a result of the weakening of civil-service tests on which black applicants scored poorly and which were made less difficult because of the pressure of affirmative-action goals? The decline has perhaps been clearest in central-city teaching, where the number of black and Hispanic teachers and administrators has increased. Yet this change was inevitable as black and Hispanic students came to dominate central-city school systems, and as whites who might once have gone into teaching found more attractive employment opportunities elsewhere. It is hard to believe that the situation in New York and other big cities, where whites held almost all the administrative positions in school systems with a majority of black and Hispanic students, could have endured.
Perhaps the greatest cost has been exacted by the requirements for affirmative-action plans, with their elaborate record-keeping of recruitment, employment, and promotion, and the openings thus made possible for litigation. Affirmative action has undoubtedly burdened every employer, public and private. Some of the costs of this burden have been tallied, and they seem to me to exceed the benefit, however we calculate it.
Still another cost—the decline in the reputation of blacks for competence among nonblack citizens—cannot be easily quantified. But would this reputation have improved in the absence of affirmative action, and in the face of evidence that disproportionately fewer blacks were qualifying for jobs through traditional means? One cannot be sure.
This brings me to the question of “mending” or “ending” affirmative action. Certain forms seem to me necessary. We do need black members of police departments if we are to police black areas effectively; one recalls that there was a time when white firemen were harassed when they entered black residential areas. Similarly, we do need black teachers and administrators in our schools. Whatever the abstract force of the argument that the quality of the teacher is more important than his race or ethnicity, the degree of racial self-consciousness among African-Americans is so strong that, except under special circumstances, an educational system that does not take account of race and ethnicity will not succeed.
This position can be disputed by the experience of inner-city Catholic schools, which teach black children effectively even though, as I believe, their staffs include few blacks. But the Catholic schools, being both private and religious, do not face, and do not have to respond to, demands for black teachers and administrators. The public schools are another matter entirely. (In any case, wages are so poor in Catholic schools that they cannot compete with the public schools for staff.)
Brief as it has been, this discussion of just one area suggests how complicated it is to decide how and where to “mend” affirmative action. To me, the least satisfactory area is contracting. One does not create an entrepreneurial class by granting contracts to those who cannot otherwise compete; on the contrary, one invites fraud and corruption. I would therefore be happy to see affirmative-action contracting fully eliminated. Nevertheless, the political realities are such that most cities and other public bodies now granting these preferences fight to maintain them, even in the face of Supreme Court proscriptions. It has become the current and accepted form of distributing political pork.
One of the most disputed areas of affirmative action, and also the most challenged, involves preferences in admission to selective colleges, universities, and professional schools. Thanks to widespread public belief in the legitimacy of granting admission according to academic qualifications, and the easily-obtainable facts of racial preference, we have seen such preference outlawed in the public higher-education systems of California (by action of the board of regents and by popular vote) and Texas (by action of a federal circuit court). Ironically, however, this form of affirmative action has on the whole been voluntarily adopted by the institutions involved, though of course not without pressure from militants, and also has its roots in at least one tradition of American higher education, which (unlike the European model) has never stressed the exclusive role of academic qualifications in admissions.
To my mind there is a good argument for maintaining this racial preference, or some part of it: because of its voluntarism, because of the diversity of the American system of higher education, and because the presence of blacks does change and on the whole benefit the education of all. But my principal reason is that our colleges, universities, and professional schools are the central gateways to positions of power, wealth, and influence, and applying strict meritocratic principles would lead to a catastrophic drop in the number of black students in these crucial institutions. That would deliver a terrible message to blacks, and would be bad for the country.
Is affirmative action on the way out? As we know, Congress has simply stayed away from the issue—it is too dangerous. It adopts occasional preferences itself (as in subcontracting for highway construction), with no extended discussion. The President will also do nothing. Indeed, even Presidents Reagan and Bush, both of whom opposed affirmative action, did nothing to weaken its central trunk—the executive order bringing the policy into being—which they could have modified or eliminated at the stroke of a pen. Why should anyone expect more from Clinton? Nothing has been mended yet.
If neither legislature nor executive will act, there remain the Supreme Court and the people. With regard to the Court, the most likely possibility is that, even if it has not yet done so, it will follow what seem to be the clear mandates of the Fourteenth Amendment and the Civil Rights Act to ban racial preference, as the circuit court did in the Texas case. I believe that would be going too far—something the Court has, however, often done. It is odd that conservatives, who have for so long criticized the Court for its sweeping judgments on abortion, school prayer, or flag burning, now see it as the means of eliminating affirmative action.
As for the people, they remain mixed in their views, as we see in, on the one hand, the California vote in favor of Proposition 209 and, on the other hand, the more recent defeat of a similar—though differently phrased—initiative in Houston. In other words, they oppose racial preferences, but may be persuaded to support some undefined form of affirmative action.
I would like colleges and universities, and employers and public contractors, to be freed to do what they feel impelled to do, on the basis of their ideology and their sentiments, as affected by political and market forces that shape them. That would be for the best, and much better than either a universal standard that says there must be preferences—we have lived under something like this for 25 years—or a universal standard that says there must not be preferences. With the free and voluntary action of autonomous institutions, I think we can move ahead, slowly, in dealing with our race problem.
Lino A. Graglia
Insofar as it is controversial, affirmative action is a euphemism for discrimination: the granting of preference to some individuals and therefore the disfavoring of others on the basis of their race. Suggested definitions that fail to recognize this seek to evade rather than confront the only point really in contention: how is it possible to justify an official policy of classifying people by race for differential treatment?
The overwhelming objection to any race-based policy is simply that it makes one’s assigned membership in a racial group, not one’s individuality, the basis of governmental treatment. It leads to—indeed, it virtually compels—the organizing of racial blocs in legislatures and elsewhere in order to contend for group advantage and defend against disadvantage. It is a prescription for racial consciousness and conflict inconsistent with the maintenance of a viable multiracial society; it means abandoning hope for an integrated society and accepting the inevitability of separatism.
For most people, it is simply morally wrong for government to treat people on the basis of race. Powerful arguments should be required to overcome this obstacle, and yet the arguments offered for racial preferences are surprisingly weak. The primary argument—that preferences compensate for past unjust disadvantage—is patently invalid and obviously uncandid. It is not po
1 Later, feminists and other ethnic groups got into the act, but it would take more space than I have available to do justice to the additional complications this has involved. Hence I will confine myself here to affirmative action for blacks, for whom it was designed anyway.
2 “Why Racial Preference Is Illegal and Immoral,” COMMENTARY, June 1979.
Is Affirmative Action on the Way Out? Should It Be?
Must-Reads from Magazine
Terror is a choice.
Ari Fuld described himself on Twitter as a marketer and social media consultant “when not defending Israel by exposing the lies and strengthening the truth.” On Sunday, a Palestinian terrorist stabbed Fuld at a shopping mall in Gush Etzion, a settlement south of Jerusalem. The Queens-born father of four died from his wounds, but not before he chased down his assailant and neutralized the threat to other civilians. Fuld thus gave the full measure of devotion to the Jewish people he loved. He was 45.
The episode is a grim reminder of the wisdom and essential justice of the Trump administration’s tough stance on the Palestinians.
Start with the Taylor Force Act. The act, named for another U.S. citizen felled by Palestinian terror, stanched the flow of American taxpayer fund to the Palestinian Authority’s civilian programs. Though it is small consolation to Fuld’s family, Americans can breathe a sigh of relief that they are no longer underwriting the PA slush fund used to pay stipends to the family members of dead, imprisoned, or injured terrorists, like the one who murdered Ari Fuld.
No principle of justice or sound statesmanship requires Washington to spend $200 million—the amount of PA aid funding slashed by the Trump administration last month—on an agency that financially induces the Palestinian people to commit acts of terror. The PA’s terrorism-incentive budget—“pay-to-slay,” as Douglas Feith called it—ranges from $50 million to $350 million annually. Footing even a fraction of that bill is tantamount to the American government subsidizing terrorism against its citizens.
If we don’t pay the Palestinians, the main line of reasoning runs, frustration will lead them to commit still more and bloodier acts of terror. But U.S. assistance to the PA dates to the PA’s founding in the Oslo Accords, and Palestinian terrorists have shed American and Israeli blood through all the years since then. What does it say about Palestinian leaders that they would unleash more terror unless we cross their palms with silver?
President Trump likewise deserves praise for booting Palestinian diplomats from U.S. soil. This past weekend, the State Department revoked a visa for Husam Zomlot, the highest-ranking Palestinian official in Washington. The State Department cited the Palestinians’ years-long refusal to sit down for peace talks with Israel. The better reason for expelling them is that the label “envoy” sits uneasily next to the names of Palestinian officials, given the links between the Palestine Liberation Organization, President Mahmoud Abbas’s Fatah faction, and various armed terrorist groups.
Fatah, for example, praised the Fuld murder. As the Jerusalem Post reported, the “al-Aqsa Martyrs Brigades, the military wing of Fatah . . . welcomed the attack, stressing the necessity of resistance ‘against settlements, Judaization of the land, and occupation crimes.’” It is up to Palestinian leaders to decide whether they want to be terrorists or statesmen. Pretending that they can be both at once was the height of Western folly, as Ari Fuld no doubt recognized.
May his memory be a blessing.
Choose your plan and pay nothing for six Weeks!
For a very limited time, we are extending a six-week free trial on both our subscription plans. Put your intellectual life in order while you can. This offer is also valid for existing subscribers wishing to purchase a gift subscription. Click here for more details.
The end of the water's edge.
It was the blatant subversion of the president’s sole authority to conduct American foreign policy, and the political class received it with fury. It was called “mutinous,” and the conspirators were deemed “traitors” to the Republic. Those who thought “sedition” went too far were still incensed over the breach of protocol and the reckless way in which the president’s mandate was undermined. Yes, times have certainly changed since 2015, when a series of Republican senators signed a letter warning Iran’s theocratic government that the Joint Comprehensive Plan of Action (aka, the Iran nuclear deal) was built on a foundation of sand.
The outrage that was heaped upon Senate Republicans for freelancing on foreign policy in the final years of Barack Obama’s administration has not been visited upon former Secretary of State John Kerry, though he arguably deserves it. In the publicity tour for his recently published memoir, Kerry confessed to conducting meetings with Iranian Foreign Minister Javad Zarif “three or four times” as a private citizen. When asked by Fox News Channel’s Dana Perino if Kerry had advised his Iranian interlocutor to “wait out” the Trump administration to get a better set of terms from the president’s successor, Kerry did not deny the charge. “I think everybody in the world is sitting around talking about waiting out President Trump,” he said.
Think about that. This is a former secretary of state who all but confirmed that he is actively conducting what the Boston Globe described in May as “shadow diplomacy” designed to preserve not just the Iran deal but all the associated economic relief and security guarantees it provided Tehran. The abrogation of that deal has put new pressure on the Iranians to liberalize domestically, withdraw their support for terrorism, and abandon their provocative weapons development programs—pressures that the deal’s proponents once supported.
“We’ve got Iran on the ropes now,” said former Democratic Sen. Joe Lieberman, “and a meeting between John Kerry and the Iranian foreign minister really sends a message to them that somebody in America who’s important may be trying to revive them and let them wait and be stronger against what the administration is trying to do.” This is absolutely correct because the threat Iran poses to American national security and geopolitical stability is not limited to its nuclear program. The Iranian threat will not be neutralized until it abandons its support for terror and the repression of its people, and that will not end until the Iranian regime is no more.
While Kerry’s decision to hold a variety of meetings with a representative of a nation hostile to U.S. interests is surely careless and unhelpful, it is not uncommon. During his 1984 campaign for the presidency, Jesse Jackson visited the Soviet Union and Cuba to raise his own public profile and lend credence to Democratic claims that Ronald Reagan’s confrontational foreign policy was unproductive. House Speaker Jim Wright’s trip to Nicaragua to meet with the Sandinista government was a direct repudiation of the Reagan administration’s support for the country’s anti-Communist rebels. In 2007, as Bashar al-Assad’s government was providing material support for the insurgency in Iraq, House Speaker Nancy Pelosi sojourned to Damascus to shower the genocidal dictator with good publicity. “The road to Damascus is a road to peace,” Pelosi insisted. “Unfortunately,” replied George W. Bush’s National Security Council spokesman, “that road is lined with the victims of Hamas and Hezbollah, the victims of terrorists who cross from Syria into Iraq.”
Honest observers must reluctantly conclude that the adage is wrong. American politics does not, in fact, stop at the water’s edge. It never has, and maybe it shouldn’t. Commonplace though the practice may be, American political actors who contradict the president by conducting their own foreign policy should be judged on the policies they advocate. In the case of Iran, those who seek to convince the mullahs and their representatives that repressive theocracy and a terroristic foreign policy are dead ends are advancing the interests not just of the United States but of all mankind. Those who give this hopelessly backward autocracy hope that America’s resolve is fleeting are, as John Kerry might say, on “the wrong side of history.”
Michael Wolff is its Marquis de Sade. Released on January 5, 2018, Wolff’s Fire and Fury became a template for authors eager to satiate the growing demand for unverified stories of Trump at his worst. Wolff filled his pages with tales of the president’s ignorant rants, his raging emotions, his television addiction, his fast-food diet, his unfamiliarity with and contempt for Beltway conventions and manners. Wolff made shocking insinuations about Trump’s mental state, not to mention his relationship with UN ambassador Nikki Haley. Wolff’s Trump is nothing more than a knave, dunce, and commedia dell’arte villain. The hero of his saga is, bizarrely, Steve Bannon, who in Wolff’s telling recognized Trump’s inadequacies, manipulated him to advance a nationalist-populist agenda, and tried to block his worst impulses.
Wolff’s sources are anonymous. That did not stop the press from calling his accusations “mind-blowing” (Mashable.com), “wild” (Variety), and “bizarre” (Entertainment Weekly). Unlike most pornographers, he had a lesson in mind. He wanted to demonstrate Trump’s unfitness for office. “The story that I’ve told seems to present this presidency in such a way that it says that he can’t do this job, the emperor has no clothes,” Wolff told the BBC. “And suddenly everywhere people are going, ‘Oh, my God, it’s true—he has no clothes.’ That’s the background to the perception and the understanding that will finally end this, that will end this presidency.”
Nothing excites the Resistance more than the prospect of Trump leaving office before the end of his term. Hence the most stirring examples of Resistance Porn take the president’s all-too-real weaknesses and eccentricities and imbue them with apocalyptic significance. In what would become the standard response to accusations of Trumpian perfidy, reviewers of Fire and Fury were less interested in the truth of Wolff’s assertions than in the fact that his argument confirmed their preexisting biases.
Saying he agreed with President Trump that the book is “fiction,” the Guardian’s critic didn’t “doubt its overall veracity.” It was, he said, “what Mailer and Capote once called a nonfiction novel.” Writing in the Atlantic, Adam Kirsch asked: “No wonder, then, Wolff has written a self-conscious, untrustworthy, postmodern White House book. How else, he might argue, can you write about a group as self-conscious, untrustworthy, and postmodern as this crew?” Complaining in the New Yorker, Masha Gessen said Wolff broke no new ground: “Everybody” knew that the “president of the United States is a deranged liar who surrounded himself with sycophants. He is also functionally illiterate and intellectually unsound.” Remind me never to get on Gessen’s bad side.
What Fire and Fury lacked in journalistic ethics, it made up for in receipts. By the third week of its release, Wolff’s book had sold more than 1.7 million copies. His talent for spinning second- and third-hand accounts of the president’s oddity and depravity into bestselling prose was unmistakable. Imitators were sure to follow, especially after Wolff alienated the mainstream media by defending his innuendos about Haley.
It was during the first week of September that Resistance Porn became a competitive industry. On the afternoon of September 4, the first tidbits from Bob Woodward’s Fear appeared in the Washington Post, along with a recording of an 11-minute phone call between Trump and the white knight of Watergate. The opposition began panting soon after. Woodward, who like Wolff relies on anonymous sources, “paints a harrowing portrait” of the Trump White House, reported the Post.
No one looks good in Woodward’s telling other than former economics adviser Gary Cohn and—again bizarrely—the former White House staff secretary who was forced to resign after his two ex-wives accused him of domestic violence. The depiction of chaos, backstabbing, and mutual contempt between the president and high-level advisers who don’t much care for either his agenda or his personality was not so different from Wolff’s. What gave it added heft was Woodward’s status, his inviolable reputation.
“Nothing in Bob Woodward’s sober and grainy new book…is especially surprising,” wrote Dwight Garner at the New York Times. That was the point. The audience for Wolff and Woodward does not want to be surprised. Fear is not a book that will change minds. Nor is it intended to be. “Bob Woodward’s peek behind the Trump curtain is 100 percent as terrifying as we feared,” read a CNN headline. “President Trump is unfit for office. Bob Woodward’s ‘Fear’ confirms it,” read an op-ed headline in the Post. “There’s Always a New Low for the Trump White House,” said the Atlantic. “Amazingly,” wrote Susan Glasser in the New Yorker, “it is no longer big news when the occupant of the Oval Office is shown to be callous, ignorant, nasty, and untruthful.” How could it be, when the press has emphasized nothing but these aspects of Trump for the last three years?
The popular fixation with Trump the man, and with the turbulence, mania, frenzy, confusion, silliness, and unpredictability that have surrounded him for decades, serves two functions. It inoculates the press from having to engage in serious research into the causes of Trump’s success in business, entertainment, and politics, and into the crises of borders, opioids, stagnation, and conformity of opinion that occasioned his rise. Resistance Porn also endows Trump’s critics, both external and internal, with world-historical importance. No longer are they merely journalists, wonks, pundits, and activists sniping at a most unlikely president. They are politically correct versions of Charles Martel, the last line of defense preventing Trump the barbarian from enacting the policies on which he campaigned and was elected.
How closely their sensational claims and inflated self-conceptions track with reality is largely beside the point. When the New York Times published the op-ed “I am Part of the Resistance Inside the Trump Administration,” by an anonymous “senior official” on September 5, few readers bothered to care that the piece contained no original material. The author turned policy disagreements over trade and national security into a psychiatric diagnosis. In what can only be described as a journalistic innovation, the author dispensed with middlemen such as Wolff and Woodward, providing the Times the longest background quote in American history. That the author’s identity remains a secret only adds to its prurient appeal.
“The bigger concern,” the author wrote, “is not what Mr. Trump has done to the presidency but what we as a nation have allowed him to do to us.” Speak for yourself, bud. What President Trump has done to the Resistance is driven it batty. He’s made an untold number of people willing to entertain conspiracy theories, and to believe rumor is fact, hyperbole is truth, self-interested portrayals are incontrovertible evidence, credulity is virtue, and betrayal is fidelity—so long as all of this is done to stop that man in the White House.
Review of 'Stanley Kubrick' by Nathan Abrams
Except for Stanley Donen, every director I have worked with has been prone to the idea, first propounded in the 1950s by François Truffaut and his tendentious chums in Cahiers du Cinéma, that directors alone are authors, screenwriters merely contingent. In singular cases—Orson Welles, Michelangelo Antonioni, Woody Allen, Kubrick himself—the claim can be valid, though all of them had recourse, regular or occasional, to helping hands to spice their confections.
Kubrick’s variety of topics, themes, and periods testifies both to his curiosity and to his determination to “make it new.” Because his grades were not high enough (except in physics), this son of a Bronx doctor could not get into colleges crammed with returning GIs. The nearest he came to higher education was when he slipped into accessible lectures at Columbia. He told me, when discussing the possibility of a movie about Julius Caesar, that the great classicist Moses Hadas made a particularly strong impression.
While others were studying for degrees, solitary Stanley was out shooting photographs (sometimes with a hidden camera) for Look magazine. As a movie director, he often insisted on take after take. This gave him choices of the kind available on the still photographer’s contact sheets. Only Peter Sellers and Jack Nicholson had the nerve, and irreplaceable talent, to tell him, ahead of shooting, that they could not do a particular scene more than two or three times. The energy to electrify “Mein Führer, I can walk” and “Here’s Johnny!” could not recur indefinitely. For everyone else, “Can you do it again?” was the exhausting demand, and it could come close to being sadistic.
The same method could be applied to writers. Kubrick might recognize what he wanted when it was served up to him, but he could never articulate, ahead of time, even roughly what it was. Picking and choosing was very much his style. Cogitation and opportunism went together: The story goes that he attached Strauss’s Blue Danube to the opening sequence of 2001 because it happened to be playing in the sound studio when he came to dub the music. Genius puts chance to work.
Until academics intruded lofty criteria into cinema/film, the better to dignify their speciality, Alfred Hitchcock’s attitude covered most cases: When Ingrid Bergman asked for her motivation in walking to the window, Hitch replied, fatly, “Your salary.” On another occasion, told that some scene was not plausible, Hitch said, “It’s only a movie.” He did not take himself seriously until the Cahiers du Cinéma crowd elected to make him iconic. At dinner, I once asked Marcello Mastroianni why he was so willing to play losers or clowns. Marcello said, “Beh, cinema non è gran cosa” (cinema is no big deal). Orson Welles called movie-making the ultimate model-train set.
That was then; now we have “film studies.” After they moved in, academics were determined that their subject be a very big deal indeed. Comedy became no laughing matter. In his monotonous new book, the film scholar Nathan Abrams would have it that Stanley Kubrick was, in essence, a “New York Jewish intellectual.” Abrams affects to unlock what Stanley was “really” dealing with, in all his movies, never mind their apparent diversity. It is declared to be, yes, Yiddishkeit, and in particular, the Holocaust. This ground has been tilled before by Geoffrey Cocks, when he argued that the room numbers in the empty Overlook Hotel in The Shining encrypted references to the Final Solution. Abrams would have it that even Barry Lyndon is really all about the outsider seeking, and failing, to make his awkward way in (Gentile) Society. On this reading, Ryan O’Neal is seen as Hannah Arendt’s pariah in 18th-century drag. The movie’s other characters are all engaged in the enjoyment of “goyim-naches,” an expression—like menschlichkayit—he repeats ad nauseam, lest we fail to get the stretched point.
Theory is all when it comes to the apotheosis of our Jew-ridden Übermensch. So what if, in order to make a topic his own, Kubrick found it useful to translate its logic into terms familiar to him from his New York youth? In Abrams’s scheme, other mundane biographical facts count for little. No mention is made of Stanley’s displeasure when his 14-year-old daughter took a fancy to O’Neal. The latter was punished, some sources say, by having Barry’s voiceover converted from first person so that Michael Hordern would displace the star as narrator. By lending dispassionate irony to the narrative, it proved a pettish fluke of genius.
While conning Abrams’s volume, I discovered, not greatly to my chagrin, that I am the sole villain of the piece. Abrams calls me “self-serving” and “unreliable” in my accounts of my working and personal relationship with Stanley. He insinuates that I had less to do with Eyes Wide Shut than I pretend and that Stanley regretted my involvement. It is hard for him to deny (but convenient to omit) that, after trying for some 30 years to get a succession of writers to “crack” how to do Schnitzler’s Traumnovelle, Kubrick greeted my first draft with “I’m absolutely thrilled.” A source whose anonymity I respect told me that he had never seen Stanley so happy since the day he received his first royalty check (for $5 million) for 2001. No matter.
Were Abrams (the author also of a book as hostile to Commentary as this one is to me) able to put aside his waxed wrath, he might have quoted what I reported in my memoir Eyes Wide Open to support his Jewish-intellectual thesis. One day, Stanley asked me what a couple of hospital doctors, walking away with their backs to the camera, would be talking about. We were never going to hear or care what it was, but Stanley—at that early stage of development—said he wanted to know everything. I said, “Women, golf, the stock market, you know…”
“Couple of Gentiles, right?”
“That’s what you said you wanted them to be.”
“Those people, how do we ever know what they’re talking about when they’re alone together?”
“Come on, Stanley, haven’t you overheard them in trains and planes and places?”
Kubrick said, “Sure, but…they always know you’re there.”
If he was even halfway serious, Abrams’s banal thesis that, despite decades of living in England, Stanley never escaped the Old Country might have been given some ballast.
Now, as for Stanley Kubrick’s being an “intellectual.” If this implies membership in some literary or quasi-philosophical elite, there’s a Jewish joke to dispense with it. It’s the one about the man who makes a fortune, buys himself a fancy yacht, and invites his mother to come and see it. He greets her on the gangway in full nautical rig. She says, “What’s with the gold braid already?”
“Mama, you have to realize, I’m a captain now.”
She says, “By you, you’re a captain, by me, you’re a captain, but by a captain, are you a captain?”
As New York intellectuals all used to know, Karl Popper’s definition of bad science, and bad faith, involves positing a theory and then selecting only whatever data help to furnish its validity. The honest scholar makes it a matter of principle to seek out elements that might render his thesis questionable.
Abrams seeks to enroll Lolita in his obsessive Jewish-intellectual scheme by referring to Peter Arno, a New Yorker cartoonist whom Kubrick photographed in 1949. The caption attached to Kubrick’s photograph in Look asserted that Arno liked to date “fresh, unspoiled girls,” and Abrams says this “hint[s] at Humbert Humbert in Lolita.” Ah, but Lolita was published, in Paris, in 1955, six years later. And how likely is it, in any case, that Kubrick wrote the caption?
The film of Lolita is unusual for its garrulity. Abrams’s insistence on the sinister Semitic aspect of both Clare Quilty and Humbert Humbert supposedly drawing Kubrick like moth to flame is a ridiculous camouflage of the commercial opportunism that led Stanley to seek to film the most notorious novel of the day, while fudging its scandalous eroticism.
That said, in my view, The Killing, Paths of Glory, Barry Lyndon, and A Clockwork Orange were and are sans pareil. The great French poet Paul Valéry wrote of “the profundity of the surface” of a work of art. Add D.H. Lawrence’s “never trust the teller, trust the tale,” and you have two authoritative reasons for looking at or reading original works of art yourself and not relying on academic exegetes—especially when they write in the solemn, sometimes ungrammatical style of Professor Abrams, who takes time out to tell those of us at the back of his class that padre “is derived from the Latin pater.”
Abrams writes that I “claim” that I was told to exclude all overt reference to Jews in my Eyes Wide Shut screenplay, with the fatuous implication that I am lying. I am again accused of “claiming” to have given the name Ziegler to the character played by Sydney Pollack, because I once had a (quite famous) Hollywood agent called Evarts Ziegler. So I did. The principal reason for Abrams to doubt my veracity is that my having chosen the name renders irrelevant his subsequent fanciful digression on the deep, deep meanings of the name Ziegler in Jewish lore; hence he wishes to assign the naming to Kubrick. Pop goes another wished-for proof of Stanley’s deep and scholarly obsession with Yiddishkeit.
Abrams would be a more formidable enemy if he could turn a single witty phrase or even abstain from what Karl Kraus called mauscheln, the giveaway jargon of Jewish journalists straining to pass for sophisticates at home in Gentile circles. If you choose, you can apply, on line, for screenwriting lessons from Nathan Abrams, who does not have a single cinematic credit to his name. It would be cheaper, and wiser, to look again, and then again, at Kubrick’s masterpieces.
Is American opera in terminal condition?
At the Met, distinguished singers and conductors, mostly born and trained in Europe, appeared in theatrically conservative big-budget productions of the popular operas of the 19th century, with a sprinkling of pre-romantic and modern works thrown in to leaven the loaf. City Opera, by contrast, presented younger artists—many, like Beverly Sills, born in this country—in a wider-ranging, more adventurously staged repertoire that often included new operas, some of them written by American composers, to which the public was admitted at what were then called “popular prices.”
Between them, the companies represented a feast for culture-consuming New Yorkers, though complaints were already being heard that their new theaters were too big. Moreover, neither the Met nor City Opera was having any luck at commissioning memorable new operas and thereby expanding and refreshing the operatic repertoire, to which only a handful of significant new works—none of them, then or since, premiered by either company—had been added since World War I.
A half-century later, the feast has turned to famine. In 2011, New York City Opera left Lincoln Center, declaring bankruptcy. It closed its doors forever two years later. The Met has weathered a nearly uninterrupted string of crises that climaxed earlier this year with the firing of James Levine, the company’s once-celebrated music director emeritus. He was accused in 2017 of molesting teenage musicians and was dismissed from all of his conducting posts in New York and elsewhere. Today the Met is in dire financial straits that threaten its long-term survival.
And while newer opera companies in such other American cities as Chicago, Houston, San Francisco, Santa Fe, and Seattle now offer alternative models of leadership, none has established itself as a potential successor either to the Met or the now-defunct NYCO.1
Is American opera as a whole in a terminal condition? Or are the collapse of the New York City Opera and the Met’s ongoing struggle to survive purely local matters of no relevance elsewhere? Heidi Waleson addresses these questions in Mad Scenes and Exit Arias: The Death of the New York City Opera and the Future of Opera in America.2 Waleson draws on her experience as the opera critic of the Wall Street Journal to speculate on the prospects for an art form that has never quite managed to set down firm roots in American culture.
In this richly informative chronicle of NYCO’s decline and fall, Waleson persuasively argues that what happened to City Opera (and, by extension, the Met) could happen to other opera companies as well. The days in which an ambitious community sought successfully to elevate itself into the first rank of world cities by building and manning an opera house are long past, and Mad Scenes and Exit Arias helps us understand why.
As Waleson reminds us, it was Fiorello LaGuardia, the New York mayor who played a central role in the creation of the NYCO, who dubbed the company “the people’s opera” when it was founded in 1943. According to LaGuardia, NYCO existed to perform popular operas at popular prices for a mass audience. In later years, it moved away from that goal, but the slogan stuck. Indeed, no opera company has ever formulated a clearer statement of its institutional mission.
Even after it moved to Lincoln Center in 1966, NYCO had an equally coherent and similarly appealing purpose: It was where you went to see the opera stars of tomorrow, foremost among them Sills and Plácido Domingo, in inexpensively but imaginatively staged productions of the classics. The company went out of its way to present modern operas, too, but it never did so at the expense of its central repertoire—and tickets to its performances cost half of what the Met charged. Well into the 21st century, City Opera stuck more or less closely to its redefined mission. Under Paul Kellogg, the general and artistic director from 1996 to 2007, it did so with consistent artistic success. But revenues declined throughout the latter part of Kellogg’s tenure, in part because younger New Yorkers were unwilling to become subscribers.
In those days, the Metropolitan Opera, NYCO’s next-door neighbor, was still one of the world’s most conservative opera houses. That changed when Peter Gelb became its general manager in 2006. Gelb was resolved to modernize the Met’s productions and, to a lesser extent, its repertoire, and he simultaneously sought to heighten its national profile by digitally simulcasting live performances into movie theaters throughout America.
Kellogg was frustrated by the chronic acoustic inadequacies of the New York State Theater and sought in vain to move City Opera to a three-theater complex that was to be built (but never was) on the World Trade Center site. He retired soon after Gelb came to the Met. Kellogg was succeeded by Gérard Mortier, a European impresario who was accustomed to working in state-subsidized theaters. Mortier made a pair of fateful decisions. First, he canceled City Opera’s entire 2008–2009 season while the interior of the State Theater underwent much-needed renovations. Then he announced a follow-up season of 20th-century operas that lacked audience appeal.
That follow-up season never happened, because Mortier resigned in 2008 and fled New York. He was replaced by George Steel, who had previously served for just three months as general manager of the Dallas Opera. Under Steel, NYCO slashed its schedule to ribbons in a futile attempt to get back on its financial feet after Mortier’s financially ruinous year-long hiatus. Then he mounted a series of productions of nonstandard repertory that received mixed reviews and flopped at the box office.
The combined effect of Gelb’s innovations and the inept leadership of Mortier and Steel all but obliterated City Opera’s reason for existing. Under Gelb, the Met’s repertory ranged from such warhorses as Rigoletto and Tosca to 20th-century masterpieces like Benjamin Britten’s Midsummer Night’s Dream and Alban Berg’s Wozzeck, and tickets could be bought for as little as $20. With the Met performing a more interesting repertoire under a wider range of directors, and in part at “people’s prices,” City Opera no longer did anything that the Met wasn’t already doing on a far larger and better-financed scale. What, then, was its mission now? The truth was that it had none, and when the company went under in 2013, few mourned its passing.
As it happened, Gelb’s own innovations were a mere artistic Band-aid, for he was unwilling or unable to trim the Met’s bloated budget to any meaningful extent. He made no serious attempt to cut the company’s labor costs until a budget crisis in 2014 forced him to confront its unions, which he did with limited success. In addition, his new productions of the standard-repertory operas on which the Met relied to draw and hold older subscribers were felt by many to be trashily trendy.
The Met has had particular difficulty adapting to opera’s reduced circumstances in the 21st century. Its 3,800-seat theater has an 80-foot-deep stage with a proscenium opening that measures 54 feet on each side. (Bayreuth, by contrast, seats 1,925, La Scala 2,030, and the Vienna State Opera 2,200.) As a result, it is all but impossible to mount low-to-medium-budget shows in the Metropolitan Opera House, even as the company finds it is no longer able to fill its increasingly empty house. Two decades ago, the Met earned 90 percent of its potential box-office revenue. That figure plummeted to 66 percent by 2015, forcing Gelb to raise ticket prices to an average of $158.50 per head. On Broadway, the average price of a ticket that season was $103.86.
Above all, Gelb was swimming against the cultural tide. Asked about the effects on audience development of the Met simulcasts, he admitted that three-quarters of the people who attended them were “over 65, and 30 percent of them are over 75.” As he explained: “Grand opera is in itself a kind of a dinosaur of an art form…. The question is not whether I think I’m doing a good job or not in trying to keep the [Metropolitan Opera] alive. It’s whether I’m doing a good job or not in the face of a cultural and social rejection of opera as an art form. And what I’m doing is fighting an uphill battle to try and maintain an audience in a very difficult time.”
Was that statement buck-passing defeatism, or a fair appraisal of the state of American opera? Other opera executives distanced themselves from Gelb’s remarks, and it was true—and still is—that smaller American companies have done a somewhat better job of attracting younger audiences than the top-heavy Met. But according to the National Endowment for the Arts, the percentage of U.S. adults who attend at least one operatic performance each year declined from 3.2 percent in 2002 to 2.1 percent in 2012. This problem, of course, is not limited to opera. As I wrote in these pages in 2010, the disappearance of secondary-school arts education and the rise of digital media may well be leading to “not merely a decline in public interest in the fine arts but the death of the live audience as a cultural phenomenon.”3
Does American opera have a future in an era of what Heidi Waleson succinctly describes as “flat ticket income and rising expenses”? In the last chapter of Mad Scenes and Exit Arias, she chronicles the activities of a group of innovative smaller troupes that are “rethinking what an opera company is, what it does, and who it serves.” Yet in the same breath, she acknowledges the possibility that “filling a giant theater for multiple productions of grand operas [is] no longer an achievable goal.”
If that is so, then it may be worth asking a different question: Did American opera ever have a past? It is true that opera in America has had a great and glorious history, but virtually the whole of that history consisted of American productions of 18th- and 19th-century European operas. By contrast, no opera by an American classical composer has ever entered the international major-house repertoire. Indeed, while new American operas are still commissioned and premiered at an impressive rate, few things are so rare as a second production of any of these works.
While a handful continue to be performed—John Adams’s Nixon in China (1987), André Previn’s A Streetcar Named Desire (1995), Mark Adamo’s Little Women (1998), and Jake Heggie’s Dead Man Walking (2000)—their success is a tribute to the familiarity of their subject matter and source material, not their musico-theatrical quality. As for the rest, the hard but inescapable truth is that with the exception of George Gershwin’s Porgy and Bess (1935), virtually all large-scale American operas have been purpose-written novelties that were shelved and forgotten immediately after their premieres.
The success of Porgy and Bess, which received its premiere not in an opera house but on Broadway, reminds us that American musical comedy, unlike American opera, is deeply rooted in our national culture, in much the same way that grand opera is no less deeply rooted in the national cultures of Germany and Italy, where it is still genuinely popular (if less so today than a half-century ago). By comparison with Porgy, Carousel, Guys and Dolls, or My Fair Lady, American opera as a homegrown form simply does not exist: It is merely an obscure offshoot of its European counterpart. Aaron Copland, America’s greatest composer, was not really joking when he wittily described opera as “la forme fatale,” and his own failed attempts to compose an audience-friendly opera that would be as successful as his folk-flavored ballet scores say much about the difficulties facing any composer who seeks to follow in his footsteps.
It is not that grand opera is incapable of appealing to American theatergoers. Even now, there are many Americans who love it passionately, just as there are regional companies such as Chicago’s Lyric Opera and San Francisco Opera that have avoided making the mistakes that closed City Opera’s doors. Yet the crises from which the Metropolitan Opera has so far failed to extricate itself suggest that in the absence of the generous state subsidies that keep European opera houses in business, large-house grand opera in America may simply be too expensive to thrive—or, ultimately, to survive. At its best, no art form is more thrilling or seductive. But none is at greater risk of following the dinosaurs down the cold road to extinction.
1 The “New York City Opera” founded in 2016 that now mounts operas in various New York theaters on an ad hoc basis is a brand-new enterprise that has no connection with its predecessor.
2 Metropolitan Books, 304 pages