F. R. Leavis, who is widely regarded as England’s most important literary critic, recently launched a violent attack on C. P. SNOW, and in particular on Snow’s famous description of the split between the “two cultures” of science and literature. The Leavis attack provoked a more heated intellectual controversy than any England has witnessed for some time, but the issues involved extend far beyond specifically English concerns. Consequently, we invited Lionel Trilling, whom many consider to be America’s leading critic, to comment on this controversy and to discuss the various questions that arise from it. A professor of English at Columbia, Mr. Trilling is the author, among other books, of The Liberal Imagination, The Opposing Self, and A Gathering of Fugitives.
It is now nearly eighty years since Matthew Arnold came to this country on his famous lecture tour. Of his repertory of three lectures, none was calculated to give unqualified pleasure to his audience. The lecture on Emerson praised that most eminent of American writers only after it had denied that he was a literary figure of the first order. The lecture called “Numbers” raised disturbing questions about the relation of democracy to excellence and distinction. “Literature and Science” was the least likely to give offense, yet even this most memorable of the three Discourses in America was not without its touch of uncomfortableness. In 1883 America was by no means committed—and, indeed, never was to be committed—to the belief that the right education for the modern age must be predominantly scientific and technical, and Arnold, when he cited the proponents of this idea, which of course he opposed, mentioned only those who were English. Yet his audiences surely knew that Arnold was warning them against what would seem to be the natural tendency of an industrial democracy to devalue the old “aristocratic” education in favor of studies that are merely practical.
Arnold wrote “Emerson” and “Numbers” especially for his American tour, but he had first composed “Literature and Science” as the Rede Lecture at Cambridge in 1882. Its original occasion cannot fail to have a peculiar interest at this moment, for C. P. Snow’s The Two Cultures and the Scientific Revolution, around which so curious a storm rages in England, was the Rede Lecture of 1959.
Sir Charles did not mention his great predecessor in the lectureship, although his own discourse was exactly on Arnold’s subject and took a line exactly the opposite of Arnold’s. And F. R. Leavis, whose admiration of Arnold is well known and whose position in respect to the relative importance of literature and of science in education is much the same as Arnold’s, did not mention Arnold either, when, in his recent Richmond Lecture at Downing College, he launched an attack of unexampled ferocity upon the doctrine and the author of The Two Cultures.
In its essential terms, the issue in debate has not changed since Arnold spoke. Arnold’s chief antagonist was T. H. Huxley—it was he who, in his lecture on “Science and Culture,” had said that literature should, and inevitably would, step down from its preeminent place in education, that science and not “culture” must supply the knowledge which is necessary for an age committed to rational truth and material practicality. What is more, Huxley said, science will supply the very basis of the assumptions of modern ethics. In effect Snow says nothing different.
The word “culture” had been Arnold’s personal insigne ever since the publication of Culture and Anarchy in 1867, and Huxley made particular reference to the views on the value of humanistic study which Arnold had expressed in that book.1 Arnold’s reply in “Literature and Science” could not have been simpler, just as it could not have been more temperate, although it surely did not surpass in temperateness Huxley’s statement of his disagreement with Arnold’s ideas; the two men held each other in high admiration and were warm friends. Arnold said that he had not the least disposition to propose that science be slighted in education. Quite apart from its practical value, scientific knowledge is naturally a delight to the mind, no doubt engaging certain mental temperaments more than others but holding out the promise of intellectual pleasure to all. Yet of itself science does not, as Arnold put it, “serve” the instinct for conduct and the instinct for beauty, or at least it does not serve these instincts as they exist in most men. This service, which includes the relating of scientific knowledge to the whole life of man, is rendered by culture, which is not to be thought of as confined to literature—to belles lettres—but as comprising all the humane intellectual disciplines. When Dr. Leavis asserts the primacy of the humanities in education, he refers more exclusively to literature than Arnold did, but in general effect his position is the same.
It may seem strange, and a little tiresome, that the debate of eighty years ago should be instituted again today. Yet it is perhaps understandable in view of the “scientific revolution” about which Sir Charles tells us. This revolution would seem to be one of the instances in which a change of quantity becomes a change in kind—science can now do so much more and do it so much more quickly than it could a generation ago, let alone in the last century, that it has been transmuted from what the world has hitherto known. One of the consequences of this change—to Sir Charles it is the most salient of all possible consequences—is the new social hope that is now held out to us, of life made better in material respects, not merely in certain highly developed countries but all over the world and among peoples that at the moment are, by Western standards, scarcely developed at all.
The new power of science perhaps justifies a contemporary revival of the Victorian question. But if we consent to involve ourselves in the new dialectic of the old controversy, we must be aware that we are not addressing ourselves to a question of educational theory, or to an abstract contention as to what kind of knowledge has the truest affinity with the human soul. We approach these matters only to pass through them. What we address ourselves to is politics, and politics of a quite ultimate kind, and to the disposition of the modern mind.
The Two Cultures has had a very considerable currency in England and America ever since its publication in 1959, and in England it was for a time the subject of lively discussion. Indeed, the general agreement in England that it was a statement of great importance, to the point of its being used as an assigned text in secondary schools, was what aroused Dr. Leavis to make his assault on the lecture this long after the first interest in it had subsided. The early discussions of The Two Cultures were of a substantive kind, but the concerns which now agitate the English in response to Dr. Leavis’s attack have scarcely anything to do with literature and science, or with education, or with social hope. These matters have now been made a mere subordinate element in what amounts to a scandal over a breach of manners. The published comments on Dr. Leavis’s attack on The Two Cultures were, with few exceptions, directed to such considerations as the exact degree of monstrousness which Dr. Leavis achieved in speaking of Sir Charles as he did; whether or not he spoke out of envy of Sir Charles’s reputation; whether or not he has, or deserves to have, any real standing as a critic; or writes acceptable English; or represents, as he claims he does, “the essential Cambridge.”
Dr. Leavis’s Richmond Lecture, “The Significance of C. P. Snow,” was delivered in the Hall of Downing College, Cambridge, on February 28 and published in the Spectator of March 9.2 In the next week’s issue of the Spectator, seventeen letters appeared, all defending Snow and most of them expressing anger at, or contempt for, Leavis. The following week brought fifteen more communications, of which eight expressed partisanship with Leavis; several of these deplored the tone of the previous week’s correspondence. Many of the correspondents who defended Snow were of distinguished reputation; of the defenders of Leavis, the only one known to me was Mr. Geoffrey Wagner, who wrote from America to communicate his belief that the attack on Snow was much needed, for, despite a parody in New Left Review in which Snow appears as C. P. Sleet, despite, too, his own adverse criticism of Snow in the Critic, “the hosannas obediently continued on this side of the Atlantic, both from the Barzun-Trilling syndrome and the Book-of-the-Month Club, the worst of both worlds, as it were.” Three of the writers of the Snow party touched upon the question of literature and science, the scientist J. D. Bernal, the historian of science Stephen Toulmin, and the literary critic G. S. Fraser. In a miasma of personality-mongering, their letters afforded a degree of relief, but they said little that was of consequence. Of the Leavis party two dons of the University of Birmingham in a joint letter touched rapidly but with some cogency on the relation between literature and science, deploring any attempt to prefer one above the other, concluding that if one must be preferred, it should be, for reasons not stated, literature.
From the Spectator letters, so many of them expressing small and rather untidy passions, there are no doubt conclusions to be drawn, of a sufficiently depressing sort, about the condition of cultural life at the moment. But no awareness that we may have of the generally bad state of intellectual affairs ought to blind us to the particular fault of Dr. Leavis in his treatment of Sir Charles Snow. Intelligent and serious himself, Dr. Leavis has in this instance been the cause of stupidity and triviality in other men.
There can be no two opinions about the tone in which Dr. Leavis deals with Sir Charles. It is a bad tone, an impermissible tone. It is bad in a personal sense because it is cruel—it manifestly intends to wound. It is bad intellectually because by its use Dr. Leavis has diverted attention, his own included, from the matter he sought to illuminate. The doctrine of The Two Cultures is a momentous one and Dr. Leavis obscures its massive significance by bringing into consideration such matters as Sir Charles’s abilities as a novelist, his club membership, his opinion of his own talents, his worldly success, and his relation to worldly power. Anger, scorn, and an excessive consciousness of persons have always been elements of Dr. Leavis’s thought—of the very process of his thought, not merely of his manner of expressing it. They were never exactly reassuring elements, but they could be set aside and made to seem of relatively small account in comparison with the remarkable cogency in criticism which Dr. Leavis so often achieved. But as they now appear in his valedictory address—for, in effect, that is what the Richmond Lecture was, since Dr. Leavis retires this year from his university post—they cannot be easily set aside, they stand in the way of what Dr. Leavis means to say.
And, indeed, our understanding of what he means to say is to be derived less from the passionate utterance of the lecture itself than from our knowledge of the whole direction of his career in criticism. That direction was from the first determined by Dr. Leavis’s belief that the human faculty above all others to which literature addresses itself is the moral consciousness, which is also the source of all successful creation, the very root of poetic genius. The extent of his commitment to this idea results in what I believe to be a fault in his critical thought—he does not give anything like adequate recognition to those aspects of art which are gratuitous, which arise from high spirits and the impulse to play. One would suppose that the moral consciousness should, for its own purposes, take account of those aspects of art and life that do not fall within its dominion. But if the intensity of Dr. Leavis’s commitment to the moral consciousness contrives to produce this deficiency of understanding, it is no less responsible for the accuracy and force which we recognize as the positive characteristics of his work. For Dr. Leavis, literature is what Matthew Arnold said it is, the criticism of life—he can understand it in no other way. Both in all its simplicity and in all its hidden complexity, he has made Arnold’s saying his own, and from it he has drawn his strength.
If, then, Dr. Leavis now speaks with a very special intensity in response to The Two Cultures, we must do him the justice of seeing that the Rede Lecture denies, and in an extreme way, all that he has ever believed about literature—it is, in fact, nothing less than an indictment of literature on social and moral grounds. It represents literature as constituting a danger to the national well-being, and most especially when it is overtly a criticism of life.
Not only because Charles Snow is himself a practitioner of literature but also because he is the man he is, the statement that his lecture has this purport will be shocking and perhaps it will be thought scarcely credible. And I have no doubt that, in another mood and on some other occasion, Sir Charles would be happy to assert the beneficent powers of literature. But there can be no other interpretation of his lecture than that it takes toward literature a position of extreme antagonism.
The Two Cultures begins as an objective statement of the lack of communication between scientists and literary men. This is a circumstance that must have been often observed and often deplored. Perhaps nothing in our culture is so characteristic as the separateness of the various artistic and intellectual professions. As between, say, poets and painters, or musicians and architects, there is very little discourse, and perhaps the same thing could be remarked of scientists of different interests, say biologists and physicists. But the isolation of literary men from scientists may well seem to be the most extreme of these separations, if only because it is the most significant, for a reason which Sir Charles entirely understands: the especially close though never clearly defined relation of these two professions with our social and political life.
The even-handedness with which Sir Charles at first describes the split between the two “cultures” does not continue for long. He begins by telling us that scientists and literary men are equally to blame for the separation—they are kept apart by “a gulf of mutual incomprehension,” by distorted images of each other which give rise to dislike and hostility. But as Sir Charles’s lecture proceeds, it becomes plain that, although the scientists do have certain crudities and limitations, they are in general in the right of things and the literary men in the wrong of them. The matter which causes the scales to shift thus suddenly is the human condition. This, Sir Charles tells us, is of its nature tragic: man dies, and he dies alone. But the awareness of the ineluctably tragic nature of human life makes a moral trap, “for it tempts one to sit back, complacent in one’s unique tragedy,” paying no heed to the circumstances of everyday life, which, for the larger number of human beings, are painful. It is the literary men, we are told, who are the most likely, the scientists who are the least likely, to fall into this moral trap; the scientists “are inclined to be impatient to see if something can be done: and inclined to think that it can be done, until it’s proved otherwise.” It is their spirit, “tough and good and determined to fight it out at the side of their brother men,” which has “made scientists regard the other [i.e. the literary] culture’s social attitudes as contemptible.”
“This is too facile,” Sir Charles says in mild rebuke of the scientists, by which he of course means that essentially they are right. There follows a brief consideration of a question raised not by Sir Charles in his own person but by “a scientist of distinction” whom he quotes: “Yeats, Pound, Wyndham Lewis, nine out of ten of those who have dominated literary sensibility in our time, weren’t they not only politically silly, but politically wicked? Didn’t the influence of all they represent bring Auschwitz that much nearer?” And Sir Charles in answer grants that Yeats was a magnanimous man and a great poet, but he will not, he says, defend the indefensible—“the facts . . . are broadly true.” Sir Charles in general agrees, that is, that the literary sensibility of our time brought Auschwitz nearer. He goes on to say that things have changed considerably in the literary life in recent years, even if slowly, for “literature changes more slowly than science.”
From the mention of Auschwitz onward, the way is open to the full assertion by Sir Charles of the virtues of the scientists. Although they are admitted to be sometimes gauche or stupidly self-assertive, although Sir Charles concedes of some of them that “the whole literature of the traditional culture doesn’t seem relevant to [their] interests” and that, as a result, their “imaginative understanding” is diminished, he yet finds them to be men of a natural decency; they are free from racial feelings, they are lovers of equality, they are cooperative. And chief among their virtues, as Sir Charles describes them, is the fact that they “have the future in their bones.”
Indeed, it turns out that it is the future, and not mere ignorance of each other’s professional concerns, that makes the separation between the culture of science and the culture of literature. Scientists have the future in their bones. Literary men do not. Quite the contrary—“If the scientists have the future in their bones, then the traditional culture responds by wishing that the future did not exist.” The future that the scientists have in their bones is understood to be nothing but a good future; it is very much like the History of the Marxists, which is always the triumph of the right, never possibly the record of defeat. In fact, to entertain the idea that the future might be bad is represented as being tantamount to moral ill-will—in a note appended to the sentence I have just quoted, Sir Charles speaks of George Orwell’s 1984 as “the strongest possible wish that the future shall not exist.”
It is difficult to credit the implications of this astonishing remark and to ascribe them to Sir Charles. As everyone recalls, Orwell’s novel is an imagination of the condition of the world if the authoritarian tendencies which are to be observed in the present develop themselves—logically, as it were—in the future, the point being that it is quite within the range of possibility that this ultimate development should take place. In Orwell’s representation of an absolute tyranny, science has a part, and a polemical partisan of science might understand this as the evidence of a literary man’s malice toward science. But it is much more likely that, when Orwell imagined science as one of the instruments of repression, he meant to say that science, like everything else that is potentially good, like literature itself, can be perverted and debased to the ends of tyranny. Orwell was a man who, on the basis of actual and painful experience, tried to tell the truth about politics, even his own politics. I believe that he never gave up his commitment to socialism, but he refused to be illusioned in any way he could prevent; it lay within the reach of his mind to conceive that even an idealistic politics, perhaps especially an idealistic politics, can pervert itself. To say of such a man that he wishes that the future—the presumably good future—shall not exist is like saying that intelligence wishes that the future shall not exist.
Having characterized the culture of literature, or, as he sometimes calls it, “the traditional culture,” by its hostility to the future, Sir Charles goes on to say that “it is the traditional culture, to an extent remarkably little diminished by the emergence of the scientific one, which manages the western world.” This being so, it follows that the traditional culture must be strictly dealt with if the future is to be brought into being: what is called “the existing pattern” must be not merely changed but “broken.” Only if this is done shall we be able to educate ourselves as we should. As for the need to educate ourselves: “To say, we have to educate ourselves or perish is perhaps a little more melodramatic than the facts warrant. To say, we have to educate ourselves or watch a steep decline in our lifetime is about right.” And Sir Charles indicates our possible fate by the instance—he calls it an “historical myth”—of the Venetian Republic in its last half century. “Its citizens had become rich, as we did, by accident. They had acquired immense political skill, just as we have. A good many of them were tough-minded, realistic, patriotic men. They knew, just as clearly as we know, that the current of history had begun to flow against them. Many of them gave their minds to working out ways to keep going. It would have meant breaking the pattern into which they had been crystallized. They were fond of the pattern, just as we are fond of ours. They never found the will to break it.”
I quoted without comment Sir Charles’s statement of the idea on which, we may say, the whole argument of The Two Cultures is based: “It is the traditional culture, to an extent remarkably little diminished by the emergence of the scientific one, which manages the western world.” It is a bewildering statement. In what way can we possibly understand it? That the Western world is managed by some agency which is traditional is of course comprehensible. And we can take in the idea that this agency may be described, for particular purposes of explanation, in terms of a certain set of mind, a general tendency of thought and feeling which, being pervasive, is hard to formulate, and that this is to be called “a culture.” But for Sir Charles, the words “traditional” and “literary” are interchangeable, and that this culture, as we agree to call it, is literary, that it bears the same relation to actual literary men and their books that what is called the “scientific culture” bears to scientists and their work in laboratories, is truly a staggering thought. The actions of parliaments and congresses and cabinets in directing the massive affairs of state, the negotiations of embassies, the movement of armies and fleets, the establishment of huge scientific projects for the contrivance of armaments and of factories for the production of them, the promises made to citizens, and the choices made by voters at the polls—these, we are asked to believe, are in the charge of the culture of literature. What can this mean?
It can of course be said that literature has some part in the management of the Western world, a part which is limited but perhaps not wholly unimportant. If, for example, we compare the present condition of industrial England with the condition of industrial England in the early 19th century, we can say that the present condition is not, in human respects, anything like what men of good will might wish it to be, but that it is very much better than it was in the early years of the Industrial Revolution. And if we then ask what agencies brought about the improvement, we can say that one of them was literature. Certain literary men raised the “Condition of England Question” in a passionate and effective way and their names are still memorable to us—Coleridge, Carlyle, Mill (I take him to be a man of letters; he was certainly a good literary critic), Dickens, Ruskin, Arnold, William Morris. They made their effect only upon individuals, but the individuals they touched were numerous, and by what they said they made it ever harder for people to be indifferent to the misery around them or to the degradation of the national life in which they came to think themselves implicated. These literary men helped materially, some would say decisively, to bring about a change in the state of affairs. This is not exactly management, but it is a directing influence such as literature in the modern time often undertakes to have and sometimes does have.
Yet in Sir Charles’s opinion this directing influence of the literary men of the 19th century deserves no praise. On the contrary, his description of their work is but another count in the indictment of the culture of literature. Speaking of the response which literary men made to the Industrial Revolution, he says, “Almost everywhere . . . intellectual persons did not comprehend what was happening. Certainly the writers didn’t. Plenty of them shuddered away, as though the right course for a man of feeling was to contract out; some, like Ruskin and William Morris and Thoreau and Emerson and Lawrence, tried various kinds of fancies, which were not much in effect more than screams of horror. It is hard to think of a writer of high class who really stretched his imaginative sympathy, who could see at once the hideous back-streets, the smoking chimneys, the internal price—and also the prospects of life that were opening out for the poor. . . .”
Nothing could be further from the truth. No great English writer of the 19th century, once he had become aware of the Industrial Revolution, ever contracted out. This is not the place to rehearse the miseries that were acquiesced in by those who comforted the world and their own consciences with the thought of “the prospects of life that were opening out for the poor.” It is enough to say that there were miseries in plenty, of a brutal and horrifying kind, by no means adequately suggested by phrases like “the hideous back-streets, the smoking chimneys, the internal price.” (Auschwitz, since it has been mentioned, may be thought of as the development of the conditions of the factories and mines of the earlier Industrial Revolution.) If the writers “shuddered away,” it was not in maidenly disgust with machines and soot; if they uttered “screams of horror,” it was out of moral outrage at what man had made of man—and of women and little children. Their emotions were no different from those expressed by Karl Marx in his chapter on the Working Day, nor from those expressed in Blue Books by the factory inspectors, those remarkable men of the middle class whom Marx, in a moving passage of Capital, praises and wonders at for their transcendence of their class feelings.
I have mentioned Matthew Arnold among those writers who made the old conditions of the Industrial Revolution ever less possible. Like many of his colleagues in this undertaking, he did entertain “fancies”—they all found modern life ugly and fatiguing and in some way false, and they set store by certain qualities which are no doubt traditional to the point of being archaic.3 But Arnold’s peculiar distinction as a literary critic is founded on the strong sensitivity of his response to the modern situation. He uniquely understood what Hegel had told the world, that the French Revolution marked an absolute change in the condition of man. For the first time in history, Hegel said, Reason—or Idea, or Theory, or Creative Imagination—had become decisive in human destiny. Arnold’s argument in “Literature and Science” was the affirmation of the French Revolution; he was speaking on behalf of the illumination and refinement of that Reason by which man might shape the conditions of his own existence. This is the whole purport of his famous statement, “Literature is the criticism of life.”
That saying used to have a rough time of it, perhaps because people found the word criticism narrow and dour and wished to believe that life was worthier of being celebrated than criticized. But less and less, I think, will anyone find the ground on which to quarrel with it. Whatever else we also take literature to be, it must always, for us now, be the criticism of life.
But it would seem to be precisely the critical function of literature that troubles Sir Charles. And perhaps that is why, despite all that he says about the need to educate ourselves, he does not make a single substantive proposal about education.
If we undertake to say what the purpose of modern education is, our answer will surely be suggested by Arnold’s phrase, together with the one by which he defined the particular function of criticism: “to see the object as in itself it really is.” Whenever we undertake to pass judgment on an educational enterprise, the import of these two phrases serves as our criterion: we ask that education supply the means for a criticism of life and teach the student to try to see the object as in itself it really is. Yet when Sir Charles speaks of the need to break the “existing pattern” and to go on to a right education, he does not touch upon any such standard of judgment. Although he would seem to be the likeliest person in the world to speak intelligently about the instruction in science of students who do not intend to be scientists, actually he says nothing more on the subject than that ignorance of the Second Law of Thermodynamics is equivalent to ignorance of Shakespeare, or that the Yang-Lee experiment at Columbia should have been a topic of general conversation at college High Tables.
Nor does he propose anything for the education of the scientist, except, of course, science. He does say that scientists need to be “trained not only in scientific but in human terms,” but he does not say how. Scientists—but eventually one begins to wonder if they are really scientists and not advanced technologists and engineers—are to play a decisive part in the affairs of mankind, but nowhere does Sir Charles suggest that, if this is so, they will face difficulties and perplexities and that their education should include the study of books—they need not be “literary,” they need not be “traditional”: they might be contemporary works of history, sociology, anthropology, psychology, philosophy—which would raise the difficult questions and propose the tragic complexity of the human condition, which would suggest that it is not always easy to see the object as in itself it really is.
Well, it isn’t beyond belief that a professional corps of high intellectual quality, especially if it is charged with great responsibility, should learn to ask its own questions and go on to make its own ethos, perhaps a very good one. But Sir Charles would seem to be asking for more than the right of scientists to go their own way. What he seems to require for scientists is the right to go their own way with no questions asked. The culture of literature, having done its worst, must now be supplanted and is not ever to play the part of a loyal opposition. How else are we to understand Sir Charles’s contempt for the irresponsibility of the literary mind, his curious representation of the literary culture as having the management of the Western world, that is to say, as being answerable for all the anomalies, stupidities, and crimes of the Western world, for having made the “existing pattern” which must now be broken if the West is to survive or at least not suffer steep decline? It is manifest that the literary culture has lost the right to ask questions.
No one could possibly suppose of Charles Snow that he is a man who wants to curtail the rights of free criticism. The line which he takes in The Two Cultures is so far from the actuality of his temperament in this respect that we can only suppose that he doesn’t mean it, not in all the extravagance of its literalness. Or we suppose that he means it at the behest of some large preoccupation of whose goodness he is so entirely convinced that he will seek to affirm it even in ways that would take him aback if the preoccupation were not in control of his thought. And this, I think, is the case. I believe that the position of The Two Cultures is to be explained by Sir Charles’s preoccupation—it has become almost the best-known thing about him—with a good and necessary aim, with the assuring of peace, which is to say, with the compounding of the differences between the West and the Soviet Union. It is an aim which, in itself, can of course only do Sir Charles credit, yet it would seem to have implicit in it a strange desperate method of implementing itself.
For the real message of The Two Cultures is that an understanding between the West and the Soviet Union could be achieved by the culture of scientists, which reaches over factitious national and ideological differences. The field of agreement would be the scientists’ common perception of the need for coming together to put the possibilities of the scientific revolution at the disposal of the disadvantaged of all nations. The bond between scientists, Sir Charles has told us, is virtually biological: they all have the future in their bones. Science brings men together in despite of all barriers—speaking of the way in which the very wide differences in the class origins of English scientists were overcome to make the scientific culture of England (and seeming to imply that this is a unique grace of scientists, that English men of letters never had differences of class to overcome), Sir Charles says, “Without thinking about it, they respond alike. That is what a culture means.” And in the same way, “without thinking about it,” the scientists of the West and the scientists of the Soviet Union may be expected to “respond alike.” And, since “that is what a culture means,” they will have joined together in an entity which will do what governments have not done, the work of relieving the misery of the world. But in the degree to which science naturally unites men, literature separates them, and the scientists of the world cannot form this beneficent entity until we of the West break the existing pattern of our traditional culture, the literary culture, which is self-regarding in its complacent acceptance of tragedy, which is not only indifferent to human suffering but willing to inflict it, which asks rude and impertinent questions about the present and even about the future.
It is a point of view that must, I suppose, in desperate days, have a show of reason. In desperate days, it always seems wise to throw something or someone overboard, preferably Jonah or Arion, the prophet or the poet. Mr. G. S. Fraser, for example, seems to understand what Sir Charles wants, and he is rather willing to go along with him, rather open to the idea that the achievement of peace may require some adverse judgment on literature. “It does not matter,” he says, “whether we save the real Cambridge within the actual Cambridge . . . ; what we want to save is our actual human world with all the spots on it. This will not be done by teaching English at universities; men like Snow, at home both in Russia and America, and in a simple blunt way trying to teach these two blunt simple giants to understand each other may in the end prove greater benefactors than Dr. Leavis.”
No, the world will not be saved by teaching English at universities, nor, indeed, by any other literary activity. It is very hard to say what will save the world, and pretty surely it is no one single thing. But we can be perfectly certain that the world will not be saved by denying the actualities of the world. Among these actualities politics is one. And it can be said of The Two Cultures that it communicates the strongest possible wish that we should forget about politics. It mentions national politics once, speaking of it as the clog upon the activity of scientists, as the impeding circumstance in which they must work. But the point is not developed and the lecture has the effect of suggesting that the issue is not between the abilities and good intentions of scientists and the inertia or bad will of governments; the issue is represented as being between the good culture of science and the bad culture of literature.
In this denial of the actuality of politics, Sir Charles is at one with the temper of intellectuals today—we all want politics not to exist, we all want that statement of Hegel’s to be absolutely and immediately true, we dream of Reason taking over the whole management of the world, and soon. No doubt a beneficent eventuality, but our impatience for it is dangerous if it leads us to deny the actuality of politics in the present. While we discuss, at Sir Charles’s instance, the relative merits of scientific Philosopher Kings as against literary Philosopher Kings, politics goes on living its own autonomous life, of which one aspect is its massive resistance to Reason. What is gained by describing the resistance to Reason as other than it is, by thinking in the specious terms of two opposing “cultures”?
But of course the fact is that politics is not finally autonomous. It may be so massively resistant to Reason that we are led to think of its resistance as absolute—in bad times we conceive politics to be nothing but power. Yet it cannot be said—at least not so long as politics relies in any degree upon ideology—that politics is never susceptible to such Reason as is expressed in opinion, only that it is less susceptible in some nations and at some times than in other nations and at other times. And nowhere and at no time is politics exempt from moral judgment, whether or not that judgment is effectual. But if we make believe, as The Two Cultures does, that politics does not exist at all, then it cannot be the object of moral judgment. And if we deny all authority to literature, as The Two Cultures does, going so far as to say that the great traditional agency of moral awareness is itself immoral, then the very activity of moral judgment is impugned, except for that single instance of it which asserts the rightness of bringing the benefits of science to the disadvantaged of the world. In short, Sir Charles, seeking to advance the cause of understanding between the West and the Soviet Union, would seem to be saying that this understanding will come if we conceive both that politics cannot be judged (because it does not really exist) and that it should not be judged (because the traditional agency of judgment is irresponsible).
I judge The Two Cultures to be a book which is mistaken in a very large way indeed. And I find the failure of Dr. Leavis’s criticism of it to consist in his addressing himself not to the full extent of its error but to extraneous matters. From reading the Richmond Lecture one gains the impression that the substance of the Rede Lecture is extremely offensive to Dr. Leavis, that all his sensibilities are outraged by it: we conclude that Sir Charles wants something which is very different from what Dr. Leavis wants, and that Dr. Leavis thinks that what Sir Charles wants is crude and vulgar. But we can scarcely suppose from Dr. Leavis’s response that what Sir Charles says has a very wide reference—for all we can tell, he might have been proposing a change in the university curriculum which Dr. Leavis is repelling with the violence and disgust that are no doubt often felt though not expressed at meetings of curriculum committees. For Dr. Leavis, who has always attached great importance to educational matters, the proposed change is certainly important beyond the university. He understands it both as likely to have a bad effect on the national culture and as being the expression of something already bad in the national culture. But this, we suppose, he would feel about any change in the curriculum.
In short, Dr. Leavis, in dealing with the Rede Lecture, has not seen the object as in itself it really is, just as Sir Charles, in dealing with the culture of literature in its relation to politics, has not seen the object as in itself it really is.
An example of the inadequacy of Dr. Leavis’s criticism of The Two Cultures is his response to what Sir Charles says, in concert with the distinguished scientist, about the political posture of the great writers of the modern period. That statement, if we stop short of its mention of Auschwitz—which makes a most important modification—certainly does have a color of truth. It is one of the cultural curiosities of the first three decades of the 20th century that, while the educated people, the readers of books, tended to become ever more liberal and radical in their thought, there is no literary figure of the very first rank (although many of the next rank) who, in his work, makes use of or gives credence to liberal or radical ideas. I remarked on this circumstance in an essay of 1946. “Our educated class,” I said, “has a ready if mild suspiciousness of the profit motive, a belief in progress, science, social legislation, planning, and international cooperation, perhaps especially where Russia is in question. These beliefs do great credit to those who hold them. Yet it is a comment, if not on our beliefs then on our way of holding them, that not a single first-rate writer has emerged to deal with these ideas, and the emotions that are consonant with them, in a great literary way. . . . If we name those writers who, by the general consent of the most serious criticism, by consent too of the very class of educated people of which we speak, are thought of as the monumental figures of our time, we see that to these writers the liberal ideology has been at best a matter of indifference. 
Proust, Joyce, Lawrence, Yeats, Mann [as novelist], Kafka, Rilke, Gide [also as novelist]—all of them have their own love of justice and the good life, but in not one of them does it take the form of a love of the ideas and emotions which liberal democracy, as known by our educated class, has declared respectable.” To which it can be added that some great writers have in their work given credence or utterance to conservative and even reactionary ideas, and that some in their personal lives maintained a settled indifference to all political issues, or a disdain of them. No reader is likely to derive political light either from the works or the table-talk of a modern literary genius, and some readers (of weak mind) might even be led into bad political ways.
If these writers are to be brought to the bar of judgment, anyone who speaks as their advocate is not, as Sir Charles says, defending the indefensible. The advocacy can be conducted in honest and simple ways. It is not one of these ways to say that literature is by its nature or by definition innocent—it is powerful enough for us to suppose that it has the possibility of doing harm. But the ideational influence of literature is by no means always as direct as, for polemical purposes, people sometimes say it is. As against the dismay of Sir Charles and the distinguished scientist at the reactionary tendencies of modern literary geniuses, there is the fact—a bald one—that the English poets who learned their trade from Yeats and Eliot, or even from Pound, have notably had no sympathy with the social ideas and attitudes of their poetical masters.
Every university teacher of literature will have observed the circumstance that young people who are of radical social and political opinion are virtually never troubled by the opposed views or the settled indifference of the great modern writers. This is not because the young exempt the writer from dealing with the serious problems of living, or because they see him through a mere aesthetic haze. It is because they know—and quite without instruction—that, in D. H. Lawrence’s words, they are to trust the tale and not the teller of the tale. They perceive that the tale is always on the side of their own generous impulses. They know that, if the future is in the bones of anyone, it is in the bones of the literary genius, and exactly because the present is in his bones, exactly because the past is in his bones. They know that if a work of literature has any true artistic existence, it has value as a criticism of life; in whatever complex way it has chosen to speak, it is making a declaration about the qualities that life should have, about the qualities life does not have but should have. They feel, I think, that it is simply not possible for a work of literature that comes within the borders of greatness not to ask for more energy and fineness of life, and, by its own communication of awareness, bring these qualities into being. And if, in their experience of such a work, they happen upon an expression of contempt for some idea which they have connected with political virtue, they are not slow to understand that it is not the idea in its ideal form that is being despised, but the idea as it passes current in specious form, among certain and particular persons. 
I have yet to meet the student committed to an altruistic politics who is alienated from Stephen Dedalus by that young man’s disgust with political idealism, just as I have yet to meet the student from the most disadvantaged background who feels debarred from what Yeats can give him by the poet’s slurs upon shopkeepers or by anything else in his inexhaustible fund of snobbery.
If ever a man was qualified to state the case for literature, and far more persuasively than I have done, it is Dr. Leavis. His career as a critic and a teacher has been devoted exactly to the exposition of the idea that literature presents to us “the possibilities of life,” the qualities of energy and fineness that life might have. And it is, of course, the intention of the Richmond Lecture to say just this in answer to Sir Charles’s indictment. Yet something checks Dr. Leavis. When it is a question of the defense, not of literature in general, but of modern literature, he puts into countervailing evidence nothing more than a passage in which Lawrence says something, in a wry and grudging way, on behalf of social equality. This does not meet the charge; against it Sir Charles might cite a dozen instances in which Lawrence utters what Sir Charles—and perhaps even Dr. Leavis himself—would consider “the most imbecile expressions of anti-social feeling.”
There is only one feasible approach to the anti-social utterances of many modern writers, and that is to consider whether their expressions of anti-social feeling are nothing but imbecile. It is the fact, like it or not, that a characteristic cultural enterprise of our time has been the questioning of society itself, not its particular forms and aspects but its very essence. To this extreme point has the criticism of life extended itself. Of the ways of dealing with this phenomenon, that of horror and dismay, such as Sir Charles’s, is perhaps the least useful. Far better, it seems to me, is the effort to understand what this passionate hostility to society implies, to ask whether it is a symptom, sufficiently gross, of the decline of the West, or whether it is not perhaps an act of critical energy on the part of the West, an act of critical energy on the part of society itself—the effort of society to identify in itself that which is but speciously good, the effort to understand afresh the nature of the life it is designed to foster. I would not anticipate the answer, but these questions make, I am sure, the right way to come at the phenomenon.
It is not the way that Dr. Leavis comes at the phenomenon, despite his saying that the university study of literature must take its stand on the “intellectual-cultural frontier.” Of the two D. H. Lawrences, the one who descended from the social-minded 19th century and who did, in some sort, affirm the social idea, and the other, for whom the condition of salvation was the total negation of society, Dr. Leavis can be comfortable only with the former. For the fact is that his commitment to the intellectual-cultural frontier is sincere but chiefly theoretical; he has, as is well known, sympathy with very few modern writers, and he therefore cannot in good grace come to their defense against Sir Charles’s characterization of them.
Mr. Walter Allen, writing in the New York Times Book Review, has accurately remarked on “the common areas of agreement” between Dr. Leavis and Sir Charles. “One would expect. . . that Snow would be sympathetic to Leavis’s emphasis on the all-importance of the moral center of literature,” Mr. Allen says. “Both have attacked experiment in literature. Neither of them, to put it into crude shorthand, are Flaubert-and-Joyce men.” The similarities go further. In point of social background the two men are not much apart, at least to the untutored American eye. Both spring from the provincial middle class in one or another of its strata, and whatever differences there may have been in the material advantages that were available or lacking to one or the other, neither was reared in the assumption of easy privilege. From these origins they derived, we may imagine, their strong sense of quotidian actuality and a respect for those who discharge the duties it imposes, and a high regard for the domestic affections, a quick dislike of the frivolous and merely elegant. Neither, as I have suggested, has any least responsiveness to the tendencies of modern thought or literature which are existential or subversive. A lively young person of advanced tastes would surely say that if ever two men were committed to England, Home, and Duty, they are Leavis and Snow—he would say that in this they are as alike as two squares.
There is one other regard, an especially significant one, in which they are similar. This is their feeling about social class. One of the chief interests of Sir Charles’s novels is their explicitness about class as a determinative of the personal life, and in this respect The Two Cultures is quite as overt as the novels—its scientists make a new class by virtue of their alienation from the old class attitudes, and Sir Charles’s identification of literary men with the traditional culture which supposedly manages the Western world implies that they are in effect the representatives of an aristocratic ruling class, decadent but still powerful. The work of Dr. Leavis is no less suffused by the idea of social class, even though its preoccupation with the subject is far less explicit. To my recollection, Dr. Leavis does not make use of any of the words which denote the distinctions of English society—he does not refer to an aristocracy, a gentry, an upper-middle or lower-middle or working class. For him a class defines itself by its idea of itself—that is, by its tastes and style. Class is for him a cultural entity. And when he conceives of class power, as he often does, it is not economic or political power but, rather, cultural power that he thinks of. It is true that cultural power presents itself to his mind as being in some way suggestive of class power, but the actualities of power or influence are for him always secondary to the culture from which they arose or to which they give rise.
And indeed, no less than Sir Charles, Dr. Leavis is committed to the creation of a new class. This, we might even say, is the whole motive of his work. The social situation he would seem to assume is one in which there is a fair amount of mobility which is yet controlled and limited by the tendency of the mobile people to allow themselves to be absorbed into one of the traditional classes. As against the attraction exerted by a quasi-aristocratic, metropolitan upper-middle class, Dr. Leavis has taken it to be his function to organize the mobile people, those of them who are gifted and conscious, into a new social class formed on the basis of its serious understanding of and response to literature, chiefly English literature. In this undertaking he has by no means been wholly unsuccessful. One has the impression that many of the students he has trained think of themselves, as they take up their posts in secondary schools and universities, as constituting at least a social cadre.
The only other time I wrote about Dr. Leavis I remarked that the Cromwellian Revolution had never really come to an end in England and that Dr. Leavis was one of the chief colonels of the Roundhead party. His ideal readers are people who “are seriously interested in literature,” and it is on their behalf that he wages war against a cultural-social class which, when it concerns itself with literature, avows its preference for the qualities of grace, lightness, and irony, and deprecates an overt sincerity and seriousness. “To a polished nation,” said Gibbon, “poetry is an amusement of the fancy, not a passion of the soul,” and all through his career it is against everything that Gibbon means by a polished nation and might mean by a polished class that Dr. Leavis has set his face. Bloomsbury has been his characteristic antagonist. But now, in Charles Snow, he confronts an opponent who is as Roundhead as himself, and as earnest and intentional.
To this confrontation Dr. Leavis is not adequate. It is not an adequate response to the massive intention of The Two Cultures for Dr. Leavis to meet Sir Charles’s cultural preferences with his own preferences; or to seek to discredit Sir Charles’s ideas chiefly by making them out to be vulgar ideas or outmoded (Wellsian) ideas; or to offer, as against Sir Charles’s vision of a future made happier by science, the charms of primitive peoples “with their marvellous arts and skills and vital intelligence.” I do not mean to say that Dr. Leavis does not know where Sir Charles goes wrong in the details of his argument—he is as clear as we expect him to be in rebuking that quite massive blunder about the Victorian writers. Nor, certainly, do I mean that Dr. Leavis does not know what the great fundamental mistake of Sir Charles’s position is—he does, and he can be eloquent in asserting against a simplistic confidence in a scientific “future” the need of mankind, in the face of a rapid advance of science and technology, “to be in full intelligent possession of its full humanity (and ‘possession’ here means, not confident ownership of that which belongs to us—our property, but a basic living deference towards that to which, opening as it does into the unknown and itself immeasurable, we know we belong).” But such moments of largeness do not save the Richmond Lecture from its general aspect of dealing with an issue that is essentially parochial. For example, of the almost limitless political implications of Sir Charles’s position it gives no evidence of awareness. And if we undertake to find a reason for the inadequacy of Dr. Leavis’s response, we will find, I think, that it is the same as the reason which accounts for Sir Charles having been in the first place so wholly mistaken in what he says—both men set too much store by the idea of culture as a category of thought.
The concept of culture is an idea of great attractiveness and undoubted usefulness. We may say that it begins in the assumption that all human expressions or artifacts are indicative of some considerable tendencies in the life of social groups or sub-groups, and that what is indicative is also causative—all cultural facts have their consequences. To think in cultural terms is to consider human expressions not only in their overt existence and avowed intention, but in, as it were, their secret life, taking cognizance of the desires and impulses which lie behind the open formulation. In the judgments which we make when we think in the category of culture we rely to a very large extent upon the style in which an expression is made, believing that style will indicate, or betray, what is not intended to be expressed. The aesthetic mode is integral to the idea of culture, and our judgments of social groups are likely to be made chiefly on an aesthetic basis—we like or do not like what we call their life-styles, and even when we judge moralities, the criterion by which we choose between two moralities of, say, equal strictness or equal laxness is likely to be an aesthetic one.
The concept of culture affords to those who use it a sense of the liberation of their thought, for they deal less with abstractions and mere objects, more with the momentous actualities of human feelings as these shape and condition the human community, as they make and as they indicate the quality of man’s existence. Not the least of the attractions of the cultural mode of thought are the passions which attend it—because it assumes that all things are causative or indicative of the whole of the cultural life, it proposes to us those intensities of moralized feeling which seem appropriate to our sense that all that is good in life is at stake in every cultural action. An instance of mediocrity or failure in art or thought is not only what it is but also a sin, deserving to be treated as such. These passions are vivifying; they have the semblance of heroism.
And if we undertake to say what were the circumstances that made the cultural mode of thought as available and as authoritative as it now is, we must refer to Marx, and to Freud, and to the general movement of existentialism, to all that the tendencies of modernity imply of the sense of contingency in life, from which we learn that the one thing that can be disputed, and that is worth disputing, is preference or taste. The Rede Lecture and the Richmond Lecture exemplify the use to which the idea of culture can be put in shaking the old certainties of class, in contriving new social groups on the basis of taste. All this does indeed give the cultural mode of thought a very considerable authority. Yet sometimes we may wonder if it is wholly an accident that so strong an impulse to base our sense of life, and our conduct of the intellectual life, chiefly upon the confrontations of taste should have developed in an age dominated by advertising, the wonderful and terrible art which teaches us that we define ourselves and realize our true being by choosing the right style. In our more depressed moments we might be led to ask whether there is a real difference between being The Person Who defines himself by his commitment to one or another idea of morality, politics, literature, or city-planning, and being The Person Who defines himself by wearing trousers without pleats.
We can, I suppose, no more escape from the cultural mode of thought than we can escape from culture itself. Yet perhaps we must learn to cast a somewhat colder eye upon it for the sake of whatever regard we have for the intellectual life, for the possibility of rational discourse. Sir Charles envisages a new and very powerful social class on the basis of a life-style which he imputes to a certain profession in contrast with the life-style he imputes to another profession, and he goes on from there to deny both the reality of politics and the possibility of its being judged by moral standards. Dr. Leavis answers him with a passion of personal scorn which obscures the greater part of the issue and offers in contradiction truth indeed but truth so hampered and hidden by the defenses of Dr. Leavis’s own choice in life-styles that it looks not much different from a prejudice. And the Spectator correspondents exercise their taste in life-styles and take appropriate sides. It is at such a moment that our dispirited minds yearn to find comfort and courage in the idea of Mind, that faculty whose ancient potency our commitment to the idea of culture denies. To us today, Mind must inevitably seem but a poor gray thing, for it always sought to detach itself from the passions (but not from the emotions, Spinoza said, and explained the difference) and from the conditions of time and place. Yet it is salutary for us to contemplate it, whatever its grayness, because of the bright belief that was once attached to it, that it was the faculty which belonged not to professions, or to social classes, or to cultural groups, but to Man, and that it was possible for men, and becoming to them, to learn its proper use, for it was the means by which they could communicate with each other.
It was on this belief that science based its early existence, and it gave to the men who held it a character which is worth remarking. Sir Charles mentions Faraday among those scientists who overrode the limitations of social class to form the “scientific culture” of England. This is true only so far as it can be made consonant with the fact that Faraday could not have imagined the idea of a “scientific culture” and would have been wholly repelled by it. It is told of Faraday that he refused to be called a physicist; he very much disliked the new name as being too special and particular and insisted on the old one, philosopher, in all its spacious generality: we may suppose that this was his way of saying that he had not overridden the limiting conditions of class only to submit to the limitations of profession. The idea of Mind which had taught the bookbinder’s apprentice to embark on his heroic enterprise of self-instruction also taught the great scientist to place himself beyond the specialness of interest which groups prescribe for their members. Every personal episode in Tyndall’s classic account of his master, Faraday as a Discoverer, makes it plain that Faraday undertook to be, in the beautiful lost sense of the word, a disinterested man. From his belief in Mind, he derived the certitude that he had his true being not as a member of this or that profession or class, but as—in the words of a poet of his time—“a man speaking to men.”
No one now needs to be reminded of what may befall the idea of Mind in the way of excess and distortion. The literature of the 19th century never wearied of telling us just this, of decrying the fatigue and desiccation of spirit which result from an allegiance to Mind that excludes impulse and will, and desire and preference. It was, surely, a liberation to be made aware of this, and then to go on to take serious account of those particularities of impulse and will, of desire and preference, which differentiate individuals and groups—to employ what I have called the cultural mode of thought. We take it for granted that this, like any other mode of thought, has its peculiar dangers, but there is cause for surprise and regret that it should be Sir Charles Snow and Dr. Leavis who have jointly demonstrated how far the cultural mode of thought can go in excess and distortion.
1 Arnold, of course, did not use the word in the modern sense in which it is used by anthropologists, sociologists, and historians of thought and art; this is, more or less, the sense in which it is used by Snow. For Arnold, “culture” was “the best that has been thought and said in the world” and also an individual person’s relation to this body of thought and expression. My own use of the word in this essay is not Arnold’s.
2 In an editorial note, Dr. Leavis is quoted as saying, “The lecture was private and representatives of the press who inquired were informed that there was no admission and that no reporting was to be permitted. The appearance in newspapers of garbled reports has made it desirable that the lecture should appear in full.”
3 Emerson doesn’t deserve Sir Charles’s scorn on this point. His advice to the American scholar was that he should respond positively to the actual and the modern, and he was inclined to take an almost too unreserved pleasure in new forms of human energy and ingenuity. As for Thoreau, his quarrel was not with factories but with farms—and families.
Science, Literature & Culture: A Comment on the Leavis-Snow Controversy
Disorder is the new American normal, then. Questions that appeared to have been settled—about the connection between economic and political liberty, the perils of conspiracism and romantic politics, America’s unique role on the world stage, and so on—are unsettled once more. Serious people wonder out loud whether liberal democracy is worth maintaining at all, with many of them concluding that it is not. The return of ideas that for good reason were buried in the last century threatens the decent political order that has made the U.S. an exceptionally free and prosperous civilization.

For many leftists, America’s commitment to liberty and equality before the law has always masked despotism and exploitation. This view long predated Trump’s rise, and if they didn’t subscribe to it themselves, too often mainstream Democrats and progressives treated its proponents—the likes of Noam Chomsky and Howard Zinn—as beloved and respectable, if slightly eccentric, relatives.
This cynical vision of the free society (as a conspiracy against the dispossessed) was a mainstay of Cold War–era debates about the relative merits of Western democracy and Communism. Soviet apologists insisted that Communist states couldn’t be expected to uphold “merely” formal rights when they had set out to shape a whole new kind of man. That required “breaking a few eggs,” in the words of the Stalinist interrogators in Arthur Koestler’s Darkness at Noon. Anyway, what good were free speech and due process to the coal miner, when under capitalism the whole social structure was rigged against him?
That line worked for a time, until the scale of Soviet tyranny became impossible to justify by anyone but its most abject apologists. It became obvious that “bourgeois justice,” however imperfect, was infinitely preferable to the Marxist alternative. With the Communist experiment discredited, and Western workers uninterested in staging world revolution, the illiberal left began shifting instead to questions of identity. In race-gender-sexuality theory and the identitarian “subaltern,” it found potent substitutes for dialectical materialism and the proletariat. We are still living with the consequences of this shift.
Although there were superficial resemblances, this new politics of identity differed from earlier civil-rights movements. Those earlier movements had sought a place at the American table for hitherto entirely or somewhat excluded groups: blacks, women, gays, the disabled, and so on. In doing so, they didn’t seek to overturn or radically reorganize the table. Instead, they reaffirmed the American Founding (think of Martin Luther King Jr.’s constant references to the Declaration of Independence). And these movements succeeded, owing to America’s tremendous capacity for absorbing social change.
Yet for the new identitarians, as for the Marxists before them, liberal-democratic order was systematically rigged against the downtrodden—now redefined along lines of race, gender, and sexuality, with social class quietly swept under the rug. America’s strides toward racial progress, not least the election and re-election of an African-American president, were dismissed. The U.S. still deserved condemnation because it fell short of perfect inclusion, limitless autonomy, and complete equality—conditions that no free society can achieve given the root fact of human nature. The accidentals had changed from the Marxist days, in other words, but the essentials remained the same.
In one sense, though, the identitarians went further. The old Marxists still claimed to stand on objectively accessible truth. Not so their successors. Following intellectual lodestars such as the gender theorist Judith Butler, the identity left came to reject objective truth—and with it, biological sex differences, aesthetic standards in art, the possibility of universal moral precepts, and much else of the kind. All of these things, the left identitarians said, were products of repressive institutions, hierarchies, and power.
Today’s “social-justice warriors” are heirs to this sordid intellectual legacy. They claim to seek justice. But, unmoored from any moral foundations, SJW justice operates like mob justice and revolutionary terror, usually carried out online. SJWs claim to protect individual autonomy, but the obsession with group identity and power dynamics means that SJW autonomy claims must destroy the autonomy of others. Self-righteousness married to total relativism is a terrifying thing.
It isn’t enough to have legalized same-sex marriage in the U.S. via judicial fiat; the evangelical baker must be forced to bake cakes for gay weddings. It isn’t enough to have won legal protection and social acceptance for the transgendered; the Orthodox rabbi must use preferred trans pronouns on pain of criminal prosecution. Likewise, since there is no objective truth to be gained from the open exchange of ideas, any speech that causes subjective discomfort among members of marginalized groups must be suppressed, if necessary through physical violence. Campus censorship that began with speech codes and mobs that prevented conservative and pro-Israel figures from speaking has now evolved into a general right to beat anyone designated as a “fascist,” on- or off-campus.
For the illiberal left, the election of Donald Trump was indisputable proof that behind America’s liberal pieties lurks, forever, the beast of bigotry. Trump, in this view, wasn’t just an unqualified vulgarian who nevertheless won the decisive backing of voters dissatisfied with the alternative or alienated from mainstream politics. Rather, a vote for Trump constituted a declaration of war against women, immigrants, and other victims of American “structures of oppression.” There would be no attempt to persuade Trump supporters; war would be answered by war.
This isn’t liberalism. Since it can sometimes appear as an extension of traditional civil-rights activism, however, identity leftism has glommed itself onto liberalism. It is frequently impossible to tell where traditional autonomy- and equality-seeking liberalism ends and repressive identity leftism begins. Whether based on faulty thinking or out of a sense of weakness before an angry and energetic movement, liberals have too often embraced the identity left as their own. They haven’t noticed how the identitarians seek to undermine, not rectify, liberal order.
Some on the left, notably Columbia University’s Mark Lilla, are sounding the alarm and calling on Democrats to stress the common good over tribalism. Yet these are a few voices in the wilderness. Identitarians of various stripes still lord it over the broad left, where it is fashionable to believe that the U.S. project is predatory and oppressive by design. If there is a viable left alternative to identity on the horizon, it is the one offered by Sanders and his “Bernie Bros”—which is to say, a reversion to the socialism and class struggle of the previous century.
Americans, it seems, will have to wait a while for reason and responsibility to return to the left.

Then there is the illiberal fever gripping American conservatives. Liberal democracy has always had its critics on the right, particularly in Continental Europe, where statist, authoritarian, and blood-and-soil accounts of conservatism predominate. Mainstream Anglo-American conservatism took a different course. It has championed individual rights, free enterprise, and pluralism while insisting that liberty depends on public virtue and moral order, and that sometimes the claims of liberty and autonomy must give way to those of tradition, state authority, and the common good.
The whole beauty of American order lies in keeping in tension these rival forces that are nevertheless fundamentally at peace. The Founders didn’t adopt wholesale Enlightenment liberalism; rather, they tempered its precepts about universal rights with the teachings of biblical religion as well as Roman political theory. The Constitution drew from all three wellsprings. The product was a whole, and it is a pointless and ahistorical exercise to elevate any one source above the others.
American conservatism and liberalism, then, are in fact branches of each other, the one (conservatism) invoking tradition and virtue to defend and, when necessary, discipline the regime of liberty; the other (liberalism) guaranteeing the open space in which churches, volunteer organizations, philanthropic activity, and other sources of tradition and civic virtue flourish, in freedom, rather than through state establishment or patronage.
One result has been long-term political stability, a blessing that Americans take for granted. Another has been the transformation of liberalism into the lingua franca of all politics, not just at home but across a world that, since 1945, has increasingly reflected U.S. preferences. The great French classical liberal Raymond Aron noted in 1955 that the “essentials of liberalism—the respect for individual liberty and moderate government—are no longer the property of a single party: they have become the property of all.” As Aron archly pointed out, even liberalism’s enemies tend to frame their objections using the rights-based talk associated with liberalism.
Under Trump, however, some in the party of the right have abdicated their responsibility to liberal democracy as a whole. They have reduced themselves to the lowest sophistry in defense of the New Yorker’s inanities and daily assaults on presidential norms. Beginning when Trump clinched the GOP nomination last year, a great deal of conservative “thinking” has amounted to: You did X to us, now enjoy it as we dish it back to you and then some. Entire websites and some of the biggest stars in right-wing punditry are singularly devoted to making this rather base point. If Trump is undermining this or that aspect of liberal order that was once cherished by conservatives, so be it; that 63 million Americans supported him and that the president “drives the left crazy”—these are good enough reasons to go along.
Some of this is partisan jousting that occurs with every administration. But when it comes to Trump’s most egregious statements and conduct—such as his repeated assertions that the U.S. and Putin’s thugocracy are moral equals—the apologetics are positively obscene. Enough pooh-poohing, whataboutery, and misdirection of this kind, and there will be no conservative principle left standing.
More perniciously, as once-defeated illiberal philosophies have returned with a vengeance to the left, so have their reactionary analogues to the right. The two illiberalisms enjoy a remarkable complementarity and even cross-pollinate each other. This has developed to the point where it is sometimes hard to distinguish Tucker Carlson from Chomsky, Laura Ingraham from Julian Assange, the Claremont Review from New Left Review, and so on.
Two slanders against liberalism in particular seem to be gathering strength on the thinking right. The first is the tendency to frame elements of liberal democracy, especially free trade, as a conspiracy hatched by capitalists, the managerial class, and others with soft hands against American workers. One needn’t renounce liberal democracy as a whole to believe this, though believers often go the whole hog. The second idea is that liberalism itself was another form of totalitarianism all along and, therefore, that no amount of conservative course correction can set right what is wrong with the system.
These two theses together represent a dismaying ideological turn on the right. The first—the account of global capitalism as an imposition of power over the powerless—has gained currency in the pages of American Affairs, the new journal of Trumpian thought, where class struggle is a constant theme. Other conservatives, who were always skeptical of free enterprise and U.S.-led world order, such as the Weekly Standard’s Christopher Caldwell, are also publishing similar ideas to a wider reception than perhaps greeted them in the past.
In a March 2017 essay in the Claremont Review of Books, for example, Caldwell flatly described globalization as a “con game.” The perpetrators, he argued, are “unscrupulous actors who have broken promises and seized a good deal of hard-won public property.” These included administrations of both parties that pursued trade liberalization over decades, people who live in cities and therefore benefit from the knowledge-based economy, American firms, and really anyone who has ever thought to capitalize on global supply chains to boost competitiveness—globalists, in a word.
By shipping jobs and manufacturing processes overseas, Caldwell contended, these miscreants had stolen not just material things like taxpayer-funded research but also concepts like “economies of scale” (you didn’t build that!). Thus, globalization in the West differed “in degree but not in kind from the contemporaneous Eastern Bloc looting of state assets.”
That comparison with predatory post-Communist privatization is a sure sign of ideological overheating. It is somewhat like saying that a consumer bank’s lending to home buyers differs in degree but not in kind from a loan shark’s racket in a housing project. Well, yes, in the sense that the underlying activity—moneylending, the purchase of assets—is the same in both cases. But the context makes all the difference: The globalization that began after World War II and accelerated in the ’90s took place within a rules-based system, which duly elected or appointed policymakers in Western democracies designed in good faith and for a whole host of legitimate strategic and economic reasons.
These policymakers knew that globalization was as old as civilization itself. It would take place anyway, and the only question was whether it would be rules-based and efficient or the kind of globalization that would be driven by great-power rivalry and therefore prone to protectionist trade wars. And they were right. What today’s anti-trade types won’t admit is that defeating the Trans-Pacific Partnership and a proposed U.S.-European trade pact known as TTIP won’t end globalization as such; instead, it will cede the game to other powers that are less concerned about rules and fair play.
The postwar globalizers may have gone too far (or not far enough!). They certainly didn’t give sufficient thought to the losers in the system, or how to deal with the de-industrialization that would follow when information became supremely mobile and wages in the West remained too high relative to skills and productivity gains in the developing world. They muddled and compromised their way through these questions, as all policymakers in the real world do.
The point is that these leaders—the likes of FDR, Churchill, JFK, Ronald Reagan, Margaret Thatcher, and, yes, Bill Clinton—acted neither with malice aforethought nor anti-democratically. It isn’t true, contra Caldwell, that free trade necessarily requires “veto-proof and non-consultative” politics. The U.S., Britain, and other members of what used to be called the Free World have respected popular sovereignty (as understood at the time) for as long as they have been trading nations. Put another way, you were far more likely to enjoy political freedom if you were a citizen of one of these states than of countries that opposed economic liberalism in the 20th century. That remains true today. These distinctions matter.
Caldwell and like-minded writers of the right, who tend to dwell on liberal democracies’ crimes, are prepared to tolerate far worse if it is committed in the name of defeating “globalism.” Hence the speech on Putin that Caldwell delivered this spring at a Hillsdale College gathering in Phoenix. Promising not to “talk about what to think about Putin,” he proceeded to praise the Russian strongman as the “preeminent statesman of our time” (alongside Turkish strongman Recep Tayyip Erdogan). Putin, Caldwell said, “has become a symbol of national self-determination.”
Then Caldwell made a remark that illuminates the link between the illiberalisms of yesterday and today. Putin is to “populist conservatives,” he declared, what Castro once was to progressives. “You didn’t have to be a Communist to appreciate the way Castro, whatever his excesses, was carving out a space of autonomy for his country.”
Whatever his excesses, indeed.

The other big idea is that today’s liberal crises aren’t a bug but a core feature of liberalism. This line of thinking is particularly prevalent among some Catholic traditionalists and other orthodox Christians (both small- and capital-“o”). The common denominator, it seems to me, is having grown up as a serious believer at a time when many liberals—to their shame—have declared war on faith generally and social conservatism in particular.
The argument essentially is this:
We (social conservatives, traditionalists) saw the threat from liberalism coming. With its claims about abstract rights and universal reason, classical liberalism had always posed a danger to the Church and to people of God. We remembered what those fired up by the new ideas did to our nuns and altars in France. Still we made peace with American liberal order, because we were told that the Founders had “built on low but solid ground,” to borrow Leo Strauss’s famous formulation, or that they had “built better than they knew,” as American Catholic hierarchs in the 19th century put it.
Maybe these promises held good for a couple of centuries, the argument continues, but they no longer do. Witness the second sexual revolution under way today. The revolutionaries are plainly telling us that we must either conform our beliefs to Herod’s ways or be driven from the democratic public square. Can it still be said that the Founding rested on solid ground? Did the Founders really build better than they knew? Or is what is passing now precisely what they intended, the rotten fruit of the Enlightenment universalism that they planted in the Constitution? We don’t love Trump (or Putin, Hungary’s Viktor Orbán, etc.), but perhaps he can counter the pincer movement of sexual and economic liberalism, and restore a measure of solidarity and commitment to the Western project.
The most pessimistic of these illiberal critics go so far as to argue that liberalism isn’t all that different from Communism, that both are totalitarian children of the Enlightenment. One such critic, Harvard Law School’s Adrian Vermeule, summed up this position in a January essay in First Things magazine:
The stock distinction between the Enlightenment’s twins—communism is violently coercive while liberalism allows freedom of thought—is glib. Illiberal citizens, trapped [under liberalism] without exit papers, suffer a narrowing sphere of permitted action and speech, shrinking prospects, and increasing pressure from regulators, employers, and acquaintances, and even from friends and family. Liberal society celebrates toleration, diversity, and free inquiry, but in practice it features a spreading social, cultural, and ideological conformism.1
I share Vermeule’s despair and that of many other conservative-Christian friends, because there have been genuinely alarming encroachments against conscience, religious freedom, and the dignity of life in Western liberal democracies in recent years. Even so, despair is an unhelpful companion to sober political thought, and the case for plunging into political illiberalism is weak, even on social-conservative grounds.
Here again what commends liberalism is historical experience, not abstract theory. Simply put, in the real-world experience of the 20th century, the Church, tradition, and religious minorities fared far better under liberal-democratic regimes than they did under illiberal alternatives. Are coercion and conformity targeting people of faith under liberalism? To be sure. But these don’t take the form of the gulag or the concentration camp or the soccer stadium–cum-killing field. Catholic political practice knows well how to draw such moral distinctions between regimes: Pope John Paul II befriended Reagan. If liberal democracy and Communism were indeed “twins” whose distinctions are “glib,” why did he do so?
And as Pascal Bruckner wrote in his essay “The Tyranny of Guilt,” if liberal democracy does trap or jail you (politically speaking), it also invariably slips the key under your cell door. The Swedish midwives driven out of the profession over their pro-life views can take their story to the media. The Down syndrome advocacy outfit whose anti-eugenic advertising was censored in France can sue in national and then international courts. The Little Sisters of the Poor can appeal to the Supreme Court for a conscience exemption to Obamacare’s contraceptives mandate. And so on.
Conversely, once you go illiberal, you don’t just rid yourself of the NGOs and doctrinaire bureaucrats bent on forcing priests to perform gay marriages; you also lose the legal guarantees that protect the Church, however imperfectly, against capricious rulers and popular majorities. And if public opinion in the West is turning increasingly secular, indeed anti-Christian, as social conservatives complain and surveys seem to confirm, is it really a good idea to militate in favor of a more illiberal order rather than defend tooth and nail liberal principles of freedom of conscience? For tomorrow, the state might fall into Elizabeth Warren’s hands.
Nor, finally, is political liberalism alone to blame for the Church’s retreating on various fronts. There have been plenty of wounds inflicted by churchmen and laypeople, who believed that they could best serve the faith by conforming its liturgy, moral teaching, and public presence to liberal order. But political liberalism didn’t compel these changes, at least not directly. In the space opened up by liberalism, and amid the kaleidoscopic lifestyles that left millions of people feeling empty and confused, it was perfectly possible to propose tradition as an alternative. It is still possible to do so.

None of this is to excuse the failures of liberals. Liberals and mainstream conservatives must go back to the drawing board, to figure out why it is that thoughtful people have come to conclude that their system is incompatible with democracy, nationalism, and religious faith. Traditionalists and others who see Russia’s mafia state as a defender of Christian civilization and national sovereignty have been duped, but liberals bear some blame for driving large numbers of people in the West to that conclusion.
This is a generational challenge for the liberal project. So be it. Liberal societies like America’s by nature invite such questioning. But before we abandon the 200-and-some-year-old liberal adventure, it is worth examining the ways in which today’s left-wing and right-wing critiques of it mirror bad ideas that were overcome in the previous century. The ideological ferment of the moment, after all, doesn’t relieve the illiberals of the responsibility to reckon with the lessons of the past.
1 Vermeule was reviewing The Demon in Democracy, a 2015 book by the Polish political theorist and parliamentarian Ryszard Legutko that makes the same case. Fred Siegel’s review of the English edition appeared in our June 2016 issue.
How the courts are intervening to block some of the most unjust punishments of our time
Barrett’s decision marked the 59th judicial setback for a college or university since 2013 in a due-process lawsuit brought by a student accused of sexual assault. (In four additional cases, the school settled a lawsuit before any judicial decision occurred.) This body of law serves as a towering rebuke to the Obama administration’s reinterpretation of Title IX, the 1972 law barring sex discrimination in schools that receive federal funding.
Beginning in 2011, the Education Department’s Office for Civil Rights (OCR) issued a series of “guidance” documents pressuring colleges and universities to change how they adjudicated sexual-assault cases in ways that increased the likelihood of guilty findings. Amid pressure from student and faculty activists, virtually all elite colleges and universities have gone far beyond federal mandates and have even further weakened the rights of students accused of sexual assault.
Like all extreme victims’-rights approaches, the new policies had the greatest impact on the wrongly accused. A 2016 study from UCLA public-policy professor John Villasenor used just one of the changes—schools employing the lowest standard of proof, a preponderance of the evidence—to predict that as often as 33 percent of the time, campus Title IX tribunals would return guilty findings in cases involving innocent students. Villasenor’s study could not measure the impact of other Obama-era policy demands—such as allowing accusers to appeal not-guilty findings, discouraging cross-examination of accusers, and urging schools to adjudicate claims even when a criminal inquiry found no wrongdoing.
In a September 7 address at George Mason University, Education Secretary Betsy DeVos stated that “no student should be forced to sue their way to due process.” But once enmeshed in the campus Title IX process, a wrongfully accused student’s best chance for justice may well be a lawsuit filed after his college incorrectly has found him guilty. (According to data from United Educators, a higher-education insurance firm, 99 percent of students accused of campus sexual assault are male.) The Foundation for Individual Rights in Education has identified more than 180 such lawsuits filed since the 2011 policy changes. That figure, obviously, excludes students with equally strong claims whose families cannot afford to go to court. These students face life-altering consequences. As Judge T.S. Ellis III noted in a 2016 decision, it is “so clear as to be almost a truism” that a student will lose future educational and employment opportunities if his college wrongly brands him a rapist.
“It is not the role of the federal courts to set aside decisions of school administrators which the court may view as lacking in wisdom or compassion.” So wrote the Supreme Court in a 1975 case, Wood v. Strickland. While the Supreme Court has made clear that colleges must provide accused students with some rights, especially when dealing with nonacademic disciplinary questions, courts generally have not been eager to intervene in such matters.
This is what makes the developments of the last four years all the more remarkable. The process began in May 2013, in a ruling against St. Joseph’s University, and has lately accelerated (15 rulings in 2016 and 21 thus far in 2017). Of the 40 setbacks for colleges in federal court, 14 came from judges nominated by Barack Obama, 11 from Clinton nominees, and nine from selections of George W. Bush. Brown University has been on the losing side of three decisions; Duke, Cornell, and Penn State, two each.
Court decisions since the expansion of Title IX activism have not all gone in one direction. In 36 of the due-process lawsuits, courts have permitted the university to maintain its guilty finding. (In four other cases, the university settled despite prevailing at a preliminary stage.) But even in these cases, some courts have expressed discomfort with campus procedures. One federal judge was “greatly troubled” that Georgia Tech veered “very far from an ideal representation of due process” when its investigator “did not pursue any line of investigation that may have cast doubt on [the accuser’s] account of the incident.” Another went out of his way to say that he considered it plausible that a former Case Western Reserve University student was actually “innocent of the charges levied against him.” And one state appellate judge opened oral argument by bluntly informing the University of California’s lawyer, “When I . . . finished reading all the briefs in this case, my comment was, ‘Where’s the kangaroo?’”
Judges have, obviously, raised more questions in cases where the college has found itself on the losing side. Those lawsuits have featured three common areas of concern: bias in the investigation, resulting in a college decision based on incomplete evidence; procedures that prevented the accused student from challenging his accuser’s credibility, chiefly through cross-examination; and schools utilizing a process that seemed designed to produce a predetermined result, in response to real or perceived pressure from the federal government.

Colleges and universities have proven remarkably willing to act on incomplete information when adjudicating sexual-assault cases. In December 2013, for example, Amherst College expelled a student for sexual assault despite text messages (which the college investigator failed to discover) indicating that the accuser had consented to sexual contact. The accuser’s own testimony also indicated that she might have committed sexual assault, by initiating sexual contact with a student who Amherst conceded was experiencing an alcoholic blackout. When the accused student sued Amherst, the college said its failure to uncover the text messages had been irrelevant because its investigator had only sought texts that portrayed the incident as nonconsensual. In February, Judge Mark Mastroianni allowed the accused student’s lawsuit to proceed, commenting that the texts could raise “additional questions about the credibility of the version of events [the accuser] gave during the disciplinary proceeding.” The two sides settled in late July.
Amherst was hardly alone in its eagerness to avoid evidence that might undermine the accuser’s version of events; the same happened at Penn State, St. Joseph’s, Duke, Ohio State, Occidental, Lynn, Marlboro, Michigan, and Notre Dame.
Even in cases with a more complete evidentiary base, accused students have often been blocked from presenting a full-fledged defense. As part of its reinterpretation of Title IX, the Obama administration sought to shield campus accusers from cross-examination. OCR’s 2011 guidance “strongly” discouraged direct cross-examination of accusers by the accused student—a critical restriction, since most university procedures require the accused student, rather than his lawyer, to defend himself in the hearing. OCR’s 2014 guidance suggested that this type of cross-examination in and of itself could create a hostile environment. The Obama administration even spoke favorably about the growing trend among schools to abolish hearings altogether and allow a single official to serve as investigator, prosecutor, judge, and jury in sexual-assault cases.
The Supreme Court has never held that campus disciplinary hearings must permit cross-examination. Nonetheless, the recent attack on the practice has left schools struggling to explain why they would not want to utilize what the Court has described as the “greatest legal engine ever invented for the discovery of truth.” In June 2016, the University of Cincinnati found a student guilty of sexual assault after a hearing at which neither his accuser nor the university’s Title IX investigator appeared. In an unintentionally comical line, the hearing chair noted the absent witnesses before asking the accused student if he had “any questions of the Title IX report.” The student, befuddled, replied, “Well, since she’s not here, I can’t really ask anything of the report.” (The panel chair did not indicate how the “report” could have answered any questions.) Cincinnati found the student guilty anyway.1
Limitations on full cross-examination also played a role in judicial setbacks for Middlebury, George Mason, James Madison, Ohio State, Occidental, Penn State, Brandeis, Amherst, Notre Dame, and Skidmore.
Finally, since 2011, more than 300 students have filed Title IX complaints with the Office for Civil Rights, alleging mishandling of their sexual-assault allegations by their college. OCR’s leadership seemed to welcome the complaints, which allowed Obama officials to inspect not only the individual case but also all sexual-assault claims at the school in question over a three-year period. Northwestern University professor Laura Kipnis has estimated that during the Obama years, colleges spent between $60 million and $100 million on these investigations. A finding of a Title IX violation can lead to a loss of federal funding. This has led Harvard Law professors Jeannie Suk Gersen, Janet Halley, Elizabeth Bartholet, and Nancy Gertner to observe in a white paper submitted to OCR that universities have “strong incentives to ensure the school stays in OCR’s good graces.”
One of the earliest lawsuits after the Obama administration’s policy shift, involving former Xavier University basketball player Dez Wells, demonstrated how an OCR investigation can affect the fairness of a university inquiry. The accuser’s complaint had been referred both to Xavier’s Title IX office and the Cincinnati police. The police concluded that the allegation was meritless; Hamilton County Prosecuting Attorney Joseph Deters later said he considered charging the accuser with filing a false police report.
Deters asked Xavier to delay its proceedings until his office completed its investigation. School officials refused. Instead, three weeks after the initial allegation, the university expelled Wells. He sued and speculated that Xavier’s haste came not from a quest for justice but instead from a desire to avoid difficulties in finalizing an agreement with OCR to resolve an unrelated complaint filed by two female Xavier students. (In recent years, OCR has entered into dozens of similar resolution agreements, which bind universities to policy changes in exchange for removing the threat of losing federal funds.) In a July 2014 ruling, Judge Arthur Spiegel observed that Xavier’s disciplinary tribunal, however “well-equipped to adjudicate questions of cheating, may have been in over its head with relation to an alleged false accusation of sexual assault.” Soon thereafter, the two sides settled; Wells transferred to the University of Maryland.
Ohio State, Occidental, Cornell, Middlebury, Appalachian State, USC, and Columbia have all found themselves on the losing side of court decisions arising from cases that originated during a time in which OCR was investigating or threatening to investigate the school. (In the Ohio State case, one university staffer testified that she didn’t know whether she had an obligation to correct a false statement by an accuser to a disciplinary panel.) Pressure from OCR can be indirect, as well. The Obama administration interpreted federal law as requiring all universities to have at least one Title IX coordinator; larger universities now employ dozens of Title IX personnel who, as the Harvard Law professors explained, “have reason to fear for their jobs if they hold a student not responsible or if they assign a rehabilitative or restorative rather than a harshly punitive sanction.”

Amid the wave of judicial setbacks for universities, two decisions in particular stand out. Easily the most powerful opinion in a campus due-process case came in March 2016 from Judge F. Dennis Saylor. While the stereotypical campus sexual-assault allegation results from an alcohol-filled, one-night encounter between a male and a female student, a case at Brandeis University involved a long-term monogamous relationship between two male students. A bad breakup led to the accusing student’s filing the following complaint, against which his former boyfriend was expected to provide a defense: “Starting in the month of September, 2011, the Alleged violator of Policy had numerous inappropriate, nonconsensual sexual interactions with me. These interactions continued to occur until around May 2013.”
To adjudicate, Brandeis hired a former OCR staffer, who interviewed the two students and a few of their friends. Since the university did not hold a hearing, the investigator decided guilt or innocence on her own. She treated each incident as if the two men were strangers to each other, which allowed her to determine that sexual “violence” had occurred in the relationship. The accused student, she found, sometimes looked at his boyfriend in the nude without permission and sometimes awakened his boyfriend with kisses when the boyfriend wanted to stay asleep. The university’s procedures prevented the student from seeing the investigator’s report, with its absurdly broad definition of sexual misconduct, in preparing his appeal. “In the context of American legal culture,” Boston Globe columnist Dante Ramos later argued, denying this type of information “is crazy.” “Standard rules of evidence and other protections for the accused keep things like false accusations or mistakes by authorities from hurting innocent people.” When the university appeal was denied, the student sued.
At an October 2015 hearing to consider the university’s motion to dismiss, Saylor seemed flabbergasted at the unfairness of the school’s approach. “I don’t understand,” he observed, “how a university, much less one named after Louis Brandeis, could possibly think that that was a fair procedure to not allow the accused to see the accusation.” Brandeis’s lawyer cited pressure to conform to OCR guidance, but the judge deemed the university’s procedures “closer to Salem 1692 than Boston, 2015.”
The following March, Saylor issued an 89-page opinion that has been cited in virtually every lawsuit subsequently filed by an accused student. “Whether someone is a ‘victim’ is a conclusion to be reached at the end of a fair process, not an assumption to be made at the beginning,” Saylor wrote. “If a college student is to be marked for life as a sexual predator, it is reasonable to require that he be provided a fair opportunity to defend himself and an impartial arbiter to make that decision.” Saylor concluded that Brandeis forced the accused student “to defend himself in what was essentially an inquisitorial proceeding that plausibly failed to provide him with a fair and reasonable opportunity to be informed of the charges and to present an adequate defense.”
The student, vindicated by the ruling’s sweeping nature, then withdrew his lawsuit. He currently is pursuing a Title IX complaint against Brandeis with OCR.
Four months later, a three-judge panel of the Second Circuit Court of Appeals produced an opinion that lacked Saylor’s rhetorical flourish or his understanding of the basic unfairness of the campus Title IX process. But by creating a more relaxed standard for accused students to make federal Title IX claims, the Second Circuit’s decision in Doe v. Columbia carried considerable weight.
Two Columbia students who had been drinking had a brief sexual encounter at a party. More than four months later, the accuser claimed she was too intoxicated to have consented. Her allegation came in an atmosphere of campus outrage about the university’s allegedly insufficient toughness on sexual assault. In this setting, the accused student found Columbia’s Title IX investigator uninterested in hearing his side of the story. He cited witnesses who would corroborate his belief that the accuser wasn’t intoxicated; the investigator declined to speak with them. The student was found guilty, although for reasons differing from the initial claim; the Columbia panel ruled that he had “directed unreasonable pressure for sexual activity toward the [accuser] over a period of weeks,” leaving her unable to consent on the night in question. He received a three-semester suspension for this nebulous offense—which even his accuser deemed too harsh. He sued, and the case was assigned to Judge Jesse Furman.
Furman’s opinion provided a ringing victory for Columbia and the Obama-backed policies it used. As Title IX litigator Patricia Hamill later observed, Furman’s “almost impossible standard” required accused students to have inside information about the institution’s handling of other sexual-assault claims—information they could plausibly obtain only through the legal process known as discovery, which happens at a later stage of litigation—in order to survive a university’s initial motion to dismiss. Furman suggested that, to prevail, an accused student would need to show that his school treated a female student accused of sexual assault more favorably, or at least provide details about how cases against other accused students showed a pattern of bias. But federal privacy law keeps campus disciplinary hearings private, leaving most accused students with little opportunity to uncover the information before their case is dismissed.
At the same time, the opinion excused virtually any degree of unfairness by the institution. Furman reasoned that taking “allegations of rape on campus seriously and . . . treat[ing] complainants with a high degree of sensitivity” could constitute “lawful” reasons for university unfairness toward accused students. Samantha Harris of the Foundation for Individual Rights in Education detected the decision’s “immediate and nationwide impact” in several rulings against accused students. It also played the same role in university briefs that Saylor’s Brandeis opinion did in filings by accused students.
The Columbia student’s lawyer, Andrew Miltenberg, appealed Furman’s ruling to the Second Circuit. The stakes were high, since a ruling affirming the lower court’s reasoning would have all but foreclosed Title IX lawsuits by accused students in New York, Connecticut, and Vermont. But a panel of three judges, all nominated by Democratic presidents, overturned Furman’s decision. In the opinion’s crucial passage, Judge Pierre Leval held that a university “is not excused from liability for discrimination because the discriminatory motivation does not result from a discriminatory heart, but rather from a desire to avoid practical disadvantages that might result from unbiased action. A covered university that adopts, even temporarily, a policy of bias favoring one sex over the other in a disciplinary dispute, doing so in order to avoid liability or bad publicity, has practiced sex discrimination, notwithstanding that the motive for the discrimination did not come from ingrained or permanent bias against that particular sex.” Before the Columbia decision, courts almost always had rebuffed Title IX pleadings from accused students. More recently, judges have allowed Title IX claims to proceed against Amherst, Cornell, California–Santa Barbara, Drake, and Rollins.
After the Second Circuit’s decision, Columbia settled with the accused student, sparing its Title IX decision-makers from having to testify at a trial. James Madison was one of the few universities to take a different course, with disastrous results. A lawsuit from an accused student survived a motion to dismiss, but the university refused to settle, allowing the student’s lawyer to depose the three school employees who had decided his client’s fate. One unintentionally revealed that he had misapplied the university’s own definition of consent. Another cited the importance of the accuser’s slurring words on a voicemail, thus proving her extreme intoxication on the night of the alleged assault. It was left to the accused student’s lawyer, at a deposition months after the decision had been made, to note that the voicemail in question actually was received on a different night. In December 2016, Judge Elizabeth Dillon, an Obama nominee, granted summary judgment to the accused student, concluding that “significant anomalies in the appeal process” violated his due-process rights under the Constitution.

Universities were on the losing side of 36 due-process rulings when Obama appointee Catherine Lhamon was presiding over the Office for Civil Rights between 2013 and 2016; no record exists of her publicly acknowledging any of them. In June 2017, however, Lhamon suddenly rejoiced that “yet another federal court” had found that students disciplined for sexual misconduct “were not denied due process.” That Fifth Circuit decision, involving two former students at the University of Houston, was an odd case for her to celebrate. The majority cabined its findings to the “unique facts” of the case—that the accused students likely would have been found guilty even under the fairest possible process.
And the dissent, from Judge Edith Jones, denounced the procedures championed by Lhamon and other Obama officials as “heavily weighted in favor of finding guilt,” predicting “worse to come if appellate courts do not step in to protect students’ procedural due process right where allegations of quasi-criminal sexual misconduct arise.”
At this stage, Lhamon, who now chairs the U.S. Commission on Civil Rights, cannot be taken seriously when it comes to questions of campus due process. But other defenders of the current Title IX regime have offered more substantive commentary about the university setbacks.
Legal scholar Michelle Anderson was one of the few to even discuss the due-process decisions. “Colleges and universities do not always adjudicate allegations of sexual assault well,” she noted in a 2016 law review article defending the Obama-era policies. Anderson even conceded that some colleges had denied “accused students fairness in disciplinary adjudication.” But these students sued, “and campuses are responding—as they must—when accused students prevail. So campuses face powerful legal incentives on both sides to address campus sexual assault, and to do so fairly and impartially.”
This may be true, but Anderson does not explain why wrongly accused students should bear the financial and emotional burden of inducing their colleges to implement fair procedures. More important, scant evidence exists that colleges have responded to the court victories of wrongly accused students by creating fairer procedures. Some have even made it more difficult for wrongly accused students to sue. After losing a lawsuit in December 2014, Brown eliminated the right of students accused of sexual assault to have “every opportunity” to present evidence. That same year, an accused student showed how Swarthmore had deviated from its own procedures in his case. The college quickly settled the lawsuit—and then added a clause to its procedures immunizing it from similar claims in the future. Swarthmore currently informs accused students that “rules of evidence ordinarily found in legal proceedings shall not be applied, nor shall any deviations from any of these prescribed procedures alone invalidate a decision.”
Many lawsuits are still working their way through the judicial system; three cases are pending at federal appellate courts. Of the two that address substantive matters, oral arguments seemed to reveal skepticism of the university’s position. On July 26, a three-judge panel of the First Circuit considered a case at Boston College, where the accused student plausibly argued that someone else had committed the sexual assault (which occurred on a poorly lit dance floor). Judges Bruce Selya and William Kayatta seemed troubled that a Boston College dean had improperly intruded on the hearing board’s deliberations. At the Sixth Circuit a few days later, Judges Richard Griffin and Amul Thapar both expressed concerns about the University of Cincinnati’s downplaying the importance of cross-examination in campus-sex adjudications. Judge Eric Clay was quieter, but he wondered about the tension between the university’s Title IX and truth-seeking obligations.
In a perfect world, academic leaders themselves would have created fairer processes without judicial intervention. But in the current campus environment, such an approach is impossible. So, at least for the short term, the courts remain the best, albeit imperfect, option for students wrongly accused of sexual assault. Meanwhile, every year, young men entrust themselves and their family’s money to institutions of higher learning that are indifferent to their rights and unconcerned with the injustices to which these students might be subjected.
1 After a district court placed that finding on hold, the university appealed to the Sixth Circuit.
Review of 'Terror in France' By Gilles Kepel
Kepel is particularly knowledgeable about the history and process of radicalization that takes place in his nation’s heavily Muslim banlieues (the depressed housing projects ringing Paris and other major cities), and Terror in France is informed by decades of fieldwork in these volatile locales. What we have been witnessing for more than a decade, Kepel argues, is the “third wave” of global jihadism, which is not so much a top-down doctrinally inspired campaign (as were the 9/11 attacks, directed from afar by the oracular figure of Osama bin Laden) but a bottom-up insurgency with an “enclave-based ethnic-racial logic of violence” to it. Kepel traces the phenomenon back to 2005, a convulsive year that saw the second-generation descendants of France’s postcolonial Muslim immigrants confront a changing socio-political landscape.
That was the year of the greatest riots in modern French history, involving mostly young Muslim men. It was also the year that Abu Musab al-Suri, the Syrian-born Islamist then serving as al-Qaeda’s operations chief in Europe, published The Global Islamic Resistance Call. This 1,600-page manifesto combined pious imprecations against the West with do-it-yourself ingenuity, an Anarchist’s Cookbook for the Islamist set. In Kepel’s words, the manifesto preached a “jihadism of proximity,” the brand of civil war later adopted by the Islamic State. It called for ceaseless, mass-casualty attacks in Western cities—attacks which increase suspicion and regulation of Muslims and, in turn, drive those Muslims into the arms of violent extremists.
The third-generation jihad has been assisted by two phenomena: social-networking sites that easily and widely disseminate Islamist propaganda (thus increasing the rate of self-radicalization) and the so-called Arab Spring, which led to state collapse in Syria and Libya, providing “an exceptional site for military training and propaganda only a few hours’ flight from Europe, and at a very low cost.”
Kepel’s book is not just a study of the ideology and tactics of Islamists but a sociopolitical overview of how this disturbing phenomenon fits within a country on the brink. For example, Kepel finds that jihadism is emerging in conjunction with developments such as the “end of industrial society.” A downturn in work has led to an ominous situation in which a “right-wing ethnic nationalism” preying on the economically anxious has risen alongside Islamism as “parallel conduits for expressing grievances.” Filling a space left by the French Communist Party (which once brought the ethnic French working class and Arab immigrants together), these two extremes leer at each other from opposite sides of a societal chasm, signaling the potentially cataclysmic future that awaits France if both mass unemployment and Islamist terror continue undiminished.
The French economy has also had a more direct inciting effect on jihadism. Overregulated labor markets make it difficult for young Muslims to get jobs, thus exacerbating the conditions of social deprivation and exclusion that make individuals susceptible to radicalization. The inability to tackle chronic unemployment has led to widespread Muslim disillusionment with the left (a disillusionment aggravated by another, often glossed over, factor: widespread Muslim opposition to the Socialist Party’s championing of same-sex marriage). Essentially, one left-wing constituency (unions) has made the unemployment of another constituency (Muslim youth) the mechanism for maintaining its privileges.
Kepel does not, however, cite deprivation as the sole or even main contributing factor to Islamist radicalization. One Parisian banlieue that has sent more than 80 residents to fight in Syria, he notes, has “attractive new apartment buildings” built by the state and features a mosque “constructed with the backing of the Socialist mayor.” It is also the birthplace of well-known French movie stars of Arab descent, and thus hardly a place where ambition goes to die. “The Islamophobia mantra and the victim mentality it reinforces makes it possible to rationalize a total rejection of France and a commitment to jihad by making a connection between unemployment, discrimination, and French republican values,” Kepel writes. Indeed, Kepel is refreshingly derisive of the term “Islamophobia” throughout the book, excoriating Islamists and their fellow travelers for “substituting it for anti-Semitism as the West’s cardinal sin.” These are meaningful words coming from Kepel, a deeply learned scholar of Islam who harbors great respect for the faith and its adherents.
Kepel also weaves the saga of jihadism into the ongoing “kulturkampf within the French left.” Arguments about Islamist terrorism demonstrate a “divorce between a secular progressive tradition” and the children of the Muslim immigrants this tradition fought to defend. The most ironically perverse manifestation of this divorce was ISIS’s kidnapping of Didier François, co-founder of the civil-rights organization SOS Racisme. Kepel recognizes the origins of this divorce in the “red-green” alliance formed decades ago between Islamists and elements of the French intellectual left, such as Michel Foucault, a cheerleader of the Iranian revolution.
Though he offers a rigorous history and analysis of the jihadist problem, Kepel is generally at a loss for solutions. He decries a complacent French elite, with its disregard for genuine expertise (evidenced by the decline in institutional academic support for Islamicists and Arabists) and the narrow, relatively impenetrable way in which it perpetuates itself, chiefly through a single school (the École normale supérieure) that practically every French politician must attend. Despite France’s admirable republican values, this has made the process of assimilation rather difficult. But other than wishing that the public education system become more effective and inclusive at instilling republican values, Kepel provides little in the way of suggestions as to how France might emerge from this mess. That a scholar of such erudition and humanity can do little but throw up his hands and issue a sigh of despair cannot bode well. The third-generation jihad owes as much to the political breakdown in France as it does to the meltdown in the Middle East. Defeating this two-headed beast requires a new and comprehensive playbook: the West’s answer to The Global Islamic Resistance Call. That book has yet to be written.
President Trump, in case you haven’t noticed, has a tendency to exaggerate. Nothing is “just right” or “meh” for him. Buildings, crowds, election results, and military campaigns are always outsized, gargantuan, larger, and more significant than you might otherwise assume. “People want to believe that something is the biggest and the greatest and the most spectacular,” he wrote 30 years ago in The Art of the Deal. “I call it truthful hyperbole. It’s an innocent form of exaggeration—and a very effective form of promotion.”
So effective, in fact, that the press has picked up the habit. Reporters and editors agree with the president that nothing he does is ordinary. After covering Trump for more than two years, they still can’t accept him as a run-of-the-mill politician. And while there are aspects of Donald Trump and his presidency that are, to say the least, unusual, the media seem unable to distinguish between the abnormal and significant—firing the FBI director in the midst of an investigation into one’s presidential campaign, for example—and the commonplace.
Consider the fiscal deal President Trump struck with Democratic leaders in early September.
On September 6, the president held an Oval Office meeting with Vice President Pence, Treasury Secretary Mnuchin, and congressional leaders of both parties. He had to find a way to (a) raise the debt ceiling, (b) fund the federal government, and (c) spend money on hurricane relief. The problem is that a bloc of House Republicans won’t vote for (a) unless the increase is accompanied by significant budget cuts, which interferes with (b) and (c). To raise the debt ceiling, then, requires Democratic votes. And the debt ceiling must be raised. “There is zero chance—no chance—we will not raise the debt ceiling,” Senate Majority Leader Mitch McConnell said in August.
The meeting went like this. First House Speaker Paul Ryan asked for an 18-month increase in the debt ceiling so Republicans wouldn’t have to vote again on the matter until after the midterm elections. Democrats refused. The bargaining continued until Ryan asked for a six-month increase. The Democrats remained stubborn. So Trump, always willing to kick a can down the road, interrupted Mnuchin to offer a three-month increase, a continuing resolution that will keep the government open through December, and about $8 billion in hurricane money. The Democrats said yes.
That, anyway, is what happened. But the media are not satisfied to report what happened. They want—they need—to tell you what it means. And what does it mean? Well, they aren’t really sure. But it’s something big. It’s something spectacular. For example:
1. “Trump Bypasses Republicans to Strike Deal on Debt Limit and Harvey Aid” was the headline of a story for the New York Times by Peter Baker, Thomas Kaplan, and Michael D. Shear. “The deal to keep the government open and paying its debts until Dec. 15 represented an extraordinary public turn for the president, who has for much of his term set himself up on the right flank of the Republican Party,” their article began. Fair enough. But look at how they import speculation and opinion into the following sentence: “But it remained unclear whether Mr. Trump’s collaboration with Democrats foreshadowed a more sustained shift in strategy by a president who has presented himself as a master dealmaker or amounted to just a one-time instinctual reaction of a mercurial leader momentarily eager to poke his estranged allies.”
2. “The decision was one of the most fascinating and mysterious moves he’s made with Congress during eight months in office,” reported Jeff Zeleny, Dana Bash, Deirdre Walsh, and Jeremy Diamond for CNN. Thanks for sharing!
3. “Trump budget deal gives GOP full-blown Stockholm Syndrome,” read the headline of Tina Nguyen’s piece for Vanity Fair. “Donald Trump’s unexpected capitulation to new best buds ‘Chuck and Nancy’ has thrown the Grand Old Party into a frenzy as Republicans search for explanations—and scapegoats.”
4. “For Conservatives, Trump’s Deal with Democrats Is Nightmare Come True,” read the headline for a New York Times article by Jeremy W. Peters and Maggie Haberman. “It is the scenario that President Trump’s most conservative followers considered their worst nightmare, and on Wednesday it seemed to come true: The deal-making political novice, whose ideology and loyalty were always fungible, cut a deal with Democrats.”
5. “Trump sides with Democrats on fiscal issues, throwing Republican plans into chaos,” read the Washington Post headline the day after the deal was announced. “The president’s surprise stance upended sensitive negotiations over the debt ceiling and other crucial policy issues this fall and further imperiled his already tenuous relationships with Senate Majority Leader Mitch McConnell and House Speaker Paul Ryan.” Yes, the negotiations were upended. Then they made a deal.
6. “Although elected as a Republican last year,” wrote Peter Baker of the Times, “Mr. Trump has shown in the nearly eight months in office that he is, in many ways, the first independent to hold the presidency since the advent of the two-party system around the time of the Civil War.” The title of Baker’s news analysis: “Bound to No Party, Trump Upends 150 Years of Two-Party Rule.” One hundred and fifty years? Why not 200?
The journalistic rule of thumb used to be that an article describing a political, social, or cultural trend requires at least three examples. Not while covering Trump. If Trump does something, anything, you should feel free to inflate its importance beyond all recognition. And stuff your “reporting” with all sorts of dramatic adjectives and frightening nouns: fascinating, mysterious, unexpected, extraordinary, nightmare, chaos, frenzy, and scapegoats. It’s like a Vince Flynn thriller come to life.
The case for the significance of the budget deal would be stronger if there were a consensus about whom it helped. There isn’t one. At first the press assumed Democrats had won. “Republicans left the Oval Office Wednesday stunned,” reported Rachael Bade, Burgess Everett, and Josh Dawsey of Politico. Another trio of Politico reporters wrote, “In the aftermath, Republicans seethed privately and distanced themselves publicly from the deal.” Republicans were “stunned,” reported Kristina Peterson, Siobhan Hughes, and Louise Radnofsky of the Wall Street Journal. “Meet the swamp: Donald Trump punts September agenda to December after meeting with Congress,” read the headline of Charlie Spiering’s Breitbart story.
By the following week, though, these very outlets had decided the GOP was looking pretty good. “Trump’s deal with Democrats bolsters Ryan—for now,” read the Politico headline on September 11. “McConnell: No New Debt Ceiling Vote until ‘Well into 2018,’” reported the Washington Post. “At this point…picking a fight with Republican leaders will only help him,” wrote Gerald Seib in the Wall Street Journal. “Trump has long warned that he would work with Democrats, if necessary, to fulfill his campaign promises. And Wednesday’s deal is a sign that he intends to follow through on that threat,” wrote Breitbart’s Joel Pollak.
The sensationalism, the conflicting interpretations, the visceral language are dizzying. We have so many reporters chasing the same story that each feels compelled to gussy up a quotidian budget negotiation until it resembles the Ribbentrop–Molotov pact, and none feel it necessary to apply to their own reporting the scrutiny and incredulity they apply to Trump. The truth is that no one knows what this agreement portends. Nor is it the job of a reporter to divine the meaning of current events like an augur of Rome. Sometimes a cigar is just a cigar. And a deal is just a deal.
Remembering something wonderful
Not surprisingly, many well-established performers were left in the lurch by the rise of the new media. Even some vaudevillians who, like Fred Allen, had successfully reinvented themselves for radio were unable to make the transition to TV. But a handful of exceptionally talented performers managed to move from vaudeville to radio to TV, and none did it with more success than Jack Benny, whose feigned stinginess, scratchy violin playing, slightly effeminate demeanor, and preternaturally exact comic timing made him one of the world’s most beloved performers. After establishing himself in vaudeville, he became the star of a comedy series, The Jack Benny Program, that aired continuously, first on radio and then TV, from 1932 until 1965. Save for Bob Hope, no other comedian of his time was so popular.
With the demise of nighttime network radio as an entertainment medium, the 931 weekly episodes of The Jack Benny Program became the province of comedy obsessives—and because Benny’s TV series was filmed in black-and-white, it is no longer shown in syndication with any regularity. And while he also made Hollywood films, some of which were box-office hits, only one, Ernst Lubitsch’s To Be or Not to Be (1942), is today seen on TV other than sporadically.
Nevertheless, connoisseurs of comedy still regard Benny, who died in 1974, as a giant, and numerous books, memoirs, and articles have been published about his life and art. Most recently, Kathryn H. Fuller-Seeley, a professor at the University of Texas at Austin, has brought out Jack Benny and the Golden Age of Radio Comedy, the first book-length primary-source academic study of The Jack Benny Program and its star.1 Fuller-Seeley’s genuine appreciation for Benny’s work redeems her anachronistic insistence on viewing it through the fashionable prism of gender- and race-based theory, and her book, though sober-sided to the point of occasional starchiness, is often quite illuminating.
Most important of all, off-the-air recordings of 749 episodes of the radio version of The Jack Benny Program survive in whole or part and can easily be downloaded from the Web. As a result, it is possible for people not yet born when Benny was alive to hear for themselves why he is still remembered with admiration and affection—and why one specific aspect of his performing persona continues to fascinate close observers of the American scene.

Born Benjamin Kubelsky in Chicago in 1894, Benny was the son of Eastern European émigrés (his father was from Poland, his mother from Lithuania). He started studying violin at six and had enough talent to pursue a career in music, but his interests lay elsewhere, and by the time he was a teenager, he was working in vaudeville as a comedian who played the violin as part of his act. Over time he developed into a “monologist,” the period term for what we now call a stand-up comedian, and he began appearing in films in 1929 and on network radio three years after that.
Radio comedy, like silent film, is now an obsolete art form, but the program formats that it fostered in the ’20s and ’30s all survived into the era of TV, and some of them flourish to this day. One, episodic situation comedy, was developed in large part by Jack Benny and his collaborators. Benny and Harry Conn, his first full-time writer, turned his weekly series, which started out as a variety show, into a weekly half-hour playlet featuring a regular cast of characters augmented by guest stars. Such playlets, relying as they did on a setting that was repeated from week to week, were easier to write than the free-standing sketches favored by Allen, Hope, and other ex-vaudevillians, and by the late ’30s, the sitcom had become a staple of radio comedy.
The process, as documented by Fuller-Seeley, was a gradual one. The Jack Benny Program never broke entirely with the variety format, continuing to feature both guest stars (some of whom, like Ronald Colman, ultimately became semi-regular members of the show’s rotating ensemble of players) and songs sung by Dennis Day, a tenor who joined the cast in 1939. Nor was it the first radio situation comedy: Amos ’n’ Andy, launched in 1928, was a soap-opera-style daily serial that also featured regular characters. Nevertheless, it was Benny who perfected the form, and his own character would become the prototype for countless later sitcom stars.
The show’s pivotal innovation was to turn Benny and the other cast members into fictionalized versions of themselves—they were the stars of a radio show called “The Jack Benny Program.” Sadye Marks, Benny’s wife, played Mary Livingstone, his sharp-tongued secretary, with three other characters added as the self-reflexive concept took shape. Don Wilson, the stout, genial announcer, came on board in 1934. He was followed in 1936 by Phil Harris, Benny’s roguish bandleader, and, in 1939, by Day, Harris’s simple-minded vocalist. To this team was added a completely fictional character, Rochester Van Jones, Benny’s raspy-voiced, outrageously impertinent black valet, played by Eddie Anderson, who joined the cast in 1938.
As these five talented performers coalesced into a tight-knit ensemble, the jokey, vaudeville-style sketch comedy of the early episodes metamorphosed into sitcom-style scripts that portrayed their offstage lives, as well as the making of the show itself. Scarcely any conventional jokes were told, nor did Benny’s writers employ the topical and political references in which Allen and Hope specialized. Instead, the show’s humor arose almost entirely from the close interplay of character and situation.
Benny was not solely responsible for the creation of this format, which was forged by Conn and perfected by his successors. Instead, he doubled as the star and producer—or, to use the modern term, show runner—closely supervising the writing of the scripts and directing the performances of the other cast members. In addition, he and Conn turned the character of Jack Benny from a sophisticated vaudeville monologist into the hapless butt of the show’s humor, a vain, sexually inept skinflint whose character flaws were ceaselessly twitted by his colleagues, who in turn were given most of the biggest laugh lines.
This latter innovation was a direct reflection of Benny’s real-life personality. Legendary for his voluble appreciation of other comedians, he was content to respond to the wisecracking of his fellow cast members with exquisitely well-timed interjections like “Well!” and “Now, cut that out,” knowing that the comic spotlight would remain focused on the man of whom they were making fun and secure in the knowledge that his own comic personality was strong enough to let them shine without eclipsing him in the process.
And with each passing season, the fictional personalities of Benny and his colleagues became ever more firmly implanted in the minds of their listeners, thus allowing the writers to get laughs merely by alluding to their now-familiar traits. At the same time, Benny and his writers never stooped to coasting on their familiarity. Even the funniest of the “cheap jokes” that were their stock-in-trade were invariably embedded in carefully honed dramatic situations that heightened their effectiveness.
A celebrated case in point is the best-remembered laugh line in the history of The Jack Benny Program, heard in a 1948 episode in which a burglar holds Benny up on the street. “Your money or your life,” the burglar says—to which Jack replies, after a very long pause, “I’m thinking it over!” What makes this line so funny is, of course, our awareness of Benny’s stinginess, reinforced by a decade and a half of constant yet subtly varied repetition. What is not so well remembered is that the line is heard toward the end of an episode that aired shortly after Ronald Colman won an Oscar for his performance in A Double Life. Inspired by this real-life event, the writers concocted an elaborately plotted script in which Benny talks Colman (who played his next-door neighbor on the show) into letting him borrow the Oscar to show to Rochester. It is on his way home from this errand that Benny is held up, and the burglar not only robs him of his money but also steals the statuette, a situation that was resolved to equally explosive comic effect in the course of two subsequent episodes.
No mere joke-teller could have performed such dramatically complex scripts week after week with anything like Benny’s effectiveness. The secret of The Jack Benny Program was that its star, fully aware that he was not “being himself” but playing a part, did so with an actor’s skill. This was what led Ernst Lubitsch to cast him in To Be or Not to Be, in which he plays a mediocre Shakespearean tragedian, a character broadly related to but still quite different from the one who appeared on his own radio show. As Lubitsch explained to Benny, who was skeptical about his ability to carry off the part:
A clown—he is a performer what is doing funny things. A comedian—he is a performer what is saying funny things. But you, Jack, you are an actor, you are an actor playing the part of a comedian and this you are doing very well.
To Be or Not to Be also stands out from the rest of Benny’s work because he plays an identifiably Jewish character. The Jack Benny character that he played on radio and TV, by contrast, was never referred to or explicitly portrayed as Jewish. To be sure, most listeners were in no doubt of his Jewishness, and not merely because Benny made no attempt in real life to conceal his ethnicity, of which he was by all accounts proud. The Jack Benny Program was written by Jews, and the ego-puncturing insults with which their scripts were packed, as well as the schlemiel-like aspect of Benny’s “fall guy” character, were quintessentially Jewish in style.
As Benny explained in a 1948 interview cited by Fuller-Seeley:
The humor of my program is this: I’m a big shot, see? I’m fast-talking. I’m a smart guy. I’m boasting about how marvelous I am. I’m a marvelous lover. I’m a marvelous fiddle player. Then, five minutes after I start shooting off my mouth, my cast makes a shmo out of me.
Even so, his avoidance of specific Jewish identification on the air is noteworthy precisely because his character was a miser. At a time when overt anti-Semitism was still common in America, it is remarkable that Benny’s comic persona was based in large part on an anti-Semitic stereotype—yet one that seems not to have inspired any anti-Semitic attacks on Benny himself. When, in 1945, his writers came up with the idea of an “I Can’t Stand Jack Benny Because . . . ” write-in campaign, they received 270,000 entries. Only three made mention of his Jewishness.
As for the winning entry, submitted by a California lawyer, it says much about what insulated Benny from such attacks: “He fills the air with boasts and brags / And obsolete, obnoxious gags / The way he plays his violin / Is music’s most obnoxious sin / His cowardice alone, indeed, / Is matched by his obnoxious greed / And all the things that he portrays / Show up MY OWN obnoxious ways.” It is clear that Benny’s foibles were seen by his listeners not as particular but universal, just as there was no harshness in the razzing of his fellow cast members, who very clearly loved the Benny character in spite of his myriad flaws. So, too, did the American people. Several years after his TV series was cancelled, a corporation that was considering using him as a spokesman commissioned a national poll to find out how popular he was. It learned that only 3 percent of the respondents disliked him.
Therein lay Benny’s triumph: He won total acceptance from the American public and did so by embodying a Jewish stereotype from which the sting of prejudice had been leached. Far from being a self-hating whipping boy for anti-Semites, he turned himself into WASP America’s Jewish uncle, preposterous yet lovable.

When the bottom fell out of network radio, Benny negotiated the move to TV without a hitch, debuting on the small screen in 1950 and bringing the radio version of The Jack Benny Program to a close five years later, making it one of the very last radio comedy series to shut up shop. Even after his weekly TV series was finally canceled by CBS in 1965, he continued to star in well-received one-shot specials on NBC.
But Benny’s TV appearances, for all their charm, were never quite equal in quality to his radio work, which is why he clung to the radio version of The Jack Benny Program until network radio itself went under: Better than anyone else, he knew how good the show had been. For the rest of his life, he lived off the accumulated comic capital built up by 23 years of weekly radio broadcasts.
Now, at long last, he belongs to the ages, and The Jack Benny Program is a museum piece. Yet it remains hugely influential, albeit at one or more removes from the original. From The Dick Van Dyke Show and The Danny Thomas Show to Seinfeld, Everybody Loves Raymond, and The Larry Sanders Show, every ensemble-cast sitcom whose central character is a fictionalized version of its star is based on Benny’s example. And now that the ubiquity of the Web has made the radio version of his series readily accessible for the first time, anyone willing to make the modest effort necessary to seek it out is in a position to discover that The Jack Benny Program, six decades after it left the air, is still as wonderfully, benignly funny as it ever was, a monument to the talent of the man who, more than anyone else, made it so.
Review of 'The Transferred Life of George Eliot' by Philip Davis
Not that there’s any danger these theoretically protesting students would have read George Eliot’s works—not even the short one, Silas Marner (1861), which in an earlier day was assigned to high schoolers. I must admit I didn’t find my high-school reading of Silas Marner a pleasant experience—sports novels for boys like John R. Tunis’s The Kid from Tomkinsville were inadequate preparation. I must confess, too, that when I was in graduate school, determined to study 17th-century English verse, my reaction to the suggestion that I should also read Middlemarch (1871–72) was “What?! An 800-page novel by the guy who wrote Silas Marner?” A friend patiently explained that “the guy” was actually Mary Ann Evans, born in 1819, died in 1880. Partly because she was living in sin with the literary jack-of-all-trades George Henry Lewes (legally and irrevocably bound to his estranged wife), she adopted “George Eliot” as a protective pseudonym when, in her 1857 debut, she published Scenes of Clerical Life.
I did, many times over and with awe and delight, go on to read Middlemarch and the seven other novels, often in order to teach them to college students. Students have become less and less receptive over the years. Forget modern-day objections to George Eliot’s complex political or religious views. Adam Bede (1859) and The Mill on the Floss (1860) were too hefty, and the triple-decked Middlemarch and Deronda, even if I set aside three weeks for them, rarely got finished.
The middle 20th century was perhaps a more propitious time for appreciating George Eliot, Henry James, and other 19th-century English and American novelists. Influential teachers like F.R. Leavis at Cambridge and Lionel Trilling at Columbia were then working hard to persuade students that the study of literature, not just poetry and drama but also fiction, matters both to their personal lives—the development of their sensibility or character—and to their wider society. The “moral imagination” that created Middlemarch enriches our minds by dramatizing the complications—the frequent blurring of good and evil—in our lives. Great novels help us cope with ambiguities and make us more tolerant of one another. Many of Leavis’s and Trilling’s students became teachers themselves, and for several decades the feeling of cultural urgency was sustained. In the 1970s, though, between the leftist emphasis on literature as “politics by other means” and the deconstructionist denial of the possibility of any knowledge, literary or otherwise, independent of political power, the high seriousness of Leavis and Trilling began to fade.
The study of George Eliot and her life has gone through many stages. Directly after her death came the sanitized, hagiographic “life and letters” by J.W. Cross, the much younger man she married after Lewes’s death. Gladstone called it “a Reticence in three volumes.” The three volumes helped spark, if they didn’t cause, the long reaction against the Victorian sages generally that culminated in the dismissively satirical work of the Bloomsbury biographer and critic Lytton Strachey in his immensely influential Eminent Victorians (1918). Strachey’s mistreatment of his forebears was, with regard to George Eliot at least, tempered almost immediately by Virginia Woolf. It was Woolf who in 1919 provocatively said that Middlemarch had been “the first English novel for adults.” Eventually, the critical tide against George Eliot was decisively reversed in the ’40s by Joan Bennett and Leavis, who made the inarguable case for her genuine and lasting achievement. That period of correction culminated in the 1960s with Gordon S. Haight’s biography and with interpretive studies by Barbara Hardy and W.J. Harvey. Books on George Eliot over the last four decades have largely been written by specialists for specialists—on her manuscripts or working notes, and on her affiliations with the scientists, social historians, and competing novelists of her day.
The same is true, only more so, of the books written, with George Eliot as the ostensible subject, to promote deconstructionist or feminist agendas. Biographies have done a better job appealing to the common reader, not least because the woman’s own story is inherently compelling. The question right now is whether a book combining biographical and interpretive insight—one “pitched,” as publishers like to say, not just at experts but at the common reader—is past praying for.
Philip Davis, a Victorian scholar and an editor at Oxford University Press, hopes not. His The Transferred Life of George Eliot—transferred, that is, from her own experience into her letters, journals, essays, and novels, and beyond them into us—deserves serious attention. Davis is conscious that George Eliot called biographies of writers “a disease of English literature,” both overeager to discover scandals and too inclined to substitute day-to-day travels, relationships, dealings with publishers and so on, for critical attention to the books those writers wrote. Davis therefore devotes himself to George Eliot’s writing. Alas, he presumes rather too much knowledge on the reader’s part of the day-to-day as charted in Haight’s marvelous life. (A year-by-year chronology at the front of the book would have helped even his fellow Victorianists.)
As for George Eliot’s writing, Davis is determined to refute “what has been more or less said . . . in the schools of theory for the last 40 years—that 19th-century realism is conservatively bland and unimaginative, bourgeois and parochial, not truly art at all.” His argument for the richness, breadth, and art of George Eliot’s realism—her factual and sympathetic depiction of poor and middling people, without omitting a candid representation of the rich—is most convincing. What looms largest, though, is the realist, the woman herself—the Mary Ann Evans who, from the letters to the novels, became first Marian Evans the translator and essayist and then later “her own greatest character”: George Eliot the novelist. Davis insists that “the meaning of that person”—not merely the voice of her omniscient narrators but the omnipresent imagination that created the whole show—“has not yet exhausted its influence nor the larger future life she should have had, and may still have, in the world.”
The transference of George Eliot’s experience into her fiction is unquestionable: In The Mill on the Floss, for example, Mary Ann is Maggie, and her brother Isaac is Tom Tulliver. Davis knows that a better word might be transmutation, as George Eliot had, in Henry James’s words, “a mind possessed,” for “the creations which brought her renown were of the incalculable kind, shaped themselves in mystery, in some intellectual back-shop or secret crucible, and were as little as possible implied in the aspect of her life.” No data-accumulating biographer, even the most exhaustive, can account for that “incalculable . . . mystery.”
Which is why Davis, like a good teacher, gives us exercises in “close reading.” He pauses to consider how a George Eliot sentence balances or turns on an easy-to-skip-over word or phrase—the balance or turn often representing a moment when the novelist looks at what’s on the underside of the cards.
George Eliot’s style is subtle because her theme is subtle. Take D.H. Lawrence’s favorite heroine, the adolescent Maggie Tulliver. The external event in The Mill on the Floss may be the girl’s impulsive cutting off her unruly hair to spite her nagging aunts, or the young woman’s drifting down the river with a superficially attractive but truly impossible boyfriend. But the real “action” is Maggie’s internal self-blame and self-assertion. No Victorian novelist was better than George Eliot at tracing the psychological development of, say, a husband and wife who realize they married each other for shallow reasons, are unhappy, and now must deal with the ordinary necessities of balancing the domestic budget—Lydgate and Rosamond in Middlemarch—or, in the same novel, the religiously inclined Dorothea’s mistaken marriage to the old scholar Casaubon. That mistake precipitates not merely disenchantment and an unconscious longing for love with someone else, but (very finely) a quest for a religious explanation of and guide through her quandary.
It’s the religio-philosophical side of George Eliot about which Davis is strongest—and weakest. Her central theological idea, if one may simplify, was that the God of the Bible didn’t exist “out there” but was a projection of the imagination of the people who wrote it. Jesus wasn’t, in Davis’s characterization of her view, “the impervious divine, but [a man who] shed tears and suffered,” and died feeling forsaken. “This deep acceptance of so-called weakness was what most moved Marian Evans in her Christian inheritance. It was what God was for.” That is, the character of Jesus, and the dramatic play between him and his Father, expressed the human emotions we and George Eliot are all too familiar with. The story helps reconcile us to what is, finally, inescapable suffering.
George Eliot came to this demythologized understanding not only of Judaism and Christianity but of all religions through her contact first with a group of intellectuals who lived near Coventry, then with two Germans she translated: David Friedrich Strauss, whose 1,500-page Life of Jesus Critically Examined (1835–36) was for her a slog, and Ludwig Feuerbach, whose Essence of Christianity (1841) was for her a joy. Also, in the search for the universal morality that Strauss and Feuerbach believed Judaism and Christianity expressed mythically, there was Spinoza’s utterly non-mythical Ethics (1677). It was seminal for her—offering, as Davis says, “the intellectual origin for freethinking criticism of the Bible and for the replacement of religious superstition and dogmatic theology by pure philosophic reason.” She translated it into English, though her version did not appear until 1981.
I wish Davis had left it there, but he takes it too far. He devotes more than 40 pages—a tenth of the whole book—to her three translations, taking them as a mother lode of ideational gold whose tailings glitter throughout her fiction. These 40 pages are followed by 21 devoted to Herbert Spencer, the Victorian hawker of theories-of-everything (his 10-volume System of Synthetic Philosophy addresses biology, psychology, sociology, and ethics). She threw herself at the feet of this intellectual huckster, and though he rebuffed her painfully amorous entreaties, she never ceased revering him. Alas, Spencer was a stick—the kind of philosopher who was incapable of emotion. And she was his intellectual superior in every way. The chapter is largely unnecessary.
The book comes back to life when Davis turns to George Henry Lewes, the man who gave Mary Ann Evans the confidence to become George Eliot—perhaps the greatest act of loving mentorship in all of literature. Like many prominent Victorians, Lewes dabbled in all the arts and sciences, publishing highly readable accounts of them for a general audience. His range was as wide as Spencer’s, but his personality and writing had an irrepressible verve that Spencer could only have envied. Lewes was a sort of Stephen Jay Gould yoked to Daniel Boorstin, popularizing other people’s findings and concepts, and coming up with a few of his own. He regarded his Sea-Side Studies (1860) as “the book . . . which was to me the most unalloyed delight,” not least because Marian, whom he called Polly, had helped gather the data. She told a friend, “There is so much happiness condensed in it! Such scrambles over rocks, and peeping into clear pool [sic], and strolls along the pure sands, and fresh air mingling with fresh thoughts.” In his remarkably intelligent 1864 biography of Goethe, Lewes remarks that the poet “knew little of the companionship of two souls striving in emulous spirit of loving rivalry to become better, to become wiser, teaching each other to soar.” Such a companionship Lewes and George Eliot had in spades, and some of Davis’s best passages describe it.
Regrettably, Davis also offers many passages well below the standard of his best—needlessly repeating an already established point or obfuscating the obvious. Still, The Transferred Life of George Eliot is the most formidably instructive, and certainly the most complete, life-and-works treatment of George Eliot we have.