I. The Beginning of the Bible and Its Greek Counterparts
All the hopes that we entertain in the midst of the confusion and dangers of the present are founded, positively or negatively, directly or indirectly, on the experiences of the past. Of these experiences, the broadest and deepest—so far as Western man is concerned—are indicated by the names of two cities: Jerusalem and Athens. Western man became what he is, and is what he is, through the coming together of biblical faith and Greek thought. In order to understand ourselves and to illuminate our trackless way into the future, we must understand Jerusalem and Athens. It goes without saying that this is a task whose proper performance goes much beyond my power; but we cannot define our tasks by our powers, for our powers become known to us through the performance of our tasks, and it is better to fail nobly than to succeed basely.
The objects to which we refer when we speak of Jerusalem and Athens are understood today, by the science devoted to such objects, as cultures; “culture” is meant to be a scientific concept. According to this concept there is an indefinitely large number of cultures: n cultures. The scientist who studies them beholds them as objects; as scientist, he stands outside all of them; he has no preference for any of them; he is not only impartial but objective; he is anxious not to distort any of them; in speaking about them he avoids any “culture-bound” concepts—i.e., concepts bound to any particular culture or kind of culture. In many cases the objects studied by the scientist of culture do or did not know that they are or were cultures. This causes no difficulty for him: electrons also do not know that they are electrons; even dogs do not know that they are dogs. By the mere fact that he speaks of his objects as cultures, the scientific student takes it for granted that he understands the people whom he studies better than they understood or understand themselves.
This whole approach has been questioned for some time, but the questioning does not seem to have had any effect on the scientists. The man who started the questioning was Nietzsche. We have said that according to the prevailing view there were or are n cultures. Let us say there were or are 1,001 cultures, thus reminding ourselves of the 1,001 Arabian Nights; the account of the cultures, if it is well done, will be a series of exciting stories, perhaps of tragedies. Accordingly, Nietzsche speaks of our subject in a speech by his Zarathustra that is entitled “Of 1,000 Goals and One.” The Hebrews and the Greeks appear in this speech as two among a number of nations, not superior to the two others that are mentioned or to the 996 that are not. The peculiarity of the Greeks, according to Nietzsche, is the full dedication of the individual to the contest for excellence, distinction, supremacy. The peculiarity of the Hebrews is the utmost honoring of father and mother. Nietzsche’s reverence for the sacred tables of the Hebrews, as well as for those of the other nations in question, is deeper than that of any other beholder. Yet since he too is only a beholder of these tables, since what one table commends or commands is incompatible with what the others command, he himself is not subject to the commandments of any. This is true also and especially of the tables, or “values,” of modern Western culture. But according to him, all scientific concepts, and hence in particular the concept of culture, are culture-bound; the concept of culture is an outgrowth of 19th-century Western culture; its application to the “cultures” of other ages and climates is an act stemming from the spiritual imperialism of that particular culture. There is, then, for Nietzsche, a glaring contradiction between the claimed objectivity of the science of cultures and the subjectivity of that science. To state the case differently, one cannot behold—i.e., truly understand—any culture unless one is firmly rooted in one’s own culture or unless one belongs, in one’s capacity as a beholder, to some culture. But if the universality of the beholding of all cultures is to be preserved, the culture to which the beholder of all cultures belongs must be the universal culture, the culture of mankind, the world culture; the universality of beholding presupposes, if only by anticipating it, the universal culture which is no longer one culture among many. Nietzsche sought therefore for a culture that would no longer be particular and hence in the last analysis arbitrary. The single goal of mankind is conceived by him as in a sense super-human: he speaks of the super-man of the future. The super-man is meant to unite in himself, on the highest level, both Jerusalem and Athens.
However much the science of all cultures may protest its innocence of all preferences or evaluations, it fosters a specific moral posture. Since it requires openness to all cultures, it fosters universal tolerance and the exhilaration which derives from the beholding of diversity; it necessarily affects all cultures that it can still affect by contributing to their transformation in one and the same direction; it willy-nilly brings about a shift of emphasis from the particular to the universal. By asserting, if only implicitly, the rightness of pluralism, it asserts that pluralism is the right way; it asserts the monism of universal tolerance and respect for diversity; for by virtue of being an “-ism,” pluralism is a monism.
One remains somewhat closer to the science of culture as it is commonly practiced if one limits oneself to saying that every attempt to understand the phenomena in question remains dependent upon a conceptual framework that is alien to most of these phenomena and therefore necessarily distorts them. “Objectivity” can be expected only if one attempts to understand the various cultures or peoples exactly as they understand or understood themselves. Men of ages and climates other than our own did not understand themselves in terms of cultures because they were not concerned with culture in the present-day meaning of the term. What we now call culture is the accidental result of concerns that were not concerns with culture but with other things—above all with the Truth.
Yet our intention to speak of Jerusalem and Athens seems to compel us to go beyond the self-understanding of either. Or is there a notion, a word that points to the highest that both the Bible and the greatest works of the Greeks claim to convey? There is such a word: wisdom. Not only the Greek philosophers but the Greek poets as well were considered to be wise men, and the Torah is said, in the Torah, to be “your wisdom in the eyes of the nations.” We, then, must try to understand the difference between biblical wisdom and Greek wisdom. We see at once that each of the two claims to be the true wisdom, thus denying to the other its claim to be wisdom in the strict and highest sense. According to the Bible, the beginning of wisdom is fear of the Lord; according to the Greek philosophers, the beginning of wisdom is wonder. We are thus compelled from the very beginning to make a choice, to take a stand. Where then do we stand? Confronted by the incompatible claims of Jerusalem and Athens, we are open to both and willing to listen to each. We ourselves are not wise but we wish to become wise. We are seekers for wisdom, “philo-sophoi.” Yet since we say that we wish to hear first and then to act or to decide, we have already decided in favor of Athens against Jerusalem.
This, indeed, seems to be the necessary position for all of us who cannot be Orthodox and therefore must accept the principle of the historical-critical study of the Bible. The Bible was traditionally understood to be the true and authentic account of the deeds of God and men from the beginning till the restoration after the Babylonian exile. The deeds of God include His legislation as well as His inspirations to the prophets, and the deeds of men include their praises of God and their prayers to Him as well as their God-inspired admonitions. Biblical criticism starts from the observation that the biblical account is in important respects not authentic but derivative or consists not of “histories” but of “memories of ancient histories,” to borrow a Machiavellian expression. Biblical criticism reached its first climax in Spinoza’s Theological-Political Treatise, which is frankly anti-theological; Spinoza read the Bible as he read the Talmud and the Koran. The result of his criticism can be summarized as follows: the Bible consists to a considerable extent of self-contradictory assertions, of remnants of ancient prejudices or superstitions, and of the outpourings of an uncontrolled imagination; in addition, it is poorly compiled and poorly preserved. He arrived at this conclusion by presupposing the impossibility of miracles. The considerable differences between 19th- and 20th-century biblical criticism and that of Spinoza can be traced to their difference in regard to the evaluation of imagination: whereas for Spinoza imagination is simply sub-rational, it was assigned a much higher rank in later times when it was understood as the vehicle of religious or spiritual experience, which necessarily expresses itself in symbols and the like. The historical-critical study of the Bible is the attempt to understand the various layers of the Bible as they were understood by their immediate addressees, i.e., the contemporaries of its authors. Of course, the Bible speaks of many things—for instance, the creation of the world—that for the biblical authors themselves belong to the remote past. But there is undoubtedly much history in the Bible—accounts of events written by contemporaries or near-contemporaries. One is thus led to say that the Bible contains both “myth” and “history.” Yet this distinction is alien to the Bible; it is a special form of the Greek distinction between mythos and logos. From the point of view of the Bible, the “myths” are as true as the “histories”: what Israel “in fact” did or suffered cannot be understood except in the light of the “facts” of Creation and Election. What is now called “historical” are those deeds and speeches that are equally accessible to the believer and to the unbeliever. But from the point of view of the Bible, the unbeliever is the fool who has said in his heart “there is no God”; the Bible narrates everything as it is credible to the wise in the biblical sense of wisdom. Let us never forget that there is no biblical word for doubt. The biblical signs and wonders convince men who have little faith or who believe in other gods; they are not addressed to “the fools who say in their hearts ‘there is no God.’”
It is true that we cannot ascribe to the Bible the theological concept of miracles, for that concept presupposes the concept of nature, and the concept of nature is foreign to the Bible. One is, however, tempted to ascribe to the Bible what one may call the poetic concept of miracles as illustrated by Psalm 114:
When Israel went out of Egypt, the house of Jacob from a people of strange tongue, Judah became his sanctuary and Israel his dominion. The sea saw and fled; the Jordan turned back. The mountains skipped like rams, the hills like lambs. What ails thee, sea, that thou fleest, thou Jordan that thou turnest back? Ye mountains that ye skip like rams, ye hills like lambs? From the presence of the Lord tremble thou earth, from the presence of the God of Jacob who turns the rock into a pond of water, the flint into a fountain of waters.
The presence of God calls forth from His creatures a conduct that differs strikingly from their ordinary conduct: it enlivens the lifeless; it makes fluid the fixed. It is not easy to say whether the author of the psalm did not mean his utterance to be simply or literally true. It is easy to say that the concept of poetry—as distinguished from that of song—is foreign to the Bible. It is perhaps simpler to say that owing to the victory of science over natural theology the impossibility of miracles can no longer be said to be established but has degenerated to the status of an undemonstrable hypothesis. One may trace to the hypothetical character of this fundamental premise the hypothetical character of many, not to say all, results of biblical criticism. Certain it is that biblical criticism in all its forms makes use of terms having no biblical equivalents and is to this extent unhistorical.
How then must we proceed? We shall not take issue with the findings or even the premises of biblical criticism. Let us grant that the Bible and in particular the Torah consists to a considerable extent of “memories of ancient histories,” even of memories of memories. But memories of memories are not necessarily distorted or pale reflections of the original; they may be recollections of recollections, deepenings through meditation of the primary experience. We shall therefore take the latest and uppermost layer as seriously as the earlier ones. We shall start from the uppermost layer—from what is first for us, even though it may not be simply the first. We shall start, that is, where both the traditional and the historical study of the Bible necessarily start. In thus proceeding we avoid the compulsion to make an advance decision in favor of Athens against Jerusalem. For the Bible does not require us to believe in the miraculous character of events that the Bible does not present as miraculous. God’s speaking to men may be described as miraculous, but the Bible does not claim that the putting-together of those speeches was done miraculously. We begin at the beginning, at the beginning of the beginning. The beginning of the beginning happens to deal with the beginning: the creation of heaven and earth. The Bible begins reasonably.
“In the beginning God created heaven and earth.” Who says this? We are not told; hence we do not know. We have no right to assume that God said it, for the Bible introduces God’s sayings by expressions like “God said.” We shall then assume that the words were spoken by a nameless man. Yet no man can have been an eyewitness of God’s creating heaven and earth; the only eyewitness was God. Since “There did not arise in Israel a prophet like Moses whom the Lord knew face to face,” it is understandable that tradition ascribed to Moses the sentence quoted and its whole sequel. But what is understandable or plausible is not as such certain. The narrator does not claim to have heard the account from God; perhaps he heard it from some man or men; perhaps he is retelling a tale. The Bible continues: “And the earth was unformed and void. . . .” It is not clear whether the earth thus described was created by God or antedated His creation. But it is quite clear that while speaking about how the earth looked at first, the Bible is silent about how heaven looked at first. The earth, i.e., that which is not heaven, seems to be more important than heaven. This impression is confirmed by the sequel.
God created everything in six days. On the first day He created light; on the second, heaven; on the third, the earth, the seas, and vegetation; on the fourth, the sun, the moon, and the stars; on the fifth, the water animals and the birds; and on the sixth, the land animals and man. The most striking difficulties are these: light and hence day (and night) are presented as preceding the sun, and vegetation is presented as preceding the sun. The first difficulty is disposed of by the observation that creation-days are not sun-days. One must add at once, however, that there is a connection between the two kinds of days, for there is a connection, a correspondence between light and sun. The account of creation manifestly consists of two parts, the first part dealing with the first three creation-days and the second part dealing with the last three. The first part begins with the creation of light and the second with the creation of the heavenly light-givers. Correspondingly, the first part ends with the creation of vegetation and the second with the creation of man. All creatures dealt with in the first part lack local motion; all creatures dealt with in the second part possess local motion.1 Vegetation precedes the sun because vegetation lacks local motion and the sun possesses it. Vegetation belongs to the earth; it is rooted in the earth; it is the fixed covering of the fixed earth.2 Vegetation was brought forth by the earth at God’s command; the Bible does not speak of God’s “making” vegetation; but as regards the living beings in question, God commanded the earth to bring them forth and yet God “made” them. Vegetation was created at the end of the first half of the creation-days; at the end of the last half, the living beings that spend their whole lives on the firm earth were created. The living beings—beings that possess life in addition to local motion—were created on the fifth and sixth days, on the days following the day on which the heavenly light-givers were created. The Bible presents the creatures in an ascending order. Heaven is lower than earth. The heavenly light-givers lack life; they are lower than the lowliest living beast; they serve the living creatures, which are to be found only beneath heaven; they have been created in order to rule over day and night: they have not been made in order to rule over the earth, let alone over man.
The most striking characteristic of the biblical account of creation is its demoting or degrading of heaven and the heavenly lights. Sun, moon, and stars precede the living things because they are lifeless: they are not gods. What the heavenly lights lose, man gains; man is the peak of creation. The creatures of the first three days cannot change their places; the heavenly bodies change their places but not their courses; the living beings change their courses but not their “ways”; men alone can change their “ways.” Man is the only being created in God’s image. Only in the case of man’s creation does the biblical account of creation repeatedly speak of God’s “creating” him; in the case of the creation of heaven and the heavenly bodies, that account speaks of God’s “making” them. Similarly, only in the case of man’s creation does the Bible intimate that there is a multiplicity in God: “Let us make man in our image, after our likeness. . . . So God created man in His image, in the image of God He created him; male and female He created them.” Bisexuality is not a preserve of man, but only man’s bisexuality could give rise to the view that there are gods and goddesses: there is no biblical word for “goddess.” Hence creation is not begetting. The biblical account of creation teaches silently what the Bible teaches elsewhere explicitly: there is only one God, the God whose name is written as the Tetragrammaton, the living God Who lives from ever to ever, Who alone has created heaven and earth and all their hosts; He has not created any gods and hence there are no gods besides Him. The many gods whom men worship are either nothings that owe such being as they possess to man’s making them, or if they are something (like sun, moon, and stars), they surely are not gods.3 All non-polemical references to “other gods” occurring in the Bible are fossils whose preservation indeed poses a question but only a rather unimportant one. Not only did the biblical God not create any gods; on the basis of the biblical account of creation, one could doubt whether He created any beings one would be compelled to call “mythical”: heaven and earth and all their hosts are always accessible to man as man. One would have to start from this fact in order to understand why the Bible contains so many sections that, on the basis of the distinction between mythical (or legendary) and historical, would have to be described as historical.
According to the Bible, creation was completed by, and culminated in, the creation of man. Only after the creation of man did God “see all that He had made, and behold, it was very good.” What then is the origin of the evil or the bad? The biblical answer seems to be that since everything of divine origin is good, evil is of human origin. Yet if God’s creation as a whole is very good, it does not follow that all its parts are good or that creation as a whole contains no evil whatsoever: God did not find all parts of His creation to be good. Perhaps creation as a whole cannot be “very good” if it does not contain some evils. There cannot be light if there is not darkness, and the darkness is as much created as is the light: God creates evil as well as He makes peace (Isaiah 45:7). However this may be, the evils whose origin the Bible lays bare, after it has spoken of creation, are a particular kind of evils: the evils that beset man. Those evils are not due to creation or implicit in it, as the Bible shows by setting forth man’s original condition. In order to set forth that condition, the Bible must retell man’s creation by making man’s creation as much as possible the sole theme. This second account answers the question, not of how heaven and earth and all their hosts have come into being but of how human life as we know it—beset with evils with which it was not beset originally—has come into being. This second account may only supplement the first account but it may also correct it and thus contradict it. After all, the Bible never teaches that one can speak about creation without contradicting oneself. In post-biblical parlance, the mysteries of the Torah (sithre torah) are the contradictions of the Torah; the mysteries of God are the contradictions regarding God.
The first account of creation ended with man; the second account begins with man. According to the first account, God created man and only man in His image; according to the second account, God formed man from the dust of the earth and He blew into his nostrils the breath of life. The second account makes clear that man consists of two profoundly different ingredients, a high one and a low one. According to the first account, it would seem that man and woman were created simultaneously; according to the second account, man was created first. The life of man as we know it, the life of most men, is that of tillers of the soil; their life is needy and harsh. If human life had been needy and harsh from the very beginning, man would have been compelled or at least almost irresistibly tempted to be harsh, uncharitable, unjust; he would not have been fully responsible for his lack of charity or justice. But man is to be fully responsible. Hence the harshness of human life must be due to man’s fault. His original condition must have been one of ease: he was not in need of rain nor of hard work; he was put by God into a well-watered garden that was rich in trees that were good for food. Yet while man was created for a life of ease, he was not created for a life of luxury: there was no gold or precious stones in the garden of Eden. Man was created for a simple life. Accordingly, God permitted him to eat of every tree of the garden except the tree of knowledge of good and evil, “for in the day that you eat of it, you shall surely die.” Man was not denied knowledge; without knowledge he could not have known the tree of knowledge, nor the woman, nor the brutes; nor could he have understood the prohibition. Man was denied knowledge of good and evil, i.e., the knowledge sufficient for guiding himself, his life. Though not a child, he was to live in childlike simplicity and obedience to God. We are free to surmise that there is a connection between the demotion of heaven in the first account and the prohibition against eating of the tree of knowledge in the second. While man was forbidden to eat of the tree of knowledge, he was not forbidden to eat of the tree of life.
Man, lacking knowledge of good and evil, was content with his condition and in particular with his loneliness. But God, possessing knowledge of good and evil, found that “it is not good for man to be alone, so I will make him a helper as his counterpart.” So God formed the brutes and brought them to man, but they proved not to be the desired helpers. Thereupon God formed the woman out of a rib of the man. The man welcomed her as bone of his bones and flesh of his flesh but, lacking knowledge of good and evil, he did not call her good. The narrator adds that “therefore [namely because the woman is bone of man’s bone and flesh of his flesh] a man leaves his father and his mother, and cleaves to his wife, and they become one flesh.” Both were naked but, lacking knowledge of good and evil, they were not ashamed.
Thus the stage was set for the fall of our first parents. The first move came from the serpent, the most cunning of all the beasts of the field; it seduced the woman into disobedience and then the woman seduced the man. The seduction moves from the lowest to the highest. The Bible does not tell what induced the serpent to seduce the woman into disobeying the divine prohibition. It is reasonable to assume that the serpent acted as it did because it was cunning, i.e., possessed a low kind of wisdom, a congenital malice; everything that God had created would not be very good if it did not include something congenitally bent on mischief. The serpent begins its seduction by suggesting that God might have forbidden man and woman to eat of any tree in the garden, i.e., that God’s prohibition might be malicious or impossible to comply with. The woman corrects the serpent and in so doing makes the prohibition more stringent than it was: “We may eat of the fruit of the other trees of the garden; it is only about the tree in the middle of the garden that God said: you shall not eat of it or touch it, lest you die.”
Now, God did not forbid the man to touch the fruit of the tree of knowledge of good and evil. Besides, the woman does not explicitly speak of the tree of knowledge; she may have had in mind the tree of life. Moreover, God had issued the prohibition only to the man, whereas the woman claims that God had spoken to her as well; she surely knew the divine prohibition only through human tradition. The serpent assures her that they will not die, “for God knows that when you eat of it, your eyes will be opened and you will be like God, knowing good and evil.” The serpent tacitly questions God’s veracity. At the same time it glosses over the fact that eating of the tree involves disobedience to God. In this it is followed by the woman. According to the serpent’s assertion, knowledge of good and evil makes man immune to death (although we cannot know whether the serpent believes this). But the woman, having forgotten the divine prohibition, having therefore in a manner tasted of the tree of knowledge, is no longer wholly unaware of good and evil: she “saw that the tree was good for eating and a delight to the eyes and that the tree was to be desired to make one wise”; therefore she took of its fruit and ate. She thus made the fall of the man almost inevitable, for he was cleaving to her: she gave some of the fruit of the tree to the man, and he ate. The man drifts into disobedience by following the woman. After they had eaten of the tree, their eyes were opened and they knew that they were naked, and they sewed fig leaves together and made themselves aprons: through the fall they became ashamed of their nakedness; eating of the tree of knowledge of good and evil made them realize that nakedness is evil.
The Bible says nothing to the effect that our first parents fell because they were prompted by the desire to be like God; they did not rebel highhandedly against God; rather, they forgot to obey God; they drifted into disobedience. Nevertheless, God punished them severely. But the punishment did not do away with the fact that, as God Himself said, as a consequence of his disobedience “man has become like one of us, knowing good and evil.” There was now the danger that man might eat of the tree of life and live forever. Therefore God expelled him from the garden and made it impossible for him to return to it. One may wonder why man, while he was still in the garden of Eden, had not eaten of the tree of life of which he had not been forbidden to eat. Perhaps he did not think of it because, lacking knowledge of good and evil, he did not fear to die and, besides, the divine prohibition drew his attention away from the tree of life to the tree of knowledge.
The Bible intends to teach that man was meant to live in simplicity, without knowledge of good and evil. But the narrator seems to be aware of the fact that a being which can be forbidden to strive for knowledge of good and evil, i.e., that can understand to some degree that knowledge of good and evil is evil for it, necessarily possesses such knowledge. Human suffering from evil presupposes human knowledge of good and evil and vice versa. Man wishes to live without evil. The Bible tells us that he was given the opportunity to live without evil and that he cannot blame God for the evils from which he suffers. By giving man that opportunity, God convinces him that his deepest wish cannot be fulfilled. The story of the fall is the first part of the story of God’s education of man.
Man has to live with knowledge of good and evil and with the sufferings inflicted on him because of that knowledge or its acquisition. Human goodness or badness presupposes that knowledge and its concomitants. The Bible gives us the first inkling of human goodness and badness in the story of the first brothers. The older brother, Cain, was a tiller of the soil; the younger brother, Abel, a keeper of sheep. God preferred the offering of the keeper of sheep, who brought the choicest of the firstlings of his flock, to that of the tiller of the soil. There were many reasons for this preference but one of them seems to be that the pastoral life is closer to original simplicity than the life of the tillers of the soil. Cain, however, was vexed, and despite his having been warned by God against sinning in general, killed his brother. After a futile attempt to deny his guilt—an attempt that increased that guilt (“Am I my brother’s keeper?”)—he was cursed by God as the serpent and the soil had been after the fall, in contradistinction to Adam and Eve who were not cursed. He was punished by God, but not with death: anyone slaying Cain would be punished much more severely than Cain himself. The relatively mild punishment of Cain cannot be explained by the fact that murder had not been expressly forbidden: Cain possessed some knowledge of good and evil, and he knew that Abel was his brother, even assuming that he did not know that man was created in the image of God. It is better to explain Cain’s punishment by assuming that punishments were milder in the beginning than later on. Cain—like his fellow fratricide, Romulus—founded a city, and some of his descendants were the ancestors of men practicing various arts: the city and the arts, so alien to man’s original simplicity, owe their origin to Cain and his race rather than to Seth, the substitute for Abel, and his race. It goes without saying that this is not the last word of the Bible on the city and the arts but it is its first word, just as the prohibition against eating of the tree of knowledge is, one may say, its first word simply, and the revelation of the Torah—i.e., the highest kind of knowledge of good and evil that is vouchsafed to men—is its last word. The account of the race of Cain culminates in the song of Lamech who boasted to his wives of his slaying of men, of his being superior to God as an avenger. The (antediluvian) race of Seth cannot boast of a single inventor; its only distinguished members were Enoch, who walked with God, and Noah, who was a righteous man and walked with God: civilization and piety are two very different things.
By the time of Noah the wickedness of man had become so great that God repented of His creation of man and all other earthly creatures, Noah alone excepted; so He brought on the flood. Generally speaking, prior to the flood, man’s lifespan was much longer than after it. Man’s antediluvian longevity was a relic of his original condition. Man originally lived in the garden of Eden where he could have eaten of the tree of life and thus become immortal. The longevity of antediluvian man reflects this lost chance. To this extent the transition from antediluvian to postdiluvian man is a decline. This impression is confirmed by the fact that before the flood rather than after it the sons of God consorted with the daughters of man and thus generated the mighty men of old, the men of renown. On the other hand, the fall of our first parents made possible or necessary in due time God’s revelation of His Torah, and this was decisively prepared, as we shall see, by the flood. In this respect, the transition from antediluvian to postdiluvian mankind is a progress. The ambiguity regarding the fall—the fact that it was a sin and hence avoidable and that it was at the same time inevitable—is reflected in the ambiguity regarding the status of antediluvian mankind.
The link between antediluvian mankind and the revelation of the Torah is supplied by the first covenant between God and men, the covenant following the flood. The flood was the proper punishment for the extreme and well-nigh universal wickedness of antediluvian men. Prior to the flood, mankind lived, so to speak, without restraint, without law. While our first parents were still in the garden of Eden, they were not forbidden anything except to eat of the tree of knowledge. The vegetarianism of antediluvian men was not due to an explicit prohibition (Gen. 1:29); rather, their abstention from meat belongs together with their abstention from wine (cf. 9:20); both were relics of man’s original simplicity. After the expulsion from the garden of Eden, God did not punish men, apart from the relatively mild punishment which He inflicted on Cain. Nor did He establish human judges. God experimented, as it were, for the instruction of mankind, with the possibility of mankind’s living free of the law. The experiment, just like the experiment of having men remain like innocent children, ended in failure. Fallen or awake man needs restraint, must live under law. But this law must not be simply imposed. It must form part of a covenant in which God and man are equally, though not equal, partners. Such a partnership was established only after the flood; it did not exist in antediluvian times either before or after the fall.
The inequality regarding the covenant is shown especially by the fact that when God undertook never again to destroy almost all life on earth as long as the earth lasts, He did not do so on the condition that all or almost all men obey the laws promulgated by God after the flood: God makes His promise despite, or because of, His knowing that the devisings of man’s heart are evil from his youth. Noah is the ancestor of all later men just as Adam was; the purgation of the earth through the flood is to some extent a restoration of mankind to its original state; it is a kind of second creation. Within the limits indicated, the condition of postdiluvian men is superior to that of antediluvian men. One point requires special emphasis: in the legislation following the flood, murder is expressly forbidden and made punishable by death on the ground that man was created in the image of God (9:6). The first covenant brought an increase in hope and at the same time an increase in punishment. Not until after the flood was man’s rule over the beasts, ordained or established from the beginning, to be accompanied by the beasts’ fear and dread of man (cf. 9:2 with 1:26-30 and 2:15).
The covenant following the flood prepares the covenant with Abraham. The Bible singles out three events that took place between the covenant after the flood and God’s calling of Abraham: Noah’s curse of Canaan, a son of Ham; the achievement of excellence by Nimrod, a grandson of Ham; and men’s attempt to prevent their dispersal over the earth by building a city which had a tower that reached to the heavens. Canaan, whose land came to be the promised land, was cursed because Ham saw the nakedness of his father, Noah—because Ham transgressed a most sacred, if unpromulgated, law; the curse of Canaan was accompanied by the blessing of Shem and Japheth who turned their eyes away from the nakedness of their father. Here we have the first and the most fundamental division of mankind, at any rate of postdiluvian mankind, the division into “cursed” and “blessed.” Nimrod was the first to be a mighty man on earth—a mighty hunter before the Lord; his kingdom included Babel (big kingdoms are attempts to overcome by force the division of mankind, conquest and hunting being akin to each other). The city that men built in order to remain together and thus to make a name for themselves was Babel; God scattered them by confounding their speech, by bringing about the division of mankind into groups that could not understand one another: into nations, i.e., groups united not only by descent but also by language. The division of mankind into nations may be described as a milder alternative to the flood.
The three events that took place between God’s covenant with mankind after the flood and His calling of Abraham point to God’s way of dealing with men who know good and evil and devise evil from their youth. Well-nigh universal wickedness will no longer be punished with well-nigh universal destruction, but will be prevented through the division of mankind into nations. Mankind will be divided, not into the cursed and the blessed (the curses and blessings were Noah’s, not God’s), but into a chosen nation and into nations that are not chosen. The emergence of nations made it possible to replace Noah’s Ark—which floated alone on the waters covering the entire earth—by a whole, numerous nation living in the midst of the nations covering the earth. The election of the holy nation begins with the election of Abraham. Noah was distinguished from his contemporaries by his righteousness; Abraham separates himself from his contemporaries and in particular from his country and kindred at God’s command—a command accompanied by God’s promise to make of him a great nation. The Bible does not say that this primary election of Abraham was preceded by the fact of Abraham’s righteousness. However this may be, Abraham shows his righteousness by obeying God’s command at once, by trusting in God’s promise whose fulfillment he could not possibly live to see, given the short lifespan of postdiluvian man: only after Abraham’s offspring would have become a great nation would the land of Canaan be given to them forever.
The fulfillment of the promise required that Abraham not remain childless, and he was already quite old. Accordingly, God promised him that he would have issue. It was Abraham’s trust in God’s promise that, above everything else, made him righteous in the eyes of the Lord. It was God’s intention that His promise be fulfilled through the offspring of Abraham and his wife Sarah. But this promise seemed laughable to Abraham, to say nothing of Sarah: Abraham was one hundred years old and Sarah, ninety. Yet nothing is too wondrous for the Lord. The laughable announcement became a joyous one. It was followed immediately by God’s announcement to Abraham of His concern with the wickedness of the people of Sodom and Gomorrah. God did not yet know whether those people were as wicked as they were said to be. But they might be; they might deserve total destruction as much as did the generation of the flood. Noah had accepted the destruction of his generation without any questioning. Abraham, however, who had a deeper trust in God, in God’s righteousness, and a deeper awareness of his being only dust and ashes, presumed in fear and trembling to appeal to God’s righteousness lest He, the judge of the whole earth, destroy the righteous along with the wicked. In response to Abraham’s insistent pleading, God as it were promised to Abraham that He would not destroy Sodom if ten righteous men could be found in the city: He would save the city for the sake of the ten righteous men within it. Abraham acted as the mortal partner in God’s righteousness; he acted as if he had some share in the responsibility for God’s acting righteously. No wonder God’s covenant with Abraham was incomparably more incisive than His covenant immediately following the flood.
Abraham’s trust in God thus appears to be the trust that God in His righteousness will not do anything incompatible with His righteousness and that while, or because, nothing is too wondrous for the Lord, there are firm boundaries set to Him by His own righteousness, by Himself. This awareness is deepened and therewith modified by the last and severest test of Abraham’s trust: God’s command to him to sacrifice Isaac, his only son by Sarah. Abraham’s supreme test presupposes the wondrous character of Isaac’s birth: the very son who was to be the sole link between Abraham and the chosen people and who was born against all reasonable expectations, was to be sacrificed by his father. This command contradicted not only the divine promise, but also the divine prohibition against the shedding of innocent blood. Yet Abraham did not argue with God as he had done in the case of Sodom’s destruction. In the case of Sodom, Abraham was not confronted with a divine command to do a certain thing and more particularly he was not confronted with a command to surrender to God what was dearest to him: Abraham did not argue with God for the preservation of Isaac because he loved God—not himself or his most cherished hope—with all his heart, with all his soul, and with all his might. The same concern with God’s righteousness that had induced him to plead with God for the preservation of Sodom if ten just men could be found in that city, induced him not to plead for the preservation of Isaac, for God rightfully demands that He alone be loved unqualifiedly. The fact that the command to sacrifice Isaac contradicted the prohibition against the shedding of innocent blood must be understood in the light of the difference between human justice and divine justice: God alone is unqualifiedly, if unfathomably, just. God promised Abraham that He would spare Sodom if ten righteous men could be found in it, and Abraham was satisfied with this promise; He did not promise that He would spare the city if nine righteous men were found in it; would those nine be destroyed together with the wicked? And even if all Sodomites were wicked and hence justly destroyed, did their infants who were destroyed with them deserve their destruction? The apparent contradiction between the command to sacrifice Isaac and the divine promise to the descendants of Isaac is disposed of by the consideration that nothing is too wondrous for the Lord. Abraham’s supreme trust in God, his simple, single-minded, childlike faith was rewarded although, or because, it presupposed his entire unconcern with any reward, for Abraham was willing to forgo, to destroy, to kill the only reward with which he was concerned: God prevented the sacrifice of Isaac. Abraham’s intended action needed a reward although he was not concerned with a reward because his intended action cannot be said to have been intrinsically rewarding. The preservation of Isaac is as wondrous as his birth. These two wonders illustrate more clearly than anything else the origin of the holy nation.
The God Who created heaven and earth, Who is the only God, Whose only image is man, Who forbade man to eat of the tree of knowledge of good and evil, Who made a covenant with mankind after the flood and thereafter a covenant with Abraham which became His covenant with Abraham, Isaac, and Jacob—what kind of God is He? Or, to speak more reverently and more adequately, what is His name? This question was addressed to God Himself by Moses when he was sent by Him to the sons of Israel. God replied: “Ehyeh-Asher-Ehyeh,” which is most often translated: “I am That (Who) I am.” I believe, however, that we ought to render this statement, “I shall be What I shall be,” thus preserving the connection between God’s name and the fact that He makes covenants with men, i.e., that He reveals Himself to men above all by His commandments and by His promises and His fulfillment of those promises. “I shall be What I shall be” is, as it were, explained in the verse (Ex. 33:19), “I shall be gracious to whom I shall be gracious and I shall show mercy to whom I shall show mercy.” God’s actions cannot be predicted, unless He Himself has predicted them, i.e., promised them. But as is shown precisely by the account of Abraham’s binding of Isaac, the way in which He fulfills His promises cannot be known in advance. The biblical God is a mysterious God: He comes in a thick cloud (Ex. 19:9); He cannot be seen; His presence can be sensed but not always and everywhere; what is known of Him is only what He chose to communicate by His word through His chosen servants. The rest of the chosen people knows His word—apart from the Ten Commandments (Deut. 4:12 and 5:4-5)—only mediately and does not wish to know it immediately (Ex. 20:19 and 21; 24:1-2; Deut. 18:15-18; Amos 3:7). For almost all purposes the word of God as revealed to His prophets and especially to Moses became the source of knowledge of good and evil, the true tree of knowledge which is at the same time the tree of life.
Having said this much about the beginning of the Bible and what it entails, let us now cast a glance at some Greek counterparts to the beginning of the Bible—to begin with, at Hesiod’s Theogony as well as the remains of Parmenides’s and Empedocles’s works. They are all the works of known authors. This does not mean that they are, or present themselves as being, merely human. Hesiod sings what the Muses, the daughters of Zeus who is the father of gods and men, taught him or commanded him to sing. One could say that the Muses vouch for the truth of Hesiod’s song, were it not for the fact that they sometimes speak lies which resemble what is true. Parmenides transmits the teaching of a goddess, and so does Empedocles. Yet these men composed their books; their songs or speeches are books. The Bible, on the other hand, is not a book. The most one could say is that it is a collection of books. The author of a book, in the strict sense of the term, excludes everything that is not necessary, that does not fulfill a function necessary to the purpose his book is meant to fulfill. The compilers of the Bible as a whole and of the Torah in particular seem to have followed an entirely different rule. Confronted with a variety of pre-existing holy speeches, which as such had to be treated with the utmost respect, they excluded only what could not by any stretch of the imagination be rendered compatible with the fundamental and authoritative teaching; their very piety, aroused and fostered by the pre-existing holy speeches, led them to make such changes in those holy speeches as they did make. Their work may then abound in contradictions and repetitions that no one ever intended as such, whereas in a book in the strict sense there is nothing that is not intended by the author.
Hesiod’s Theogony sings of the generation or begetting of the gods; the gods were not “made” by anybody. Far from having been created by a god, earth and heaven are the ancestors of the immortal gods. More precisely, according to Hesiod everything that is has come to be. First there arose Chaos, Gaia (Earth), and Eros. Gaia gave birth first to Ouranos (Heaven) and then, mating with Ouranos, she brought forth Kronos and his brothers and sisters. Ouranos hated his children and did not wish them to come to life. At the wish and advice of Gaia, Kronos deprived his father of his generative power and thus unintentionally brought about the emergence of Aphrodite; Kronos became the king of the gods. Kronos’s evil deed was avenged by his son Zeus whom he had generated by mating with Rheia and whom he had planned to destroy; Zeus dethroned his father and thus became the king of the gods, the father of gods and men, the mightiest of all gods. Given his ancestors it is not surprising that while he is the father of men and belongs to the gods who are the givers of good things, he is far from being kind to men. Mating with Mnemosyne, the daughter of Gaia and Ouranos, Zeus generated the nine Muses. The Muses give sweet and gentle eloquence and understanding to the kings whom they wish to honor. Through the Muses there are singers on earth, just as through Zeus there are kings. While kingship and song may go together, there is a profound difference between the two—a difference that, guided by Hesiod, one may compare to that between the hawk and the nightingale. Surely Metis (Wisdom), while being Zeus’s first spouse and having become inseparable from him, is not identical with him; the relation of Zeus and Metis may remind one of the relation of God and wisdom in the Bible.
Hesiod speaks of the creation or making of men not in the Theogony but in his Works and Days, i.e., in the context of his speeches regarding how man should live, regarding man’s right life, which includes the teaching regarding the right seasons (the “days”); the question of the right life does not arise regarding the gods. The right life for man is the just life, the life devoted to working, especially to tilling the soil. Work thus understood is a blessing ordained by Zeus who blesses the just and crushes the proud: often even a whole city is destroyed for the deeds of a single bad man. Yet Zeus takes cognizance of men’s justice and injustice only if he so wills. Accordingly, work appears to be not a blessing but a curse: men must work because the gods keep hidden from them the means of life and they do this in order to punish them for Prometheus’s theft of fire—a theft inspired by philanthropy. But was not Prometheus’s action itself prompted by the fact that men were not properly provided for by the gods and in particular by Zeus? Be this as it may, Zeus did not deprive men of the fire that Prometheus had stolen for them; he punished them by sending them Pandora and her box, which was filled with countless evils such as hard labor. The evils with which human life is beset cannot be traced to human sin. Hesiod conveys the same message by his story of the five races of men which came into being successively. The first of these, the golden race, was made by the gods while Kronos was still ruling in heaven. These men lived without toil or grief; they had all good things in abundance because the earth by itself gave them abundant fruit. Yet the men made by father Zeus lack this bliss. Hesiod does not make clear whether this is due to Zeus’s ill-will or to his lack of power; he gives us no reason to think that it is due to man’s sin. He creates the impression that human life becomes ever more miserable as one race of men succeeds another: there is no divine promise, supported by the fulfillment of earlier divine promises, that permits one to trust and to hope.
The most striking difference between the poet Hesiod and the philosophers Parmenides and Empedocles is that according to the philosophers, not everything has come into being: that which truly is, has not come into being and does not perish. This does not necessarily mean that what always exists is a god or gods. For if Empedocles calls one of the eternal four elements Zeus, this Zeus has hardly anything in common with what Hesiod, or the people generally, understood by Zeus. At any rate, according to both philosophers, the gods as ordinarily understood have come into being, just like heaven and earth, and will therefore perish again.
At the time when the opposition between Jerusalem and Athens reached the level of what one may call its classical struggle, in the 12th and 13th centuries, philosophy was represented by Aristotle. The Aristotelian god, like the biblical God, is a thinking being, but in opposition to the biblical God he is only a thinking being, pure thought: pure thought that thinks itself and only itself. Only by thinking himself and nothing but himself does he rule the world. He surely does not rule by giving orders and laws. Hence he is not a creator-god: the world is as eternal as god. Man is not his image: man is much lower in rank than other parts of the world. For Aristotle it is almost a blasphemy to ascribe justice to his god; he is above justice as well as injustice.
It has often been said that the philosopher who comes closest to the Bible is Plato. This was said not least during the classical struggle between Jerusalem and Athens in the Middle Ages. Both Platonic philosophy and biblical piety are animated by the concern with purity and purification: “pure reason” in Plato’s sense is closer to the Bible than “pure reason” in Kant’s sense or for that matter in Anaxagoras’s and Aristotle’s sense. Plato teaches, just as the Bible does, that heaven and earth were created or made by an invisible God whom he calls the Father, who is eternal, who is good, and hence whose creation is good. The coming-into-being and the preservation of the world that he has created depend on the will of its maker. What Plato himself calls theology consists of two teachings: (1) God is good and hence in no way the cause of evil; (2) God is simple and hence unchangeable. On the question of divine concern with men’s justice and injustice, Platonic teaching is in fundamental agreement with biblical teaching; it even culminates in a statement that agrees almost literally with biblical statements.4 Yet the differences between the Platonic and biblical teachings are no less striking than the similarities. The Platonic teaching on creation does not claim to be more than a likely tale. The Platonic God is a creator also of gods, of visible living beings, i.e., of the stars; the created gods rather than the creator God create the mortal living beings and in particular man; heaven is a blessed god. The Platonic God does not create the world by his word; he creates it after having looked to the eternal ideas which therefore are higher than he. In accordance with this, Plato’s explicit theology is presented within the context of the first discussion of education in the Republic, within the context of what one may call the discussion of elementary education; in the second and final discussion of education—the education of philosophers—theology is replaced by the doctrine of ideas. As for the thematic discussion of providence in the Laws, it may suffice here to say that it occurs within the context of the discussion of penal law.
In his likely tale of how God created the visible whole, Plato makes a distinction between two kinds of gods, the visible cosmic gods and the traditional gods—between the gods who revolve manifestly, i.e., who manifest themselves regularly, and the gods who manifest themselves so far as they will. The least one would have to say is that according to Plato the cosmic gods are of much higher rank than the traditional gods, the Greek gods. Inasmuch as the cosmic gods are accessible to man as man—to his observations and calculations—whereas the Greek gods are accessible only to the Greeks through Greek tradition, one may, in comic exaggeration, ascribe the worship of the cosmic gods to barbarians. This ascription is made in a manner and with an intention altogether non-comic in the Bible: Israel is forbidden to worship the sun and the moon and the stars which the Lord has allotted to the other peoples everywhere under heaven. This implies that the worship of the cosmic gods by other peoples, the barbarians, is not due to a natural or rational cause, to the fact that those gods are accessible to man as man, but to an act of God’s will. It goes without saying that according to the Bible the God Who manifests Himself as far as He wills, Who is not universally worshipped as such, is the only true God. The Platonic statement taken in conjunction with the biblical statement brings out the fundamental opposition of Athens at its peak to Jerusalem: the opposition of the God or gods of the philosophers to the God of Abraham, Isaac, and Jacob, the opposition of reason and revelation.
II. On Socrates and the Prophets
Fifty years ago, in the middle of World War I, Hermann Cohen, the greatest representative of, and spokesman for, German Jewry, the most powerful figure among the German professors of philosophy of his time, stated his view on Jerusalem and Athens in a lecture entitled “The Social Ideal in Plato and the Prophets.” He repeated that lecture shortly before his death, and we may regard it as stating his final view on Jerusalem and Athens and therewith on the truth. For, as Cohen says right at the beginning, “Plato and the prophets are the two most important sources of modern culture.” Being concerned with “the social ideal,” he does not say a single word about Christianity in the whole lecture.
Cohen’s view may be restated as follows. The truth is the synthesis of the teachings of Plato and the prophets. What we owe to Plato is the insight that the truth is in the first place the truth of science but that science must be supplemented, overarched, by the idea of the good which to Cohen means, not God, but rational, scientific ethics. The ethical truth must not only be compatible with the scientific truth; the ethical truth needs the scientific truth. The prophets are very much concerned with knowledge: with the knowledge of God. But this knowledge, as the prophets understood it, has no connection whatever with scientific knowledge; it is knowledge only in a metaphorical sense. It is perhaps with a view to this fact that Cohen speaks once of the divine Plato but never of the divine prophets. Why then can he not leave matters at Platonic philosophy? What is the fundamental defect of Platonic philosophy that is remedied by the prophets and only by the prophets? According to Plato, the cessation of evil requires the rule of the philosophers, of the men who possess the highest kind of human knowledge, i.e., of science in the broadest sense of the term. But this kind of knowledge, like, to some extent, all scientific knowledge, is, according to Plato, the preserve of a small minority: of the men who possess a certain nature and certain gifts that most men lack. Plato presupposes that there is an unchangeable human nature and, as a consequence, a fundamental structure of the good human society which is unchangeable. This leads him to assert or to assume that there will be wars as long as there will be human beings, that there ought to be a class of warriors and that the class ought to be higher in rank and honor than the class of producers and exchangers. These defects in Plato’s system are remedied by the prophets precisely because they lack the idea of science and hence the idea of nature, and therefore they can believe that men’s conduct toward one another can undergo a change much more radical than any change ever dreamed of by Plato.
Cohen brought out very well the antagonism between Plato and the prophets. Nevertheless we cannot leave matters at his view of that antagonism. Cohen’s thought belongs to the world preceding World War I, and accordingly reflects a greater faith in the power of modern Western culture to mold the fate of mankind than seems to be warranted now. The worst things experienced by Cohen were the Dreyfus scandal and the pogroms instigated by Tsarist Russia: he did not experience Communist Russia and Hitler Germany. More disillusioned than he regarding modern culture, we wonder whether the two separate ingredients of modern culture, of the modern synthesis, are not more solid than the synthesis itself. Catastrophes and horrors of a magnitude hitherto unknown, which we have seen and through which we have lived, were better provided for, or made intelligible, by both Plato and the prophets than by the modern belief in progress. Since we are less certain than Cohen was that the modern synthesis is superior to its pre-modern ingredients, and since the two ingredients are in fundamental opposition to each other, we are ultimately confronted by a problem rather than by a solution.
More particularly, Cohen understood Plato in the light of the opposition between Plato and Aristotle—an opposition that he understood in turn in the light of the opposition between Kant and Hegel. We, however, are more impressed than Cohen was by the kinship between Plato and Aristotle on the one hand and the kinship between Kant and Hegel on the other. In other words, the quarrel between the ancients and the moderns seems to us to be more fundamental than either the quarrel between Plato and Aristotle or that between Kant and Hegel.
We, moreover, prefer to speak of Socrates and the prophets rather than of Plato and the prophets, and for the following reasons. We are no longer as sure as Cohen was that we can draw a clear line between Socrates and Plato. There is traditional support for drawing such a clear line, above all in Aristotle; but Aristotle’s statements on this kind of subject no longer possess for us the authority that they formerly possessed, and this is due partly to Cohen himself. The clear distinction between Socrates and Plato is based not only on tradition, but on the results of modern historical criticism; yet these results are in the decisive respect hypothetical. The decisive fact for us is that Plato points, as it were, away from himself to Socrates. If we wish to understand Plato, we must take him seriously; we must take seriously in particular his deference to Socrates. Plato points not only to Socrates’s speeches but to his whole life, and to his fate as well. Hence Plato’s life and fate do not have the symbolic character of Socrates’s life and fate. Socrates, as presented by Plato, had a mission; Plato did not claim to have a mission. It is in the first place this fact—the fact that Socrates had a mission—that induces us to consider, not Plato and the prophets, but Socrates and the prophets.
I cannot speak in my own words of the mission of the prophets. Let me, however, remind the reader of some prophetic utterances of singular force and grandeur. Isaiah 6:
In the year that King Uzziah died I saw also the Lord sitting upon a throne, high and lifted up, and his train filled the temple. Above it stood the seraphim: each one had six wings; with twain he covered his face, and with twain he covered his feet, and with twain he did fly. And one cried unto another, and said, Holy, holy, holy is the Lord of hosts: the whole earth is full of his glory. . . . Then I said, Woe is me! for I am undone; because I am a man of unclean lips, and I dwell in the midst of a people of unclean lips. . . . Then flew one of the seraphim unto me, having a live coal in his hand, which he had taken with the tongs from off the altar: And he laid it upon my mouth, and said, Lo, this hath touched thy lips; and thine iniquity is taken away, and thy sin purged. Also I heard the voice of the Lord, saying, Whom shall I send, and who will go for us? Then said I, Here am I; send me.
Isaiah, it seems, volunteered for his mission. Could he not have remained silent? Could he have refused to volunteer? When the word of the Lord came unto Jonah, “Arise, go to Nineveh, that great city, and cry against it; for their wickedness is come up before me,” “Jonah rose up to flee unto Tarshish from the presence of the Lord”; Jonah ran away from his mission; but God did not allow him to run away; He compelled him to fulfill it. Of this compulsion we hear in different ways from Amos and Jeremiah. Amos 3:7-8: “Surely the Lord God will do nothing, but he revealeth his secret unto his servants the prophets. The lion hath roared, who will not fear? The Lord God hath spoken; who will not prophesy?” The prophets, overpowered by the majesty of the Lord, bring the message of His wrath and His mercy. Jeremiah 1:4-10:
Then the word of the Lord came unto me, saying, Before I formed thee in the belly I knew thee; and before thou camest forth out of the womb I sanctified thee, and I ordained thee a prophet unto the nations. Then said I, Ah, Lord God! behold, I cannot speak; for I am a child. But the Lord said unto me, Say not, I am a child; for thou shalt go to all that I shall send thee, and whatsoever I command thee thou shalt speak. . . . Then the Lord put forth his hand, and touched my mouth. And the Lord said unto me, Behold, I have put my words in thy mouth. See, I have this day set thee over the nations and over the kingdoms, to root out, and to pull down, and to destroy, and to throw down, to build, and to plant.
To be sure, the claim to have been sent by God was raised also by men who were not truly prophets but prophets of falsehood, false prophets. Many or most hearers were therefore uncertain as to which kinds of claimants to prophecy were to be trusted or believed. According to the Bible, the false prophets simply lied in saying that they were sent by God. The false prophets tell the people what the people like to hear; hence they are much more popular than the true prophets. The false prophets are “prophets of the deceit of their own heart” (Jeremiah 23:26); they tell the people what they themselves imagined (consciously or unconsciously) because they wished it or their hearers wished it. But: “Is not my word like as a fire? saith the Lord; and like a hammer that breaketh the rock in pieces?” (23:29). Or, as Jeremiah put it when opposing the false prophet, Hananiah: “The prophets that have been before me and before thee of old prophesied both against many countries, and against great kingdoms, of war, and of evil, and of pestilence” (28:8). This does not mean that a prophet is true only if he is a prophet of doom; the true prophets are also prophets of ultimate salvation. We understand the difference between the true and the false prophets if we listen to and meditate on these words of Jeremiah: “Thus saith the Lord; Cursed be the man that trusteth in man, and maketh flesh his arm, and whose heart departeth from the Lord. . . . Blessed is the man that trusteth in the Lord, and whose hope the Lord is” (17:5, 7). The false prophets trust in flesh, even if that flesh is the temple in Jerusalem, the promised land, the chosen people itself, or even God’s promise to the chosen people (if that promise is taken to be an unconditional promise and not as a part of a covenant). The true prophets, regardless of whether they predict doom or salvation, predict the unexpected, the humanly unforeseeable—what would not occur to men, left to themselves, to fear or to hope. The true prophets speak and act by the spirit and in the spirit of Ehyeh-asher-ehyeh. For the false prophets, on the other hand, there cannot be the wholly unexpected, whether bad or good.
Of Socrates’s mission we know only through Plato’s Apology of Socrates, which presents itself as the speech delivered by Socrates when he defended himself against the charge that he did not believe in the existence of the gods worshipped by the city of Athens and that he corrupted the young. In that speech he denies possessing any more than human wisdom. This denial was understood by Judah Halevi among others as follows: “Socrates said to the people: ‘I do not deny your divine wisdom, but I say that I do not understand it; I am wise only in human wisdom.’”5 While this interpretation points in the right direction, it goes somewhat too far. Socrates, at least, immediately after having denied possessing anything more than human wisdom, refers to the speech that originated his mission, and of this speech he says that it is not his own, though he seems to ascribe to it a divine origin. He does trace what he says to a speaker who is worthy of the Athenians’ credence. But it is probable that he means by that speaker his companion, Chairephon, who is more worthy of credence than Socrates because he was attached to the democratic regime. This Chairephon, having once come to Delphi, asked Apollo’s oracle whether there was anyone wiser than Socrates. The Pythia replied that no one was wiser. This reply originated Socrates’s mission. We see at once that Socrates’s mission originated in human initiative, in the initiative of one of Socrates’s companions. Socrates, on the other hand, takes it for granted that the reply given by the Pythia was given by the god Apollo himself. Yet this does not induce him to take it for granted that the god’s reply is true. He does take it for granted that it is not meet for the god to lie. Yet this does not make the god’s reply convincing to him. In fact he tries to refute that reply by discovering men who are wiser than he. Engaging in this quest, he finds out that the god spoke the truth: Socrates is wiser than other men because he knows nothing, i.e., nothing about the most important things, whereas the others believe that they know the truth about the most important things. Thus his attempt to refute the oracle turns into a vindication of the oracle. Without intending it, he comes to the assistance of the god; he serves the god; he obeys the god’s command. Although no god had ever spoken to him, he is satisfied that the god had commanded him to examine himself and the others, i.e., to philosophize, or to exhort everyone he meets to the practice of virtue: he has been given by the god to the city of Athens as a gadfly.
While Socrates does not claim to have heard the speech of a god, he claims that a voice—something divine and demonic—speaks to him from time to time, his daimonion. This daimonion, however, has no connection with Socrates’s mission, for it never urges him forward but only keeps him back. While the Delphic oracle urged him forward toward philosophizing, toward examining his fellow men, and thus made him generally hated and thus brought him into mortal danger, his daimonion kept him back from political activity and thus saved him from mortal danger.
The fact that both Socrates and the prophets have a divine mission means, or at any rate implies, that both Socrates and the prophets are concerned with justice or righteousness, with the perfectly just society which, as such, would be free of all evils. To this extent Socrates’s figuring out of the best social order and the prophets’ vision of the messianic age are in agreement. Yet whereas the prophets predict the coming of the messianic age, Socrates merely holds that the perfect society is possible: whether it will ever be actual depends on an unlikely, although not impossible, coincidence, the coincidence of philosophy and political power. For, according to Socrates, the coming-into-being of the best political order is not due to divine intervention; human nature will remain as it always has been; the decisive difference between the best political order and all other societies is that in the former the philosophers will be kings or the natural potentiality of the philosophers will reach its utmost perfection. In the most perfect social order, as Socrates sees it, knowledge of the most important things will remain, as it always was, the preserve of the philosophers, i.e., of a very small part of the population. According to the prophets, however, in the messianic age “the earth shall be full of the knowledge of the Lord, as the waters cover the sea” (Isaiah 11:9), and this will be brought about by God Himself. As a consequence, the messianic age will be the age of universal peace: all nations shall come to the mountain of the Lord, to the house of the God of Jacob, “and they shall beat their swords into plowshares, and their spears into pruning hooks: nation shall not lift up sword against nation, neither shall they learn war any more” (Isaiah 2:2-4). The best regime, however, as Socrates envisages it, will animate a single city which, as a matter of course, will become embroiled in wars with other cities. The cessation of evils that Socrates expects from the establishment of the best regime will not include the cessation of war.
Finally, the perfectly just man, the man who is as just as is humanly possible, is, according to Socrates, the philosopher; according to the prophets, he is the faithful servant of the Lord. The philosopher is the man who dedicates his life to the quest for knowledge of the good, of the idea of the good; what we would call moral virtue is only the condition or by-product of that quest. According to the prophets, however, there is no need for the quest for knowledge of the good: God “hath shewed thee, O man, what is good; and what doth the Lord require of thee, but to do justly, and to love mercy, and to walk humbly with thy God?” (Micah 6:8).
1 Cf. U. Cassuto, A Commentary on the Book of Genesis, Part I, Jerusalem, 1961, p. 42.
2 Cf. the characterization of the plants as ἔγγεια (“in or of the earth”) in Plato’s Republic, 491 d 1. Cf. Empedocles A 70.
3 Cf. the distinction between the two kinds of “other gods” in Deut. 4:15-19, between the idols on the one hand and sun, moon, and stars on the other.
4 Compare Plato’s Laws 905 a 4-b 2 with Amos 9:1-3 and Psalm 139:7-10.
5 Kuzari IV, 13 and V, 14.
A foreign-policy approach rooted in security and pragmatism is now characterized by retrenchment and radicalism
And yet realism is currently in crisis.
Realism was once a sophisticated intellectual tradition that represented the best in American statecraft. Eminent Cold War realists were broadly supportive of America’s postwar internationalism and its stabilizing role in global affairs, even as they stressed the need for prudence and restraint in employing U.S. power. Above all, Cold War–era realism was based on a hard-earned understanding that Americans must deal with the geopolitical realities as they are, rather than retreat to the false comfort provided by the Atlantic and Pacific oceans.
More recently, however, those who call themselves realists have lost touch with this tradition. Within academia, realism has become synonymous with a preference for radical retrenchment and the deliberate destruction of arrangements that have fostered international stability and prosperity for decades. Within government, the Trump administration appears to be embracing an equally misguided version of realism—an approach that masquerades as shrewd realpolitik but is likely to prove profoundly damaging to American power and influence. Neither of these approaches is truly “realist,” as neither promotes core American interests or deals with the world as it really is. The United States surely needs the insights that an authentically realist approach to global affairs can provide. But first, American realism will have to undergo a reformation.
The Realist Tradition
Realism has taken many forms over the years, but it has always been focused on the imperatives of power, order, and survival in an anarchic global arena. The classical realists—Thucydides, Machiavelli, Hobbes—considered how states and leaders should behave in a dangerous world in which there was no overarching morality or governing authority strong enough to regulate state behavior. The great modern realists—thinkers and statesmen such as Reinhold Niebuhr, Hans Morgenthau, George Kennan, and Henry Kissinger—grappled with the same issues during and after the catastrophic upheaval that characterized the first half of the 20th century.
They argued that it was impossible to transcend the tragic nature of international politics through good intentions or moralistic maxims, and that seeking to do so would merely empower the most ruthless members of the international system. They contended, on the basis of bitter experience, that aggression and violence were always a possibility in international affairs, and that states that desired peace would thus have to prepare for war and show themselves ready to wield coercive power. Most important, realist thinkers tended to place a high value on policies and arrangements that restrained potential aggressors and created a basis for stability within an inherently competitive global environment.
For this very reason, leading Cold War–era realists advocated a robust American internationalism as the best way of restraining malevolent actors and preventing another disastrous global crack-up—one that would inevitably reach out and touch the United States, just as the world wars had. Realist thinkers understood that America was uniquely capable of stabilizing the international order and containing Soviet power after World War II, even as they disagreed—sometimes sharply—over the precise nature and extent of American commitments. Moreover, although Cold War realists recognized the paramount role of power in international affairs, most also recognized that U.S. power would be most effective if harnessed to a compelling concept of American moral purpose and exercised primarily through enduring partnerships with nations that shared core American values. “An idealistic policy undisciplined by political realism is bound to be unstable and ineffective,” the political scientist Robert Osgood wrote. “Political realism unguided by moral purpose will be self-defeating and futile.” Most realists were thus sympathetic to the major initiatives of postwar foreign policy, such as the creation of U.S.-led military alliances and the cultivation of a thriving Western community composed primarily of liberal democracies.
At the same time, Cold War realists spoke of the need for American restraint. They worried that America’s liberal idealism, absent a sense of limits, would carry the country into quixotic crusades. They thought that excessive commitments at the periphery of the global system could weaken the international order against its radical challengers. They believed that a policy of outright confrontation toward the Kremlin could be quite dangerous. “Absolute security for one power means absolute insecurity for all others,” Kissinger wrote. Realists therefore advocated policies meant to temper American ambition and the most perilous aspects of superpower competition. They supported—and, in Kissinger’s case, led—arms-control agreements and political negotiations with Moscow. They often objected to America’s costliest interventions in the Third World. Kennan and Morgenthau were among the first mainstream figures to go public with opposition to American involvement in Vietnam (Morgenthau did so in the pages of Commentary in May 1962).
During the Cold War, then, realism was a supple, nuanced doctrine. It emphasized the need for balance in American statecraft—for energetic action blended with moderation, for hard-headed power politics linked to a regard for partnerships and values. It recognized that the United States could best mitigate the tragic nature of international relations by engaging with, rather than withdrawing from, an imperfect world.
This nuance has now been lost. Academics have applied the label of realism to dangerous and unrealistic policy proposals. More disturbing and consequential still, the distortion of realism seems to be finding a sympathetic hearing in the Trump White House.
Realism as Retrenchment
Consider the state of academic realism. Today’s most prominent self-identified realists—Stephen Walt, John Mearsheimer, Barry Posen, and Christopher Layne—advocate a thoroughgoing U.S. retrenchment from global affairs. Whereas Cold War realists were willing to see the world as it was—a world that required unequal burden-sharing and an unprecedented, sustained American commitment to preserve international stability—academic realists now engage in precisely the wishful thinking that earlier realists deplored. They assume that the international order can essentially regulate itself and that America will not be threatened by—and can even profit from—a more unsettled world. They thus favor discarding the policies that have proven so successful over the decades in providing a congenial international climate.
Why has academic realism gone astray? If the Cold War brokered the marriage between realists and American global engagement, the end of the Cold War precipitated a divorce. Following the fall of the Soviet Union, U.S. policymakers continued to pursue an ambitious global agenda based on preserving and deepening both America’s geopolitical advantage and the liberal international order. For many realists, however, the end of the Cold War removed the extraordinary threat—an expansionist USSR—that had led them to support such an agenda in the first place. Academic realists argued that the humanitarian interventions of the 1990s (primarily in the former Yugoslavia) reflected capriciousness rather than a prudent effort to deal with sources of instability. Similarly, they saw key policy initiatives—especially NATO enlargement and the Iraq war of 2003—as evidence that Washington was no longer behaving with moderation and was itself becoming a destabilizing force in global affairs.
These critiques were overstated, but not wholly without merit. The invasion and occupation of Iraq did prove far costlier than expected, as the academic realists had indeed warned. NATO expansion—even as it successfully promoted stability and liberal reform in Eastern Europe—did take a toll on U.S.–Russia relations. Having lost policy arguments that they thought they should have won, academic realists decided to throw the baby out with the bathwater, calling for a radical reformulation of America’s broader grand strategy.
The realists’ preferred strategy has various names—“offshore balancing,” “restraint,” etc.—but the key components and expectations are consistent. Most academic realists argue that the United States should pare back or eliminate its military alliances and overseas troop deployments, going back “onshore” only if a hostile power is poised to dominate a key overseas region. They call on Washington to forgo costly nation-building and counterinsurgency missions overseas and to downgrade if not abandon the promotion of democracy and human rights.
Academic realists argue that this approach will force local actors in Europe, the Middle East, and East Asia to assume greater responsibility for their own security, and that the United States can manipulate—through diplomacy, arms sales, and covert action—the resulting rivalries and conflicts to prevent any single power from dominating a key region and thereby threatening the United States. Should these calculations prove faulty and a hostile power be poised to dominate, Washington can easily swoop in to set things aright, as it did during the world wars. Finally, if even this calculation were to prove faulty, realists argue that America can ride out the danger posed by a regional hegemon because the Atlantic and Pacific Oceans and America’s nuclear deterrent provide geopolitical immunity against existential threats.
Today’s academic realists portray this approach as hard-headed, economical strategy. But in reality, it represents a stark departure from classical American realism. During the Cold War, leading realists placed importance on preserving international stability and heeded the fundamental lesson of World Wars I and II—that the United States, by dint of its power and geography, was the only actor that could anchor international arrangements. Today’s academic realists essentially argue that the United States should dismantle the global architecture that has undergirded the international order—and that Washington can survive and even thrive amid the ensuing disorder. Cold War realists helped erect the pillars of a peaceful and prosperous world. Contemporary academic realists advocate tearing down those pillars and seeing what happens.
The answer is “nothing good.” Contemporary academic realists sit atop a pyramid of faulty assumptions. They assume that one can remove the buttresses of the international system without that system collapsing, and that geopolitical burdens laid down by America will be picked up effectively by others. They assume that the United States does not need the enduring relationships that its alliances have fostered, and that it can obtain any cooperation it needs via purely transactional interactions. They assume that a world in which the United States ceases to promote liberal values will not be a world less congenial to America’s geopolitical interests. They assume that revisionist states will be mollified rather than emboldened by an American withdrawal, and that the transition from U.S. leadership to another global system will not unleash widespread conflict. Finally, they assume that if such upheaval does erupt, the United States can deftly manage and even profit from it, and that America can quickly move to restore stability at a reasonable cost should it become necessary to do so.
The founding generation of American realists had learned not to indulge in the wishful thinking that the international order would create or sustain itself, or that the costs of responding to rampant international disorder would be trivial. Today’s academic realists, by contrast, would stake everything on a leap into the unknown.
For many years, neither Democratic nor Republican policymakers were willing to make such a leap. Now, however, the Trump administration appears inclined to embrace its own version of foreign-policy realism, one that bears many similarities to—and contains many of the same liabilities as—the academic variant. One of the least academic presidents in American history may, ironically, be buying into some of the most misguided doctrines of the ivory tower.
Any assessment of the Trump administration must remain somewhat provisional, given that Donald Trump’s approach to foreign policy is still a work in progress. Yet Trump and his administration have so far taken multiple steps to outline a three-legged-stool vision of foreign policy that they explicitly describe as “realist” in orientation. Like modern-day academic realism, however, this vision diverges drastically from the earlier tradition of American realism and leads to deeply problematic policy.
The first leg is President Trump’s oft-stated view of the international environment as an inherently zero-sum arena in which the gains of other countries are America’s losses. The post–World War II realists, by contrast, believed that the United States could enjoy positive-sum relations with like-minded nations. Indeed, they believed that America could not enjoy economic prosperity and national security unless its major trading partners in Europe and Asia were themselves prosperous and stable. The celebrated Marshall Plan was high-mindedly generous in the sense of addressing urgent humanitarian needs in Europe, yet policymakers very much conceived of it as serving America’s parochial economic and security interests at the same time. President Trump, however, sees a winner and loser in every transaction, and believes—with respect to allies and adversaries alike—that it is the United States that generally gets snookered. The “reality” at the core of Trump’s realism is his stated belief that America is exploited “by every nation in the world virtually.”
This belief aligns closely with the second leg of the Trump worldview: the idea that all foreign policy is explicitly competitive in nature. Whereas the Cold War realists saw a Western community of states, President Trump apparently sees a dog-eat-dog world where America should view every transaction—even with allies—on a one-off basis. “The world is not a ‘global community’ but an arena where nations, nongovernmental actors and businesses engage and compete for advantage,” wrote National Security Adviser H.R. McMaster and National Economic Council Director Gary Cohn in an op-ed. “Rather than deny this elemental nature of international affairs, we embrace it.”
To be sure, Cold War realists were deeply skeptical about “one worldism” and appeals to a global community. But still they saw the United States and its allies as representing the “free world,” a community of common purpose forged in the battle against totalitarian enemies. The Trump administration seems to view U.S. partnerships primarily on an ad hoc basis, and it has articulated something akin to a “what have you done for me lately” approach to allies. The Cold War realists—who understood how hard it was to assemble effective alliances in the first place—would have found this approach odd in the extreme.
Finally, there is the third leg of Trump’s “realism”: an embrace of amorality. President Trump has repeatedly argued that issues such as the promotion of human rights and democracy are merely distractions from “winning” in the international arena and a recipe for squandering scarce resources. On the president’s first overseas trip to the Middle East in May, for instance, he promised not to “lecture” authoritarian countries on their internal behavior, and he made clear his intent to embrace leaders who back short-term U.S. foreign-policy goals no matter how egregious their violations of basic human rights and political freedoms. Weeks later, on a visit to Poland, the president did speak explicitly about the role that shared values played in the West’s struggle against Communism during the Cold War, and he invoked “the hope of every soul to live in freedom.” Yet his speech contained only the most cursory reference to Russia—the authoritarian power now undermining democratic governance and security throughout Europe and beyond. Just as significant, Trump failed to mention that Poland itself—until a few years ago, a stirring exemplar of successful transition from totalitarianism to democracy—is today sliding backwards toward illiberalism (as are other countries within Europe and the broader free world).
At first glance, this approach might seem like a modern-day echo of Cold War debates about whether to back authoritarian dictators in the struggle against global Communism. But, as Jeane Kirkpatrick explained in her famous 1979 Commentary essay “Dictatorships and Double Standards,” and as Kissinger himself frequently argued, Cold War realists saw such tactical alliances of convenience as being in the service of a deeper values-based goal: the preservation of an international environment favoring liberty and democracy against the predations of totalitarianism. Moreover, they understood that Americans would sustain the burdens of global leadership over a prolonged period only if motivated by appeals to their cherished ideals as well as their concrete interests. Trump, for his part, has given only faint and sporadic indications of any appreciation of the traditional role of values in American foreign policy.
Put together, these three elements have profound, sometimes radical, implications for America’s approach to a broad range of global issues. Guided by this form of realism, the Trump administration has persistently chastised and alienated long-standing democratic allies in Europe and the Asia-Pacific and moved closer to authoritarians in Saudi Arabia, China, and the Philippines. The president’s body language alone has been striking: Trump’s summits have repeatedly showcased conviviality with dictators and quasi-authoritarians and painfully awkward interactions with democratic leaders such as Germany’s Angela Merkel. Similarly, Trump has disdained international agreements and institutions that do not deliver immediate, concrete benefits for the United States, even if they are critical to forging international cooperation on key issues or advancing longer-term goods. As Trump has put it, he means to promote the interests of Pittsburgh, not Paris, and he believes that those interests are inherently at odds with each other.
To be fair, President Trump and his proxies do view the war on terror as a matter of defending both American security interests and Western civilization’s values against the jihadist onslaught. This was a key theme of Trump’s major address in Warsaw. Yet the administration has not explained how this civilizational mindset would inform any other aspect of its foreign policy—with the possible exception of immigration policy—and resorts far more often to the parochial lens of nationalism.
The Trump administration seems to be articulating a vision in which America has no lasting friends, little enduring concern with values, and even less interest in cultivating a community of like-minded nations that exists for more than purely deal-making purposes. The administration has often portrayed this as clear-eyed realism, even invoking the founding father of realism, Thucydides, as its intellectual lodestar. This approach does bear some resemblance to classical realism: an unsentimental approach to the world with an emphasis on the competitive aspects of the international environment. And insofar as Trump dresses down American allies, rejects the importance of values, and focuses on transactional partnerships, his version of realism has quite a lot in common with the contemporary academic version.
Daniel Drezner of Tufts University has noted the overlap, declaring in a Washington Post column, “This is [academic] realism’s moment in the foreign policy sun.” Randall Schweller of Ohio State University, an avowed academic realist and Trump supporter, has been even more explicit, noting approvingly that “Trump’s foreign-policy approach essentially falls under the rubric of ‘off-shore balancing’” as promoted by ivory-tower realists in recent decades.
Yet one suspects that the American realists who helped create the post–World War II order would not feel comfortable with either the academic or Trumpian versions of realism as they exist today. For although both of these approaches purport to be about power and concrete results, both neglect the very things that have allowed the United States to use its power so effectively in the past.
Both the academic and Trump versions of realism ignore the fact that U.S. power is most potent when it is wielded in concert with a deeply institutionalized community of like-minded nations. Alliances are less about addition and subtraction—the math of the burden-sharing emphasized by Trump and the academic realists—and more about multiplication, leveraging U.S. power to influence world events at a fraction of the cost of unilateral approaches. The United States would be vastly less powerful and influential in Europe and Central Asia without NATO; it would encounter far greater difficulties in rounding up partners to wage the ongoing war in Afghanistan or defeat the Islamic State; it would find itself fighting alone—rather than with some of the world’s most powerful partners—far more often. Likewise, without its longstanding treaty allies in Asia, the United States would be at an almost insurmountable disadvantage vis-à-vis revisionist powers in that region, namely China.
Both versions of realism also ignore the fact that America has been able to exercise its enormous power with remarkably little global resistance precisely because American leaders, by and large, have paid sufficient regard to the opinions of potential partners. Of course, every administration has sought to “put America first,” but the pursuit of American self-interest has proved most successful when it enjoys the acquiescence of other states. Likewise, the academic and Trump versions of realism too frequently forget that America draws power by supporting values with universal appeal. This is why every American president from Franklin Roosevelt to Barack Obama has recognized that a more democratic world is likely to be one that is both ideologically and geopolitically more congenial to the United States.
Most important, both the academic and Trump versions of realism ignore the fact that the classical post–World War II realists deliberately sought to overcome the dog-eat-dog world that modern variants take as a given. They did so by facilitating cooperation within the free world, suppressing the security competitions that had previously led to cataclysmic wars, creating the basis for a thriving international economy, and thereby making life a little less nasty, brutish, and short for Americans as well as for vast swaths of the world’s population.
If realism is about maximizing power, effectiveness, and security in a competitive global arena, then neither the academic nor the Trump versions of realism merits the name. And if realism is meant to reflect the world as it is, both of these versions are deeply deficient.
This is a tragedy. For if ever there were a moment for an informed realism, it would be now, as the strategic horizon darkens and a more competitive international environment reemerges. There is still time for Trump and his team to adapt, and realism can still make a constructive contribution to American policy. But first it must rediscover its roots—and absorb the lessons of the past 70 years.
The Seven Pillars of Realism
A reformed realism should be built upon seven bedrock insights, which President Trump would do well to embrace.
First, American leadership remains essential to restraining global disorder. Today’s realists channel the longstanding American hope that there would come a time when the United States could slough off the responsibilities it assumed after World War II and again become a country that relies on its advantageous geography to keep the world at arm’s length. Yet realism compels an awareness that America is exceptionally suited to the part it has played for nearly four generations. The combination of its power, geographic location, and values has rendered America uniquely capable of providing a degree of global order in a way that is more reassuring than threatening to most of the key actors in the international system. Moreover, given that today the most ambitious and energetic international actors besides the United States are not liberal democracies but aggressive authoritarian powers, an American withdrawal is unlikely to produce multipolar peace. Instead, it is likely to precipitate the upheaval that U.S. engagement and activism have long been meant to avert. As a corollary, realists must also recognize that the United States is unlikely to thrive amid such upheaval; it will probably find that the disorder spreads and ultimately implicates vital American interests, as was twice the case in the first half of the 20th century.
Second, true realism recognizes the interdependence of hard and soft power. In a competitive world, there is no substitute for American hard power, and particularly for military muscle. Without guns, there will not—over the long term—be butter. But military power, by itself, is an insufficient foundation for American strategy. A crude reliance on coercion will damage American prestige and credibility in the end; hard power works best when deployed in the service of ideas and goals that command widespread international approval. Similarly, military might is most effective when combined with the “softer” tools of development assistance, foreign aid, and knowledge of foreign societies and cultures. The Trump administration has sought to eviscerate these nonmilitary capabilities and bragged about its “hard-power budget”; it would do better to understand that a balance between hard and soft power is essential.
Third, values are an essential part of American realism. Of course, the United States must not undertake indiscriminate interventions in the name of democracy and human rights. But, fortunately, no serious policymaker—not Woodrow Wilson, not Jimmy Carter, not George W. Bush—has ever embraced such a doctrine. What most American leaders have traditionally recognized is that, on balance, U.S. interests will be served and U.S. power will be magnified in a world in which democracy and human rights are respected. Ronald Reagan, now revered for his achievements in improving America’s global position, understood this point and made the selective promotion of democracy—primarily through nonmilitary means—a key part of his foreign policy. While paying due heed to the requirements of prudence and the limits of American power, then, American realists should work to foster a climate in which those values can flourish.
Fourth, a reformed realism requires aligning relations with the major powers appropriately—especially today, as great-power tensions rise. That means appreciating the value of institutions that have bound the United States to some of the most powerful actors in the international system for decades and thereby given Washington leadership of the world’s dominant geopolitical coalition. It means not taking trustworthy allies for granted or picking fights with them gratuitously. It also means not treating actual adversaries, such as Vladimir Putin’s Russia, as if they were trustworthy partners (as Trump has often talked of doing) or as if their aggressive behavior were simply a defensive response to American provocations (as many academic realists have done). A realistic approach to American foreign policy begins by seeing great-power relations through clear eyes.
Fifth, limits are essential. Academic realists are wrong to suggest that values should be excised from U.S. policy; they are wrong to argue that the United States should pull back dramatically from the world. Yet they are right that good statecraft requires an understanding of limits—particularly for a country as powerful as the United States, and particularly at a time when the international environment is becoming more contested. The United States cannot right every wrong, fix every problem, or defend every global interest. America can and should, however, shoulder more of the burden than modern academic and Trumpian realists believe. The United States will be effective only if it chooses its battles carefully; it will need to preserve its power for dealing with the most pressing threat to its national interests and the international order—the resurgence of authoritarian challenges—even if that means taking an economy-of-force approach to other issues.
Sixth, realists must recognize that the United States has not created and sustained a global network of alliances, international institutions, and other embedded relationships out of a sense of charity. It has done so because those relationships provide forums through which the United States can exercise power at a bargain-basement price. Embedded relationships have allowed the United States to rally other nations to support American causes from the Korean War to the counter-ISIS campaign, and have reduced the transaction costs of collective action to meet common threats from international terrorism to piracy. They have provided institutional megaphones through which the United States can amplify its diplomatic voice and project its influence into key issues and regions around the globe. If these arrangements did not exist, the United States would find itself having to create them, or acting unilaterally at far greater cost. If realism is really about maximizing American power, true realists ought to be enthusiastic about relationships and institutions that serve that purpose. Realists should adopt the approach that every post–Cold War president has embraced: that the United States will act unilaterally in defense of its interests when it must, but multilaterally with partners whenever it can.
Finally, realism requires not throwing away what has worked in the past. One of the most astounding aspects of both contemporary academic realism and the Trumpian variant of that tradition is the cavalier attitude they display toward arrangements and partnerships that have helped produce a veritable golden age of international peace, stability, and liberalism since World War II, and that have made the United States the most influential and effective actor on the globe in the process. Of course, there have been serious and costly conflicts over the past decades, and U.S. policy has always been thoroughly imperfect. But the last 70 years have been remarkably good ones for U.S. interests and the global order—whether one compares them with the 70 years before the United States adopted its global leadership role, or compares them with the violent disorder that would have emerged if America had followed the nostrums peddled today under the realist label. A doctrine that stresses the importance of prudence and discretion, and that was originally conservative in its preoccupation with stability and order, ought not to pursue radical changes in American statecraft or embrace a “come what may” approach to the world. Rather, such a doctrine ought to recognize that true achievements are enormously difficult to come by—and that the most realistic approach to American strategy would thus be to focus on keeping a good thing going.
The story of Britain’s unknown neoconservatives
During the decade that followed, the prospects of “the sick man of Europe” were seemingly transformed. With the free market unleashed and the authority of the democratic government restored, inflation fell, growth resumed, and the unions were tamed. Britain became the laboratory for an experiment—privatization—that would transform not just its economy, but that of many countries throughout the world that came to look to it for inspiration.
More than any other Briton, one person was responsible for this about-turn: Margaret Thatcher. The foundations for what came to be known as the Thatcher revolution were laid in the four years she spent as leader of the Opposition before the Conservative Party she led was returned to power at the 1979 general election. During this period, much of the groundwork was done by a curious and unlikely triumvirate. Thatcher, the daughter of a shopkeeper and Methodist lay preacher from the provincial Middle England town of Grantham, was both the leader and the follower of the other two. They were Sir Keith Joseph, the scion of a wealthy Anglo-Jewish family, and Alfred Sherman, a former Communist working-class Jew from London’s East End whose parents had fled Czarist Russia.
Traditionally, the relationship between Jews and the Conservative Party had been one of mutual distrust. It was the Tories, for instance, who had attempted to shut the door to Jewish immigrants at the turn of the 20th century, while it was the Labour Party in which many of their sons and daughters would find a sympathetic home. An all-too-common mix of snobbery and anti-Semitism dominated the upper echelons of the Conservative Party, seemingly undisturbed by the fact that, by the 1930s, upward mobility began to enable some Jews to leave behind the socialist citadels of the inner cities and find a home in Tory-voting suburbia.
After the war, the association between the Tory Party and prewar appeasement, indifference verging on hostility to the birth of the state of Israel, and occasional manifestations of anti-Semitism among its grassroots membership meant that many Jews continued to shun it. There were only two Jews on the Tory benches in the House of Commons in the 25 years between 1945 and 1970—as against, at its peak, 38 Jewish Labour MPs in 1966. During the 1970s, this began to shift: Further demographic changes within the Jewish community, Labour’s drift toward anti-Zionism, and the more meritocratic bent of the Conservative Party, begun under Prime Minister Ted Heath (1970–74) and accelerated by Thatcher, dramatically increased the number of Jews voting Tory and sitting on the party’s benches in parliament.
If the Tory Party had historically been unwelcoming toward Jews, it had also had little time for intellectuals. While the notion of the Conservatives as the “stupid party,” as Britain’s only Jewish prime minister called it, was overblown, it was also true that many Tories regarded ideas and those who traded in them as suspect and a distraction from the party’s mission to govern the nation unencumbered by the kind of intellectual baggage that might hinder its ruthlessly successful pursuit of power.
Thatcher, Joseph, and Sherman would change all that.
When Thatcher unseated Heath as the Conservative Party’s leader in February 1975, the party was suffering an acute crisis of confidence. Heath had lost three of the four elections he had fought against Labour’s wily leader, Harold Wilson. The previous October, the Tories had received their lowest share of the vote since 1945.
These political problems were accompanied by—indeed, caused by, Thatcher was certain—a lack of self-belief. For three decades, the Tories had embraced the postwar consensus of Keynesian economics and a welfare state. In 1970, the party’s “Selsdon Manifesto” had promised to break with that ignoble history by freeing up the economy, reining in government, and clipping the wings of the nation’s powerful trade unions. But, barely two years in office, Heath’s government had buckled at the first sign of resistance and executed a less than gracious U-turn: caving in to the miners in the face of a strike and rolling back some newly introduced restrictions on the unions; ditching fiscal caution in an ill-fated “dash for growth”; and introducing wage and price controls. Its Industry Act, crowed the leader of Labour’s left, Tony Benn, was “spadework for socialism.” As members of the Heath government, Thatcher and Joseph—respectively responsible for the high-spending education and health departments—were implicated in this intellectual and political betrayal. But, unlike many of their colleagues, the two most economically conservative members of Heath’s Cabinet were determined it would be the last.
The son of a former lord mayor of London, Joseph was an improbable revolutionary by both background and temperament. Sherman would later note his ally’s “tendency to wilt under pressure” and aversion to conflict.
And yet Joseph was to be the man who lit the touch paper that, as Sherman put it, “sparked off the Thatcher revolution.”
Thatcher and Joseph shared a common attribute: the sense that they were both outsiders. Hers stemmed from her grocer’s-daughter upbringing, the snobbery and disdain she encountered at Oxford from both the upper-class grandees of the Conservative Association and the liberal intelligentsia that dominated its academic body, and later, her gender, as she sought a safe Tory seat.
His originated from his Judaism. In later life, Joseph suggested that the advantage of being Jewish was that to be successful, “you have to spark on all four cylinders.” To put it less positively, Jews faced greater barriers to achievement than others and so had to be twice as able. Despite his rapid rise through the Tory ranks once he had entered parliament in 1956, Joseph remained, in the words of one observer, “almost alien.” Nonetheless, Joseph was very much in the mainstream of postwar moderate Conservatism. He combined a liberal social outlook and concern for the poor with a belief in the importance of entrepreneurship.
Occasionally, as when the Conservatives lost power in 1964, Joseph would signal dissent with the leftward direction in which his party was drifting. In a series of speeches and articles, he bemoaned the Tories’ failure to free Britain from the collectivist constraints Labour had imposed upon it after the war, talking of the need to cut taxes further, give business greater freedom, and, perhaps most significantly for the future, raise the then virtually unheard-of prospect of privatization.
But for the most part he toed the party line, as did Thatcher. Neither indicated any personal misgivings or public signs of disagreement when Heath abandoned the free-market program on which the Conservative government had been elected in 1970.
Joseph’s weakness at this critical moment escaped neither the wrath nor the attention of Alfred Sherman. Sherman’s upbringing in the East End of London was one, he later suggested, in which “you were born a socialist, you didn’t have to become one.”
Struggling to assimilate against a backdrop of barely disguised official anti-Semitism, Sherman became a Communist. “When we deserted the God of our fathers,” he wrote, “we were bound to go whoring after strange gods, of which socialism in its various forms was a prominent choice.” At 17, he went to war in Spain. His turn from Marxism came after World War II, when he studied at the London School of Economics and came upon F.A. Hayek’s The Road to Serfdom. It “set him thinking”—and in 1948 he was expelled from the Communist Party for “deviationism.” In the unpromising terrain of 1950s socialist Israel, where he went to work as an economic advisor, he developed his fervent support for the free market. It was a cause he would vociferously promote on his return to Britain.
The two future collaborators in the Thatcher project first met when Sherman—at this point a journalist for the Daily Telegraph, the house journal of the Conservative Party—came to interview Joseph shortly after he had become a Cabinet minister in 1962. Sherman soon began to help write Joseph’s speeches, including those in which, before the Tories’ return to government in 1970, Joseph first began to tentatively break with the postwar consensus. Sherman was thus dismayed not only by the Heath government’s abandonment of its pre-election free-market pledges, but also by Joseph’s supposed connivance in this betrayal. He later labeled his friend “a lion in opposition and a lamb in government.”
But the shattering blow of the Tories’ ejection from office in 1974 at the hands of the unions brought the two men back together. “Keith,” Sherman bluntly told Joseph over lunch one day, “the trouble is that you agree with me but you haven’t got the backbone to say so.” While Sherman was a Conservative, his disdain for the establishment did not recognize party labels. The Tories, he believed, appeared to judge virtue by the measure of whether it won them elections. The free-market revolution that he wanted Joseph to lead was designed not simply to sweep away socialism, but to cleanse the Conservative Party of its postwar ideological sins. And so it was that, with Sherman acting as his confessor, Joseph underwent his very public recantation and conversion to Conservatism.
What Sherman would later dub “the London Spring” commenced on June 24, 1974, when Joseph delivered the first of a series of speeches eviscerating the Tories’ record and his own part in it. The introductory lines of this first speech, drafted by Sherman, represented the opening volley in what was to become a five-year assault on the postwar settlement:
This is no time to be mealy-mouthed. Since the end of the Second World War we have had altogether too much Socialism. … For half of that 30 years Conservative Governments, for understandable reasons, did not consider it practicable to reverse the vast bulk of the accumulating detritus of Socialism which on each occasion they found when they returned to office.
Just over two months later, on the eve of 1974’s second election, called by Labour’s Harold Wilson to boost his weak parliamentary position, Joseph returned to the fray with a speech at Preston. He assailed the last Tory government for abandoning “sound money policies,” suggested that it had been debilitated by an unwarranted fear of unemployment, and warned that inflation was “threatening to destroy our society.” His solution—neither “easy nor enjoyable”—was to cut the deficit, gradually bear down on the money supply, and accept the resultant risk of a temporary increase in unemployment.
This was the moment at which the Tories began to break with the principal tenet of Keynesianism—that government’s overriding goal should be to secure full employment. As Thatcher argued in her memoirs, it was “one of the very few speeches which have fundamentally affected a political generation’s way of thinking.” A decade later, when she had been prime minister for five years, the import of Joseph’s words in Preston was clearer still. By that point, Britain was being led by a woman whose government had broken decisively with the policies of its predecessors, placed the defeat of inflation above that of unemployment, and turned monetarism into its economic lodestar. Thatcher had determined that she would not, as Joseph had cautioned against, “be stampeded again” into a Heath-like surrender to Keynes.
But at the time, Thatcher’s response to the Tory defeat in February 1974 was publicly muted. Her pronouncements—“I think we shall finish up being the more radical party”—verged on the anodyne. Yet she did become a vice-chair of the new Centre for Policy Studies, the think tank that Joseph and Sherman had established to “question the unquestioned, think the unthinkable, [and] blaze a trail,” in Sherman’s words. Not for nothing would Geoffrey Howe describe Sherman as “a zealot of the right.” During this period, as she later acknowledged, Thatcher “learned a great deal” from Sherman and Joseph. She began to attend lunches and seminars at the free-market Institute of Economic Affairs think tank and, as the IEA’s co-founder Lord Harris of High Cross said, to “ponder our writing and our authors’ publications.”
That Joseph would lead while Thatcher followed was not, then, surprising. She had always regarded him as “the senior partner” in their close political friendship. Thatcher urged Joseph to challenge Heath for the Tory Party leadership and discouraged speculation that she herself might seek it. Then Joseph delivered an ill-advised speech on social policy in which he suggested that “the balance of our population, our human stock is threatened” by the birth rates of the poor. It led to a media furor and the abandonment of his still-embryonic campaign. Frustrated, Thatcher stepped into the breach. Two months later, she was elected leader.
In her campaign to take command of the Conservative Party, Thatcher sounded many of the same notes as Joseph: that voters believed too many Conservatives “had become Socialists already” and that Britain was moving inexorably in the direction of socialism, taking “two steps forward” under Labour, but only “half a step back” under the Tories. Nonetheless, she was under no illusions that her victory in the leadership election represented a “wholesale conversion” by the party to her and Joseph’s way of thinking. Over the next four years, the support and counsel of Joseph would prove invaluable.
Thatcher had, in the words of one of her Downing Street policy advisors, “no interest in ideas for their own sake,” but she did regard politics as a clash of opposing philosophies. “We must have an ideology,” she declared to the Conservative Philosophy Group, which was formed in the year she became party leader. “The other side have got an ideology they can test their policies against.” She thus looked to Joseph and Sherman to articulate her “beliefs, feelings, instincts, and intuitions into ideas, strategies, and policies,” in Sherman’s telling. They were the builders of the intellectual edifice for the instincts—that “profligacy was a vice” and government, like a prudent household, should live within its means—that, Thatcher proudly declared, she had learned from “the world in which I grew up.”
Many Tories regarded the very notion of a “battle of ideas” as dangerous nonsense. For others, it was the ideas themselves that were suspect. When Joseph presented a paper in April 1975 urging a break with the “path of consensus” and a much greater defense of “what some intellectuals disparagingly call ‘middle-class suburban values,’ a desire to enjoy economic independence, to be well thought of, patriotism,” it met with a furious response from the Tory Shadow Cabinet. Joseph’s call for the Conservatives to push an agenda of higher defense spending, an assault on union power, deep cuts in public expenditure, and measures to curb immigration and bolster the family was greeted with horror by his colleagues. But as Thatcher’s biographer, Charles Moore, has noted, “this startling paper furnished the main elements of what came to be called Thatcherism, both in specific policy and in general psychological terms.”
Meanwhile, memos, letters, and speeches poured forth from Sherman, invariably urging Thatcher and Joseph to go further and faster. With Sherman as his navigator and companion, Joseph himself assumed the role of outrider—“the licensed thinker scouting ahead in Indian country,” as future MP and Cabinet minister Oliver Letwin put it—helping to open up new territory for the Tory leader to occupy when she deemed it politically safe to do so. Her political antennae, much sharper and more finely attuned than those of Joseph or Sherman, proved critical to this creative mix. The two men drew fire from the Tory old guard, allowing Thatcher to rise above the fray and later make public pronouncements that frequently followed the Joseph-Sherman line.
Joseph marked the territory between the two camps clearly. He urged the Tories to reach for the “common ground.” He did not mean the centrist midpoint between the two main parties’ positions, which had been the Conservative approach since the end of the war. He meant the territory where a majority of the public found itself, on the opposite side of the political establishment. As Sherman wrote to Thatcher, in trying to compete with Labour in the ephemeral center ground, the Tories had abandoned the defense of those values—“patriotism, the puritan ethic, Christianity, conventional family-based morality”— that most voters supported. More prosaically, he urged her to speak out on issues such as “national identity, law and order, and scrounging.” He thus provided her with an electoral and moral justification for pursuing a populist political strategy that dovetailed with her own instinctive convictions.
This son of Jewish immigrants would later speak of his disapproval of the term “Judeo-Christian values” and would insist that Thatcher should root her message in her own Methodist upbringing and the Tories’ close relationship with Britain’s Established Church. Thatcher proved more ecumenical. As her close friendship with Chief Rabbi Immanuel Jakobovits illustrated, she saw, and often remarked upon, the close harmony between Judaism and the nonconformist insistence on individual responsibility, community self-help, and the moral necessity of self-improvement and wealth creation imparted by her father. Not for nothing would the Sunday Telegraph later admiringly suggest during her premiership that Judaism had become “the new creed of Thatcherite Britain.”
Sherman’s early political convictions had both positive and negative ramifications. Thatcher said he brought a “convert’s zeal to the task of plotting out a new kind of free-market Conservatism.” What Sherman referred to as his “Communist decade,” he wrote, had taught him “to think big, to believe that, aligned with the forces of history, a handful of people with sufficient faith could move mountains.” His understanding of the left also allowed him to recognize, in a way neither Joseph nor Thatcher intuitively did, the need to cast Thatcherism as an anti-establishment, radical force. Combined with his assiduous wooing of disenchanted former Labour supporters, this helped Thatcher win some high-profile converts, such as the novelist Kingsley Amis, the writer Paul Johnson, and the academic John Vaizey.
The intellectual development of Thatcherism in the 1970s was, of course, the work of many hands. Many, though by no means all, were Jewish, and some came from outside the Tory fold. The political scientist Shirley Robin Letwin and her husband, the economist Bill Letwin, both American-born, began to offer advice and assistance with Thatcher’s speeches. While recoiling from her devotion to “Victorian values,” the economist Samuel Brittan was nonetheless an influential exponent of monetarism; his economic commentary in the Financial Times was the only newspaper column Thatcher never missed reading. Arthur Seldon, a founder of the IEA, was a supporter of the Liberal Party who hankered in vain for its return to its Gladstonian belief in limited government. He ensured that the flame of free-market economics was not completely extinguished in the 1950s, helped introduce the ideas of Milton Friedman to Britain, and willingly assisted in Thatcher’s effort to smash the postwar settlement.
However, it was Joseph and Sherman who were the preeminent warriors in the battle of ideas. Joseph’s 1976 Stockton Lecture, “Monetarism Is Not Enough,” called for a squeeze on the money supply to bring down inflation, substantial cuts in taxes and spending, and “bold incentives and encouragements” to wealth-creators. It encapsulated the governing agenda and underlying philosophy of the Thatcher governments. Thatcher biographer Hugo Young believed that Joseph’s speeches during this time contained “everything that is distinctive about the economic and political philosophy” of Thatcherism. Joseph took “the moral case for capitalism” into the lion’s den of the campuses, delivering 150 speeches in three years on the virtues of the free market. Despite the frequent attempts of hard-left students to disrupt his appearances, Thatcher later concluded that Joseph’s work had been critical in restoring the right’s “intellectual self-confidence.” She said that “all that work with the intellectuals” helped to underpin her government’s later successes.
In the settling of scores that followed her dramatic defenestration in November 1990, Thatcher’s sense of betrayal was evident. Among the few who escaped her harsh words were Joseph and Sherman. In the first volume of her memoirs, which she dedicated to Joseph’s memory, Thatcher wrote simply: “I could not have become Leader of the Opposition, or achieved what I did as Prime Minister, without Keith. But nor, it is fair to say, could Keith have achieved what he did without … Alfred Sherman.”
Joseph and Sherman’s presence underlines the leading role played by Jews in the intellectual regeneration of British conservatism, a prominence akin to—and perhaps even greater than—that played by Jewish neoconservatives in the Reagan revolution.
Review of 'The Strange Death of Europe' by Douglas Murray
In 2003, as a convention of European worthies drafted a constitution for the European Union, Pope John Paul II urged its framers to acknowledge the Continent’s Christian roots. Since Christianity had shaped the “humanism of which Europe feels legitimately proud,” the ailing pontiff argued, the constitution should make some reference to Europe’s Christian patrimony. His appeal was met with accusations of bigotry. The pope had inflamed the post-9/11 atmosphere of “Islamophobia,” one “anti-racism” outfit said. Another group asked: What about the contributions made by the “tolerant Islam of al-Andalus”? Former French President Valéry Giscard d’Estaing spoke for the political class: “Europeans live in a purely secular political system, where religion does not play an important role.”
Douglas Murray recounts this episode early on in his fiery, lucid, and essential polemic. It epitomized the folly of European elites who would sooner discard the Continent’s civilizational heritage than show partiality for their own culture over others’. To Murray, this tendency is quite literally suicidal—hence the “death” in his title.
The book deals mainly with Western Europe’s disastrous experiment in admitting huge numbers of Muslim immigrants without bothering to assimilate them. These immigrants now inhabit parallel communities on the outskirts of most major cities. They reject mainstream values and not infrequently go boom. Murray’s account ranges from the postwar guest-worker programs to the 2015 crisis that brought more than a million people from the Middle East and Africa.
This is dark-night-of-the-soul stuff. The author, a director at London’s Henry Jackson Society (where I was briefly a nonresident fellow), has for more than a decade been among Europe’s more pessimistic voices on immigration. My classically liberal instincts primed me to oppose him at every turn. Time and again, I found myself conceding that, indeed, he has a point. This is in large part because I have been living in and reporting on Europe for nearly four years. Events of the period have vindicated Murray’s bleak vision and confounded his critics.
Murray is right: Time isn’t mellowing out Europe’s Muslims. “The presumption of those who believed in integration is that in time everybody who arrives will become like Europeans,” Murray writes. Yet it is the young who are usually the most fanatical. Second- and third-generation immigrants make up the bulk of the estimated 5,000 European Muslims who have gone off to fight with the Islamic State.
The first large wave of Muslim immigrants to Britain arrived soon after World War II. Seven decades later, an opinion survey conducted in 2016 by the polling firm ICM found that half of Muslim Britons would proscribe homosexuality, a third would legalize polygamy, and a fifth would replace civil law with Shariah. A different survey, also conducted in 2016, found that 83 percent of young French Muslims describe their faith as “important or very important” to them, compared with 22 percent of young Catholics. I could go on with such polling data; Murray does for many pages.
He is also correct that all the various “integration” models have failed. Whether it is consensus-based social democracy in the Nordic countries, multiculturalism in Britain, or republican secularism in France, the same patterns of disintegration and social incohesion persist nearly everywhere. Different European governments have treated this or that security measure, economic policy, or urban-planning scheme as the integration panacea, to no avail.
Murray argues that these successive failures stem from a basic lack of political will. To prove the point he cites, among other things, female genital mutilation in the UK. Laws against the practice have been on the books for three decades. Even so, an estimated 130,000 British women have had their genitals cut, and not a single case has been successfully prosecuted.
Pusillanimity and retreat have been the norm among governments and cultural elites on everything from FGM to free speech to counterterrorism. The result has been that the “people who are most criticized both from within Muslim communities in Europe and among the wider population are in fact the people who fell hardest for the integration promises of liberal Europe.” It was Ayaan Hirsi Ali, the fierce Somali-born proponent of Enlightenment values and women’s equality, who had to escape Holland under a death threat, not her persecutors.
And Murray is right when he says that Europeans hadn’t staged a real debate on immigration until very recently. The author might be too quick to dismiss the salutary fiscal and social effects of economic growth and immigration’s role in promoting it. At various points he even suggests that Europeans should forgo economic and population growth if that is the price of admitting fewer migrants. He praises hermetically sealed Japan, but he elides the Japanese model’s serious economic, demographic, and even psychological disadvantages.
All this is secondary to Murray’s unanswerable argument that European elites had for years cordoned off immigration from normal political debate. As he writes, “whereas the benefits of mass immigration undoubtedly exist and everybody is made very aware of them, the disadvantages of importing huge numbers of people from another culture take a great deal of time to admit to.” In some cases, most notably the child-sex grooming conspiracy in Rotherham, England, the institutions have tried to actively suppress the truth. Writes Murray: “Instead of carrying out their jobs without fear or favor, police, prosecutors, and journalists behaved as though their job was to mediate between the public and the facts.”

Is it possible to imagine an alternative history, one in which Europe would absorb this many migrants from Islamic lands but suffer fewer and less calamitous harms? Murray’s surprising answer is yes. Had Europe retained its existential confidence over the course of the previous two centuries, things might have turned out differently. As it was, however, mass migration saw a “strong religious culture”—Islam—“placed into a weak and relativistic culture.”
In the book’s best chapters, Murray departs from the policy debate to attend to the sources of Europe’s existential insecurity. Germans bear much of the blame, beginning with 19th-century Bible scholarship that applied the methods of history, philology, and literary criticism to sacred scripture. That pulled the rug of theological certainty from under Europe’s feet, in Murray’s account, and then Darwin’s discoveries heightened the disorientation. Europeans next tried to substitute totalistic ideology for religion, with catastrophic results.
Finally, after World War II, they settled on human rights as the central meaning of Europe. But since Europeans could no longer believe, these rights were cut off from one of their main wellsprings: the Judeo-Christian tradition. The Catholic Church—having circumscribed the power of earthly kings across centuries and thereby “injected an anti-totalitarian vaccine into the European bloodstream,” as George Weigel has written in these pages—was scorned or ignored. Europeans forgot how they came to be free.
Somehow Europe must recover its vitality. But how? Murray is torn. On one hand, he sees that a rights-based civilization needs a theological frame, lest it succumb before a virile and energetic civilization like Islam. On the other, he thinks the leap of faith is impossible today. Murray can’t blame François, the professor-protagonist of Michel Houellebecq’s 2015 novel Submission. Faced with an Islamic takeover of France, François heads to a monastery, desperate to shake his spiritual torpor. But kneeling before the Virgin doesn’t do anything for him. Islam, with its simplicity and practicality (not least the offer of up to four nubile wives), is much harder to resist.
Murray wonders whether the answer lies in art. Maybe in beauty Europeans can recover the fulfillment and sense of mystery that their ancestors once found in liturgy—only without the cosmic truth claims. He laments that contemporary European art has “given up that desire to connect us to something like the spirit of religion,” though it is possible that the current period of crisis will engender a revival. In the meanwhile, Murray has suggested, even nonbelievers should go to church as a way to mark and show gratitude for Christianity’s foundational role in Europe.
He is onto something. Figure out the identity bit in the book’s subtitle—“Immigration, Identity, Islam”—and the other two will prove much easier to sort out.
A maestro’s morality
How is it possible that a man who made his conducting debut when Grover Cleveland was president should still be sufficiently well known and revered that most of his recordings remain in print to this day? Toscanini: Musician of Conscience, Harvey Sachs’s new biography, goes a long way toward defining what made Toscanini unique.1 A conductor himself, Sachs is also the author of, among other excellent books, a previous biography of Toscanini that was published in 1978. Since then, several large caches of important primary-source material, most notably some 1,500 of the conductor’s letters, have become available to researchers. Sachs’s new biography draws on this new material and other fresh research. It is vastly longer and more detailed than its predecessor and supersedes it in every way.
Despite its length and thoroughness, Toscanini: Musician of Conscience is not a pedant’s vade mecum. Clearly and attractively written, it ranks alongside Richard Osborne’s 1998 biography of Herbert von Karajan as one of the most readable biographies of a conductor ever published. For Toscanini, as Sachs shows us, had a volatile, immensely strong-willed character, one that in time caused him to clash not only with his colleagues but with the dangerous likes of Adolf Hitler and Benito Mussolini. The same fierce integrity that energized his conducting also led him to put his life at risk at a time when many of his fellow musicians were disinclined to go even slightly out of their way to push back against the Fascist tyrants of the ’30s.

Toscanini: Musician of Conscience does not devote much space to close analysis of Toscanini’s interpretative choices and technical methods. For the most part, Sachs shows us Toscanini’s art through the eyes of others, and the near-unanimity of the admiration of his contemporaries, whose praise is quoted in extenso, is striking, even startling. Richard Strauss, as distinguished a conductor as he was a composer, spoke for virtually everyone in the world of music when he said, “When you see that man conduct, you feel that there is only one thing for you to do: take your baton, break it in pieces, and never conduct again.”
Fortunately for posterity, Toscanini’s unflashy yet wondrously supple baton technique can be seen up close in the 10 concerts he gave with the NBC Symphony between 1948 and 1952 that were telecast live (most of which can now be viewed in part or whole on YouTube). But while his manual gestures, whose effect was heightened by the irresistible force of his piercing gaze, were by all accounts unfailingly communicative, Toscanini’s ability to draw unforgettable performances out of the orchestras that he led had at least as much to do with his natural musical gifts. These included an infallible memory—he always conducted without a score—and an eerily exact ear for wrong notes. Such attributes would have impressed orchestra players, a hard-nosed lot, even if they had not been deployed in the service of a personality so galvanizing that most musicians found it all but impossible not to do Toscanini’s musical bidding.
What he wanted was for the most part wholly straightforward. Toscanini believed that it was his job—his duty, if you will—to perform the classics with note-perfect precision, singing tone, unflagging intensity, and an overall feeling of architectural unity that became his trademark. When an orchestra failed to give of its best, he flew into screaming rages whose verbal violence would likely not be believed were it not for the secret tapes that preserved them. In one of his most spectacular tantrums, which has been posted on YouTube, he can be heard telling the bass players of the NBC Symphony that “you have no ears, no eyes, nothing at all…you have ears in—in your feet!”
Toscanini was able to get away with such behavior because his own gifts were so extraordinary that the vast majority of his players worshipped him. In the words of the English bassoonist Archie Camden, who played under Toscanini in the BBC Symphony from 1935 to 1939, he was “the High Priest of Music,” a man “almost of another world” whose artistic integrity was beyond question. And while his personal integrity was not nearly so unblemished—he was, as Sachs reports with unsalacious candor, a compulsive philanderer whose love letters to his mistresses are explicit to the point of pornography—there is nonetheless a parallel between the passionate conscientiousness of his music-making and his refusal to compromise with Hitler and Mussolini, both of whom were sufficiently knowledgeable about music to understand what a coup it would have been to co-opt the world’s greatest conductor.
Among the most valuable parts of Toscanini: Musician of Conscience are the sections in which Sachs describes Toscanini’s fractious relations with the German and Italian governments. Like many of his fellow countrymen, he had been initially impressed by Mussolini, so much so that he ran for the Italian parliament as a Fascist candidate in 1919. But he soon saw through Mussolini’s modernizing rodomontade to the tyrant within, and by the late ’20s he was known throughout Italy and the world as an unswerving opponent of the Fascist regime. In 1931 he was beaten by a mob of blackshirted thugs, after which he stopped conducting in Italy, explaining that he would not perform there so long as the Fascists were in power. Mussolini thereupon started tapping his telephone line, and seven years later the conductor’s passport was confiscated when he described the Italian government’s treatment of Jews as “medieval stuff” in a phone call. Had public and private pressure not been brought to bear, he might well have been jailed or murdered. Instead he was allowed to emigrate to the U.S. He did not return to Italy until after World War II.
If anything, Toscanini’s hatred for the Nazis was even more potent, above all because he was disgusted by their anti-Semitism. A philo-Semite who referred to the Jews as “this marvelous people persecuted by the modern Nero,” he wrote a letter to one of his mistresses in the immediate wake of the Anschluss that makes for arresting reading eight decades later:
My heart is torn in bits and pieces. When you think about this tragic destruction of the Jewish population of Austria, it makes your blood turn cold. Think of what a prominent part they’d played in Vienna’s life for two centuries! . . . Today, with all the great progress of our civilization, none of the so-called liberal nations is making a move. England, France, and the United States are silent!
Toscanini felt so strongly about the rising tide of anti-Semitism that he agreed in 1936 to conduct the inaugural concerts of the Palestine Symphony (later the Israel Philharmonic) as a gesture of solidarity with the Jews. In an even more consequential gesture, he had already terminated his relationship with the Bayreuth Festival, where he had conducted in 1930 and 1931, the first non-German conductor to do so. While the founder of the festival, Richard Wagner, ranked alongside Beethoven, Brahms, and Verdi at the top of Toscanini’s pantheon of musical gods, he was well aware that many of the members of the Wagner family who ran Bayreuth were close friends of Adolf Hitler, and he decided to stop conducting in Germany—Bayreuth included—when the Nazis came to power. Hitler implored him to return to the festival in a personal letter that praised him as “the great representative of art and of a people friendly to Germany.” Once again, though, there was to be no compromise: Toscanini never performed in Germany again, nor would he forgive those musicians, Wilhelm Furtwängler among them, who continued to do so.

Implicit throughout Sachs’s book is the idea that Toscanini the man and Toscanini the musician were, as his subtitle suggests, inseparable—that, in other words, his conscience drove him to oppose totalitarianism in much the same way that it drove him to pour his heart and soul into his work. He was in every sense of the word a driven man, one capable of writing in an especially revealing letter that “when I’m working I don’t have time to feel joy; on the contrary, I suffer without interruption, and I feel that I’m going through all the pain and suffering of a woman giving birth.”
Toscanini was not striking a theatrical pose when he wrote these melodramatic-sounding words. The rare moments of ecstasy that he experienced on the podium were more than offset by his obsessive struggle to make the mere mortals who sang and played for him realize, as closely as possible, his vision of artistic perfection. That was why he berated them, why he ended his rehearsals drenched with sweat, why he flogged himself as unsparingly as he flogged his musicians. It was, he believed, what he had been born to do, and he was willing to move heaven and earth in order to do it.
To read of such terrifying dedication is awe-inspiring—yet it is also strangely demoralizing. To be sure, there are still artists who drive themselves as relentlessly as did Toscanini, and who pull great art out of themselves with the same iron determination. But his quasi-religious consecration to music inevitably feels alien to the light-minded spirit of our own age, dominated as it is by pop culture. It is hard to believe that NBC, the network of Jimmy Fallon and Superstore, maintained for 17 years a full-time symphony orchestra that had been organized in 1937 for the specific purpose of allowing Toscanini to give concerts under conditions that he found satisfactory. A poll taken by Fortune that year found that 40 percent of Americans could identify Toscanini as a conductor. By 1954, the year in which he gave up conducting the NBC Symphony (which was then disbanded), the number was surely much higher.
Will there ever again be a time when high art in general and classical music in particular mean as much to the American people as they did in Toscanini’s heyday? Very likely not. But at least there will be Harvey Sachs’s fine biography—and, far more important, Toscanini’s matchlessly vivid recordings—to remind us of what we once were, what we have lost, and what Arturo Toscanini himself aspired to be and to do.
1 Liveright, 923 pages. Many of Toscanini’s best commercial American recordings, made with the NBC Symphony, the New York Philharmonic, and the Philadelphia Orchestra, were reissued earlier this year in a budget-priced box set called Arturo Toscanini: The Essential Recordings (RCA Red Seal, 20 CD’s) whose contents were chosen by Sachs and Christopher Dyment, another noted Toscanini scholar. Most of the recordings that he made in the ’30s with the BBC Symphony are on Arturo Toscanini: The HMV Recordings (Warner Classics, six CD’s).
A blockbuster movie gets the spirit right and the details wrong
But enough about Brexit; what about Christopher Nolan’s new movie about Dunkirk?
Dunkirk is undoubtedly a blockbuster with a huge cast—Nolan has splendidly used thousands of extras rather than computer cartooning to depict the vast numbers of Allied troops trapped on the beaches—and a superb score by Hans Zimmer. Kenneth Branagh is a stiff-upper-lipped rear-admiral whose rather clunking lines are all too obviously designed to tell the audience what’s going on; One Direction pop star Harry Styles is a British Tommy, and Tom Hardy is a Spitfire pilot who somehow shoots down two Heinkels while gliding, having run out of fuel about halfway through the movie. Mark Rylance, meanwhile, plays the brave skipper of a small boat taking troops off the beaches in the manner of Walter Pidgeon in Mrs. Miniver.
Yet for all the clichéd characterization, almost total lack of dialogue, complete lack of historical context (not even a cameo role for Winston Churchill), a ludicrous subplot in which a company of British soldiers stuck on a sinking boat do not use their Bren guns to defend themselves, problems with continuity (sunny days turn immediately into misty ones as the movie jumps confusingly through time), and Germans breaking into central Dunkirk whereas in fact they were kept outside the perimeter throughout the evacuation, Dunkirk somehow works well.
It works for the same reason that the 1958 film of the same name directed by Leslie Norman and starring Richard Attenborough and John Mills did. The story of the nine-day evacuation of the British Expeditionary Force from Dunkirk in late May and early June 1940 is a tale of such extraordinary heroism, luck, and intimate proximity to utter disaster that it would carry any film, even a bad one, and Nolan’s is emphatically not a bad one. Although the dogfights take place at ridiculously low altitudes, they are thrilling, and the fact that one doesn’t see a single German soldier until the closing scene, and then only two of them in silhouette, somehow works, too. See the film on the biggest screen you can, which will emphasize the enormity of the challenge faced by the Allies in getting over 336,000 troops off the beaches for the loss of only 40,000 killed, wounded, and captured.
There is a scene when the armada of small boats arrives at the beaches that will bring a lump to the throat of any patriotic Briton; similarly, three swooping Spitfires are given a wonderfully evocative moment. The microcosm of the evacuation that Nolan concentrates on works well, despite another silly subplot in which a British officer with PTSD (played by Cillian Murphy) kills a young boy on Rylance’s small boat. That all the British infantry privates, not just Harry Styles, look like they sing in boy-bands doesn’t affect the power of seeing them crouch en masse under German attack in their greatcoats and helmets on the foam-flecked beaches.
On May 10, 1940, Adolf Hitler invaded France, Belgium, and Holland, unleashing Blitzkrieg on the British and French armies—a new all-arms tactic of warfare that left his enemies reeling. He also sent tanks through the forests of the Ardennes mountains, which were considered impassable, and by May 16, some panzer units had already reached the English Channel. With the British and French in full retreat, on May 24 the Führer halted his tanks’ headlong advance for various sound military reasons—he wanted to give his men some rest, did not want to over-extend the German army, needed to protect against counter-attack, and wanted his infantry to catch up. From May 26 to June 3, the Allies used this pause to throw up a perimeter around the French port of Dunkirk, from whose pleasure beaches more than a quarter of a million British and more than 80,000 French troops embarked to cross the Channel to safety in Britain.
Protected by the Royal Air Force, which lost 144 pilots in the skies over Dunkirk, and by the French air force (which plays no part in this movie), and transported by the Royal Navy (which doesn’t seem to be able to use its guns against the Luftwaffe in this film, but which luckily did in real life), British and French troops made it to Dover, albeit without their heavy equipment, which they had to destroy on the beaches. An allusion is made to that when Tom Hardy destroys the Spitfire he has (I must say quite unbelievably) landed on a beach in order to prevent its falling into German hands.
In response to a call from the British government, more than 700 private vessels were requisitioned, including yachts, paddle steamers, ferries, fishing trawlers, packet steamers, and lifeboats. Even today, when boating down the Thames, it is possible to see small pleasure vessels, some only fifteen feet long, with the plaque “Dunkirk 1940” proudly displayed on their cabins. That 226 of these vessels were sunk by the Luftwaffe, along with six destroyers among the 220 warships that took part, shows what it meant to rise to what was afterwards called “the Dunkirk Spirit.” It was a spirit of defiance of tyranny that one glimpses regularly in this film, even if Nolan does have to pay obeisance to the modern demand for stories of cowardice alongside heroism, and for the supposedly redemptive cowardice-into-heroism arcs that Hollywood did not find necessary when it made Mrs. Miniver in 1942.
Nolan’s Dunkirk implies that it was the small boats that brought back the majority of the troops, whereas in fact the 39 destroyers and one cruiser involved in Operation Dynamo brought back the great majority, while the little ships did the crucial job of ferrying troops from the beaches to the destroyers, six of which were sunk, though none by U-boats (which the film wrongly suggests were present).
Where Nolan’s film commits a libel on the British armed services is in its tin ear for the Anglo-French relations of the time. In the movie, a British beach-master prevents French infantrymen from boarding a naval vessel, saying “This is a British ship. You get your own ships.” The movie later alleges that no Frenchmen were allowed to be evacuated until all the Britons were safely back home. This was not what happened. The French were brought across the Channel in Royal Navy vessels and small boats when their units arrived on the beaches.
There was no discrimination whatsoever, and to suggest there was injects false nationalist tension into what was in truth a model of good inter-Allied cooperation. Only much later, when the Nazi-installed Vichy government in France needed to create an Anglophobic myth of betrayal at Dunkirk, did such lies emerge. It is a shame that Nolan is now propagating them—especially since this might be the only contact that millions of people will ever have with the Dunkirk story for years, perhaps even a generation. At a time when schools simply do not teach the histories of anything so patriotism-inducing as Dunkirk, it was incumbent on Nolan to get this right.
In a touching scene at the end, one of the Tommies is depicted reading from a newspaper Churchill’s famous “We shall fight on the beaches” speech of June 4, 1940, with its admonition: “We must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations.” Churchill made no attempt to minimize the scale of what he called a “colossal military disaster,” but he also spoke, rightly, of the fact that it had been a “miracle of deliverance.” That is all that matters in this story.
So despite my annoyance at how many little details are off here—for example, Tom Hardy firing 75 seconds’ worth of ammunition when he would really have had only 14.7 seconds’ worth, or choppy weather when the Channel was really like a mill pond—I must confess that such problems matter only to military-history pedants like me. What Nolan has gotten right is the superb spirit of the British people in overcoming hatred, resentment, and fury with calmness, courage, and good humor.
Which brings us back to Brexit.
The Swoon has several symptoms: extreme praise, a disinclination to absorb contrary facts, a weakness for adulation, and a willingness to project one’s own beliefs and dispositions onto an ill-suited target, regardless of evidence. The first thing to know about the Swoon, though, is that it is well rooted in reality. John McCain is perhaps the most interesting non-presidential figure in Washington politics since Daniel Patrick Moynihan. Any piece of journalism that aims to assess him objectively should be required to include, as a stipulation, a passage like this one from Robert Timberg’s masterful book about Vietnam, The Nightingale’s Song.
“Do you want to go home?”
“Now, McCain, it will be very bad for you.”
The [chief jailer] gleefully led the charge as the guards, at [another guard’s] command, drove fists and knees and boots into McCain. Amid laughter and muttered oaths, he was slammed from one guard to another, bounced from wall to wall, knocked down, kicked, dragged to his feet, knocked back down, punched again and again in the face. When the beating was over, he lay on the floor, bloody, arms and legs throbbing, ribs cracked, several teeth broken off at the gum line.
“Are you ready to confess your crimes?” asked [the guard].
The ropes came next . . .
This scene is, of course, from McCain’s five years in a North Vietnamese prisoner-of-war camp. It helps to know that before this gruesome episode began—there were many more to come—McCain’s arms had been broken and gone untreated. It helps, too, to know that the point of the torture was to force McCain to leave the prison and return home to his father, the highest-ranking naval officer in the Pacific. In other words, they hung him by his broken arms because he refused to let them let him go.
Every reporter who’s done his homework knows this about McCain, and most civilians who meet him know it, too. This is the predicate for the Swoon. It began to afflict liberal journalists of the Boomer generation during the warm-up to his first run for president, against Governor George W. Bush, in the late 1990s. The reporter would be brought onto McCain’s campaign bus and receive a mock-gruff welcome from the candidate. No nervous handlers would be in evidence, like those who invariably attend other candidates during interviews.
And then it happens: In casual, preliminary conversation, McCain makes an indiscreet comment about a Senate colleague. “Is that off the record?” the reporter asks, and McCain waves his hand: “It’s the truth, isn’t it?” In a minute or two, the candidate, a former fighter pilot, drops the F bomb. Then, on another subject, he makes an offhanded reference to being “in prison.” The reporter, who went through four deferments in the late 1960s smoking weed with half-naked co-eds at an Ivy League school, feels the hot, familiar surge of guilt. As the interview winds down, the reporter sees an unexpected and semi-obscure literary work—the collected short stories of William Maxwell, let’s say—that McCain keeps handy for casual reading.
By the time he’s shown off the bus—after McCain has complimented a forgotten column the reporter wrote two years ago—the man is a goner. If I saw it once in my years writing about McCain, I saw it a dozen times. (I saw it happen to me!) Soon the magazine feature appears, with a headline like “The Warrior,” or “A Question of Honor,” or even “John McCain Walks on Water.” Those are all real headlines from his first presidential campaign. This really got printed, too: “It is a perilous thing, this act of faith in a faithless time—perilous for McCain and perilous for the people who have come to him, who must realize the constant risk that, sometimes, God turns out to be just a thunderstorm, and the gold just stones agleam in the sun.”
Judging from inquiries I’ve made over the years, the only person who knows what that sentence means is the writer of it, an employee of Esquire magazine named Charles Pierce. No liberal journalist got the Swoon worse than Pierce, and no one was left with a bitterer hangover when it emerged that McCain was, in nearly every respect, a conventionally conservative, generally loyal Republican—with complications, of course. The early Swooners had mistaken those complications (support for campaign-finance reform, for example, and his willingness to strike back at evangelical bullies like Jerry Falwell Sr.) as the essence of McCain. When events proved this not to be so, culminating in his dreary turn as the 2008 Republican presidential nominee—when he committed the ultimate crime in liberal eyes, midwifing the national career of Sarah Palin—it was only Republicans who were left to swoon.
So matters rested until this July, when McCain released the news that he suffers from a particularly aggressive form of brain cancer. Many appropriate encomiums rolled in, some from the original Swooners. But another complication arose. Desperate to pass a “motion to proceed” so that a vote could be taken on a lame and toothless “repeal” of Obamacare, Senate Republicans could muster only a tie vote. McCain announced he would rise from his hospital bed and fly to Washington to break the tie and vote for the motion to proceed.
Even conservatives who had long remained resistant to the Swoon succumbed. Even Donald Trump tweet-hailed McCain as a returning hero. His old fans from the left, those with long memories, wrote, or tweeted, more in sorrow than in anger. Over at Esquire, poor Charles Pierce reaffirmed that God had turned out to be just a thunderstorm again. “The ugliest thing to witness on a very ugly day in the United States Senate,” he wrote, “was what John McCain did to what was left of his legacy as a national figure.” A longtime Swooner in the Atlantic: “Senator McCain gave us a clearer idea of who he is and what he stands for.” Answers: a hypocrite, and nothing!
The old fans weren’t mollified by a speech McCain made after his vote, in which he sounded notes they had once thrilled to—he praised bipartisanship and cooperation across the aisle. Several critics in the press dismissed the speech with the same accusation that his conservative enemies had always leveled at McCain whenever he committed some act of moderation. He was pandering…to them! “McCain so dearly wants the press to think better of him for [this] speech,” wrote the ex-fan in the Atlantic. But the former Swooners were having none of it. Swoon me once, shame on me. Swoon me twice . . .
Then the next day in the wee hours, McCain voted against the actual bill to repeal Obamacare. Democrats were elated, and Republicans were forced to halt in mid-Swoon. His reasons for voting as he did were sound enough, but reasons seldom enter in when people are in thrall to their image of McCain. The people who had once loved him so, and who had suffered so cruelly in disappointment, were once more in love. Let’s let Pierce have the last word: “The John McCain the country had been waiting for finally showed up early Friday morning.” He had done what they wanted him to do; why he had done it was immaterial.
The condescension is breathtaking. Sometimes I think McCain is the most misunderstood man in Washington. True enough, he’s hard to pin down. He’s a screen onto which the city’s ideologues and party hacks project their own hopes and forebodings. Now, as he wages another battle in a long and eventful life, what he deserves from us is something simpler—not a swoon but a salute, offered humbly, with much reverence, affection, and gratitude.