I. The Beginning of the Bible and Its Greek Counterparts
All the hopes that we entertain in the midst of the confusion and dangers of the present are founded, positively or negatively, directly or indirectly, on the experiences of the past. Of these experiences, the broadest and deepest—so far as Western man is concerned—are indicated by the names of two cities: Jerusalem and Athens. Western man became what he is, and is what he is, through the coming together of biblical faith and Greek thought. In order to understand ourselves and to illuminate our trackless way into the future, we must understand Jerusalem and Athens. It goes without saying that this is a task whose proper performance goes much beyond my power; but we cannot define our tasks by our powers, for our powers become known to us through the performance of our tasks, and it is better to fail nobly than to succeed basely.
The objects to which we refer when we speak of Jerusalem and Athens are understood today, by the science devoted to such objects, as cultures; “culture” is meant to be a scientific concept. According to this concept there is an indefinitely large number of cultures: n cultures. The scientist who studies them beholds them as objects; as scientist, he stands outside all of them; he has no preference for any of them; he is not only impartial but objective; he is anxious not to distort any of them; in speaking about them he avoids any “culture-bound” concepts—i.e., concepts bound to any particular culture or kind of culture. In many cases the objects studied by the scientist of culture do or did not know that they are or were cultures. This causes no difficulty for him: electrons also do not know that they are electrons; even dogs do not know that they are dogs. By the mere fact that he speaks of his objects as cultures, the scientific student takes it for granted that he understands the people whom he studies better than they understood or understand themselves.
This whole approach has been questioned for some time, but the questioning does not seem to have had any effect on the scientists. The man who started the questioning was Nietzsche. We have said that according to the prevailing view there were or are n cultures. Let us say there were or are 1,001 cultures, thus reminding ourselves of the 1,001 Arabian Nights; the account of the cultures, if it is well done, will be a series of exciting stories, perhaps of tragedies. Accordingly, Nietzsche speaks of our subject in a speech by his Zarathustra that is entitled “Of 1,000 Goals and One.” The Hebrews and the Greeks appear in this speech as two among a number of nations, not superior to the two others that are mentioned or to the 996 that are not. The peculiarity of the Greeks, according to Nietzsche, is the full dedication of the individual to the contest for excellence, distinction, supremacy. The peculiarity of the Hebrews is the utmost honoring of father and mother. Nietzsche’s reverence for the sacred tables of the Hebrews, as well as for those of the other nations in question, is deeper than that of any other beholder. Yet since he too is only a beholder of these tables, since what one table commends or commands is incompatible with what others command, he himself is not subject to the commandments of any. This is true also and especially of the tables, or “values,” of modern Western culture. But according to him, all scientific concepts, and hence in particular the concept of culture, are culture-bound; the concept of culture is an outgrowth of 19th-century Western culture; its application to the “cultures” of other ages and climates is an act stemming from the spiritual imperialism of that particular culture. There is, then, for Nietzsche, a glaring contradiction between the claimed objectivity of the science of cultures and the subjectivity of that science. 
To state the case differently, one cannot behold—i.e., truly understand—any culture unless one is firmly rooted in one’s own culture or unless one belongs, in one’s capacity as a beholder, to some culture. But if the universality of the beholding of all cultures is to be preserved, the culture to which the beholder of all cultures belongs must be the universal culture, the culture of mankind, the world culture; the universality of beholding presupposes, if only by anticipating, the universal culture which is no longer one culture among many. Nietzsche sought therefore for a culture that would no longer be particular and hence in the last analysis arbitrary. The single goal of mankind is conceived by him as in a sense super-human: he speaks of the super-man of the future. The super-man is meant to unite in himself, on the highest level, both Jerusalem and Athens.
However much the science of all cultures may protest its innocence of all preferences or evaluations, it fosters a specific moral posture. Since it requires openness to all cultures, it fosters universal tolerance and the exhilaration which derives from the beholding of diversity; it necessarily affects all cultures that it can still affect by contributing to their transformation in one and the same direction; it willy-nilly brings about a shift of emphasis from the particular to the universal. By asserting, if only implicitly, the rightness of pluralism, it asserts that pluralism is the right way; it asserts the monism of universal tolerance and respect for diversity; for by virtue of being an “-ism,” pluralism is a monism.
One remains somewhat closer to the science of culture as it is commonly practiced if one limits oneself to saying that every attempt to understand the phenomena in question remains dependent upon a conceptual framework that is alien to most of these phenomena and therefore necessarily distorts them. “Objectivity” can be expected only if one attempts to understand the various cultures or peoples exactly as they understand or understood themselves. Men of ages and climates other than our own did not understand themselves in terms of cultures because they were not concerned with culture in the present-day meaning of the term. What we now call culture is the accidental result of concerns that were not concerns with culture but with other things—above all with the Truth.
Yet our intention to speak of Jerusalem and Athens seems to compel us to go beyond the self-understanding of either. Or is there a notion, a word that points to the highest that both the Bible and the greatest works of the Greeks claim to convey? There is such a word: wisdom. Not only the Greek philosophers but the Greek poets as well were considered to be wise men, and the Torah is said, in the Torah, to be “your wisdom in the eyes of the nations.” We, then, must try to understand the difference between biblical wisdom and Greek wisdom. We see at once that each of the two claims to be the true wisdom, thus denying to the other its claim to be wisdom in the strict and highest sense. According to the Bible, the beginning of wisdom is fear of the Lord; according to the Greek philosophers, the beginning of wisdom is wonder. We are thus compelled from the very beginning to make a choice, to take a stand. Where then do we stand? Confronted by the incompatible claims of Jerusalem and Athens, we are open to both and willing to listen to each. We ourselves are not wise but we wish to become wise. We are seekers for wisdom, “philo-sophoi.” Yet since we say that we wish to hear first and then to act or to decide, we have already decided in favor of Athens against Jerusalem.
This, indeed, seems to be the necessary position for all of us who cannot be Orthodox and therefore must accept the principle of the historical-critical study of the Bible. The Bible was traditionally understood to be the true and authentic account of the deeds of God and men from the beginning till the restoration after the Babylonian exile. The deeds of God include His legislation as well as His inspirations to the prophets, and the deeds of men include their praises of God and their prayers to Him as well as their God-inspired admonitions. Biblical criticism starts from the observation that the biblical account is in important respects not authentic but derivative or consists not of “histories” but of “memories of ancient histories,” to borrow a Machiavellian expression. Biblical criticism reached its first climax in Spinoza’s Theological-Political Treatise, which is frankly anti-theological; Spinoza read the Bible as he read the Talmud and the Koran. The result of his criticism can be summarized as follows: the Bible consists to a considerable extent of self-contradictory assertions, of remnants of ancient prejudices or superstitions, and of the outpourings of an uncontrolled imagination; in addition, it is poorly compiled and poorly preserved. He arrived at this conclusion by presupposing the impossibility of miracles. The considerable differences between 19th- and 20th-century biblical criticism and that of Spinoza can be traced to their difference in regard to the evaluation of imagination: whereas for Spinoza imagination is simply sub-rational, it was assigned a much higher rank in later times when it was understood as the vehicle of religious or spiritual experience, which necessarily expresses itself in symbols and the like. The historical-critical study of the Bible is the attempt to understand the various layers of the Bible as they were understood by their immediate addressees, i.e., the contemporaries of its authors. 
Of course, the Bible speaks of many things—for instance, the creation of the world—that for the biblical authors themselves belong to the remote past. But there is undoubtedly much history in the Bible—accounts of events written by contemporaries or near-contemporaries. One is thus led to say that the Bible contains both “myth” and “history.” Yet this distinction is alien to the Bible; it is a special form of the Greek distinction between mythos and logos. From the point of view of the Bible, the “myths” are as true as the “histories”: what Israel “in fact” did or suffered cannot be understood except in the light of the “facts” of Creation and Election. What is now called “historical” are those deeds and speeches that are equally accessible to the believer and to the unbeliever. But from the point of view of the Bible, the unbeliever is the fool who has said in his heart “there is no God”; the Bible narrates everything as it is credible to the wise in the biblical sense of wisdom. Let us never forget that there is no biblical word for doubt. The biblical signs and wonders convince men who have little faith or who believe in other gods; they are not addressed to “the fools who say in their hearts ‘there is no God.’”
It is true that we cannot ascribe to the Bible the theological concept of miracles, for that concept presupposes the concept of nature, and the concept of nature is foreign to the Bible. One is, however, tempted to ascribe to the Bible what one may call the poetic concept of miracles as illustrated by Psalm 114:
When Israel went out of Egypt, the house of Jacob from a people of strange tongue, Judah became his sanctuary and Israel his dominion. The sea saw and fled; the Jordan turned back. The mountains skipped like rams, the hills like lambs. What ails thee, sea, that thou fleest, thou Jordan that thou turnest back? Ye mountains that ye skip like rams, ye hills like lambs? From the presence of the Lord tremble thou earth, from the presence of the God of Jacob who turns the rock into a pond of water, the flint into a fountain of waters.
The presence of God calls forth from His creatures a conduct that differs strikingly from their ordinary conduct: it enlivens the lifeless; it makes fluid the fixed. It is not easy to say whether the author of the psalm did not mean his utterance to be simply or literally true. It is easy to say that the concept of poetry—as distinguished from that of song—is foreign to the Bible. It is perhaps more simple to say that owing to the victory of science over natural theology the impossibility of miracles can no longer be said to be established but has degenerated to the status of an undemonstrable hypothesis. One may trace to the hypothetical character of this fundamental premise the hypothetical character of many, not to say all, results of biblical criticism. Certain it is that biblical criticism in all its forms makes use of terms having no biblical equivalents and is to this extent unhistorical.
How then must we proceed? We shall not take issue with the findings or even the premises of biblical criticism. Let us grant that the Bible and in particular the Torah consists to a considerable extent of “memories of ancient histories,” even of memories of memories. But memories of memories are not necessarily distorted or pale reflections of the original; they may be recollections of recollections, deepenings through meditation of the primary experience. We shall therefore take the latest and uppermost layer as seriously as the earlier ones. We shall start from the uppermost layer—from what is first for us, even though it may not be simply the first. We shall start, that is, where both the traditional and the historical study of the Bible necessarily start. In thus proceeding we avoid the compulsion to make an advance decision in favor of Athens against Jerusalem. For the Bible does not require us to believe in the miraculous character of events that the Bible does not present as miraculous. God’s speaking to men may be described as miraculous, but the Bible does not claim that the putting-together of those speeches was done miraculously. We begin at the beginning, at the beginning of the beginning. The beginning of the beginning happens to deal with the beginning: the creation of heaven and earth. The Bible begins reasonably.
“In the beginning God created heaven and earth.” Who says this? We are not told; hence we do not know. We have no right to assume that God said it, for the Bible introduces God’s sayings by expressions like “God said.” We shall then assume that the words were spoken by a nameless man. Yet no man can have been an eyewitness of God’s creating heaven and earth; the only eyewitness was God. Since “There did not arise in Israel a prophet like Moses whom the Lord knew face to face,” it is understandable that tradition ascribed to Moses the sentence quoted and its whole sequel. But what is understandable or plausible is not as such certain. The narrator does not claim to have heard the account from God; perhaps he heard it from some man or men; perhaps he is retelling a tale. The Bible continues: “And the earth was unformed and void. . . .” It is not clear whether the earth thus described was created by God or antedated His creation. But it is quite clear that while speaking about how the earth looked at first, the Bible is silent about how heaven looked at first. The earth, i.e., that which is not heaven, seems to be more important than heaven. This impression is confirmed by the sequel.
God created everything in six days. On the first day He created light; on the second, heaven; on the third, the earth, the seas, and vegetation; on the fourth, the sun, the moon, and the stars; on the fifth, the water animals and the birds; and on the sixth, the land animals and man. The most striking difficulties are these: light and hence day (and night) are presented as preceding the sun, and vegetation is presented as preceding the sun. The first difficulty is disposed of by the observation that creation-days are not sun-days. One must add at once, however, that there is a connection between the two kinds of days, for there is a connection, a correspondence between light and sun. The account of creation manifestly consists of two parts, the first part dealing with the first three creation-days and the second part dealing with the last three. The first part begins with the creation of light and the second with the creation of the heavenly light-givers. Correspondingly, the first part ends with the creation of vegetation and the second with the creation of man. All creatures dealt with in the first part lack local motion; all creatures dealt with in the second part possess local motion.1 Vegetation precedes the sun because vegetation lacks local motion and the sun possesses it. Vegetation belongs to the earth; it is rooted in the earth; it is the fixed covering of the fixed earth.2 Vegetation was brought forth by the earth at God’s command; the Bible does not speak of God’s “making” vegetation; but as regards the living beings in question, God commanded the earth to bring them forth and yet God “made” them. Vegetation was created at the end of the first half of the creation-days; at the end of the last half, the living beings that spend their whole lives on the firm earth were created.
The living beings—beings that possess life in addition to local motion—were created on the fifth and sixth days, on the days following the day on which the heavenly light-givers were created. The Bible presents the creatures in an ascending order. Heaven is lower than earth. The heavenly light-givers lack life; they are lower than the lowliest living beast; they serve the living creatures, which are to be found only beneath heaven; they have been created in order to rule over day and night: they have not been made in order to rule over the earth, let alone over man.
The most striking characteristic of the biblical account of creation is its demoting or degrading of heaven and the heavenly lights. Sun, moon, and stars precede the living things because they are lifeless: they are not gods. What the heavenly lights lose, man gains; man is the peak of creation. The creatures of the first three days cannot change their places; the heavenly bodies change their places but not their courses; the living beings change their courses but not their “ways”; men alone can change their “ways.” Man is the only being created in God’s image. Only in the case of man’s creation does the biblical account of creation repeatedly speak of God’s “creating” him; in the case of the creation of heaven and the heavenly bodies, that account speaks of God’s “making” them. Similarly, only in the case of man’s creation does the Bible intimate that there is a multiplicity in God: “Let us make man in our image, after our likeness. . . . So God created man in His image, in the image of God He created him; male and female He created them.” Bisexuality is not a preserve of man, but only man’s bisexuality could give rise to the view that there are gods and goddesses: there is no biblical word for “goddess.” Hence creation is not begetting. The biblical account of creation teaches silently what the Bible teaches elsewhere explicitly: there is only one God, the God whose name is written as the Tetragrammaton, the living God Who lives from ever to ever, Who alone has created heaven and earth and all their hosts; He has not created any gods and hence there are no gods besides Him. The many gods whom men worship are either nothings that owe such being as they possess to man’s making them, or if they are something (like sun, moon, and stars), they surely are not gods.3 All non-polemical references to “other gods” occurring in the Bible are fossils whose preservation indeed poses a question but only a rather unimportant one. 
Not only did the biblical God not create any gods; on the basis of the biblical account of creation, one could doubt whether He created any beings one would be compelled to call “mythical”: heaven and earth and all their hosts are always accessible to man as man. One would have to start from this fact in order to understand why the Bible contains so many sections that, on the basis of the distinction between mythical (or legendary) and historical, would have to be described as historical.
According to the Bible, creation was completed by, and culminated in, the creation of man. Only after the creation of man did God “see all that He had made, and behold, it was very good.” What then is the origin of the evil or the bad? The biblical answer seems to be that since everything of divine origin is good, evil is of human origin. Yet if God’s creation as a whole is very good, it does not follow that all its parts are good or that creation as a whole contains no evil whatsoever: God did not find all parts of His creation to be good. Perhaps creation as a whole cannot be “very good” if it does not contain some evils. There cannot be light if there is not darkness, and the darkness is as much created as is the light: God creates evil as well as He makes peace (Isaiah 45:7). However this may be, the evils whose origin the Bible lays bare, after it has spoken of creation, are a particular kind of evils: the evils that beset man. Those evils are not due to creation or implicit in it, as the Bible shows by setting forth man’s original condition. In order to set forth that condition, the Bible must retell man’s creation by making man’s creation as much as possible the sole theme. This second account answers the question, not of how heaven and earth and all their hosts have come into being but of how human life as we know it—beset with evils with which it was not beset originally—has come into being. This second account may only supplement the first account but it may also correct it and thus contradict it. After all, the Bible never teaches that one can speak about creation without contradicting oneself. In post-biblical parlance, the mysteries of the Torah (sithre torah) are the contradictions of the Torah; the mysteries of God are the contradictions regarding God.
The first account of creation ended with man; the second account begins with man. According to the first account, God created man and only man in His image; according to the second account, God formed man from the dust of the earth and He blew into his nostrils the breath of life. The second account makes clear that man consists of two profoundly different ingredients, a high one and a low one. According to the first account, it would seem that man and woman were created simultaneously; according to the second account, man was created first. The life of man as we know it, the life of most men, is that of tillers of the soil; their life is needy and harsh. If human life had been needy and harsh from the very beginning, man would have been compelled or at least almost irresistibly tempted to be harsh, uncharitable, unjust; he would not have been fully responsible for his lack of charity or justice. But man is to be fully responsible. Hence the harshness of human life must be due to man’s fault. His original condition must have been one of ease: he was not in need of rain nor of hard work; he was put by God into a well-watered garden that was rich in trees that were good for food. Yet while man was created for a life of ease, he was not created for a life of luxury: there was no gold or precious stones in the garden of Eden. Man was created for a simple life. Accordingly, God permitted him to eat of every tree of the garden except the tree of knowledge of good and evil, “for in the day that you eat of it, you shall surely die.” Man was not denied knowledge; without knowledge he could not have known the tree of knowledge, nor the woman, nor the brutes; nor could he have understood the prohibition. Man was denied knowledge of good and evil, i.e., the knowledge sufficient for guiding himself, his life. Though not being a child, he was to live in childlike simplicity and obedience to God. 
We are free to surmise that there is a connection between the demotion of heaven in the first account and the prohibition against eating of the tree of knowledge in the second. While man was forbidden to eat of the tree of knowledge, he was not forbidden to eat of the tree of life.
Man, lacking knowledge of good and evil, was content with his condition and in particular with his loneliness. But God, possessing knowledge of good and evil, found that “it is not good for man to be alone, so I will make him a helper as his counterpart.” So God formed the brutes and brought them to man, but they proved not to be the desired helpers. Thereupon God formed the woman out of a rib of the man. The man welcomed her as bone of his bones and flesh of his flesh but, lacking knowledge of good and evil, he did not call her good. The narrator adds that “therefore [namely because the woman is bone of man’s bone and flesh of his flesh] a man leaves his father and his mother, and cleaves to his wife, and they become one flesh.” Both were naked but, lacking knowledge of good and evil, they were not ashamed.
Thus the stage was set for the fall of our first parents. The first move came from the serpent, the most cunning of all the beasts of the field; it seduced the woman into disobedience and then the woman seduced the man. The seduction moves from the lowest to the highest. The Bible does not tell what induced the serpent to seduce the woman into disobeying the divine prohibition. It is reasonable to assume that the serpent acted as it did because it was cunning, i.e., possessed a low kind of wisdom, a congenital malice; everything that God had created would not be very good if it did not include something congenitally bent on mischief. The serpent begins its seduction by suggesting that God might have forbidden man and woman to eat of any tree in the garden, i.e., that God’s prohibition might be malicious or impossible to comply with. The woman corrects the serpent and in so doing makes the prohibition more stringent than it was: “We may eat of the fruit of the other trees of the garden; it is only about the tree in the middle of the garden that God said: you shall not eat of it or touch it, lest you die.”
Now, God did not forbid the man to touch the fruit of the tree of knowledge of good and evil. Besides, the woman does not explicitly speak of the tree of knowledge; she may have had in mind the tree of life. Moreover, God had issued the prohibition only to the man, whereas the woman claims that God had spoken to her as well; she surely knew the divine prohibition only through human tradition. The serpent assures her that they will not die, “for God knows that when you eat of it, your eyes will be opened and you will be like God, knowing good and evil.” The serpent tacitly questions God’s veracity. At the same time it glosses over the fact that eating of the tree involves disobedience to God. In this it is followed by the woman. According to the serpent’s assertion, knowledge of good and evil makes man immune to death (although we cannot know whether the serpent believes this). But the woman, having forgotten the divine prohibition, having therefore in a manner tasted of the tree of knowledge, is no longer wholly unaware of good and evil: she “saw that the tree was good for eating and a delight to the eyes and that the tree was to be desired to make one wise”; therefore she took of its fruit and ate. She thus made the fall of the man almost inevitable, for he was cleaving to her: she gave some of the fruit of the tree to the man, and he ate. The man drifts into disobedience by following the woman. After they had eaten of the tree, their eyes were opened and they knew that they were naked, and they sewed fig leaves together and made themselves aprons: through the fall they became ashamed of their nakedness; eating of the tree of knowledge of good and evil made them realize that nakedness is evil.
The Bible says nothing to the effect that our first parents fell because they were prompted by the desire to be like God; they did not rebel highhandedly against God; rather, they forgot to obey God; they drifted into disobedience. Nevertheless, God punished them severely. But the punishment did not do away with the fact that, as God Himself said, as a consequence of his disobedience “man has become like one of us, knowing good and evil.” There was now the danger that man might eat of the tree of life and live forever. Therefore God expelled him from the garden and made it impossible for him to return to it. One may wonder why man, while he was still in the garden of Eden, had not eaten of the tree of life of which he had not been forbidden to eat. Perhaps he did not think of it because, lacking knowledge of good and evil, he did not fear to die and, besides, the divine prohibition drew his attention away from the tree of life to the tree of knowledge.
The Bible intends to teach that man was meant to live in simplicity, without knowledge of good and evil. But the narrator seems to be aware of the fact that a being which can be forbidden to strive for knowledge of good and evil, i.e., that can understand to some degree that knowledge of good and evil is evil for it, necessarily possesses such knowledge. Human suffering from evil presupposes human knowledge of good and evil and vice versa. Man wishes to live without evil. The Bible tells us that he was given the opportunity to live without evil and that he cannot blame God for the evils from which he suffers. By giving man that opportunity, God convinces him that his deepest wish cannot be fulfilled. The story of the fall is the first part of the story of God’s education of man.
Man has to live with knowledge of good and evil and with the sufferings inflicted on him because of that knowledge or its acquisition. Human goodness or badness presupposes that knowledge and its concomitants. The Bible gives us the first inkling of human goodness and badness in the story of the first brothers. The older brother, Cain, was a tiller of the soil; the younger brother, Abel, a keeper of sheep. God preferred the offering of the keeper of sheep, who brought the choicest of the firstlings of his flock, to that of the tiller of the soil. There were many reasons for this preference but one of them seems to be that the pastoral life is closer to original simplicity than the life of the tillers of the soil. Cain, however, was vexed, and despite his having been warned by God against sinning in general, killed his brother. After a futile attempt to deny his guilt—an attempt that increased that guilt (“Am I my brother’s keeper?”)—he was cursed by God as the serpent and the soil had been after the Fall, in contradistinction to Adam and Eve who were not cursed. He was punished by God, but not with death: anyone slaying Cain would be punished much more severely than Cain himself. The relatively mild punishment of Cain cannot be explained by the fact that murder had not been expressly forbidden: Cain possessed some knowledge of good and evil, and he knew that Abel was his brother, even assuming that he did not know that man was created in the image of God. It is better to explain Cain’s punishment by assuming that punishments were milder in the beginning than later on. Cain—like his fellow fratricide, Romulus—founded a city, and some of his descendants were the ancestors of men practicing various arts: the city and the arts, so alien to man’s original simplicity, owe their origin to Cain and his race rather than to Seth, the substitute for Abel, and his race.
It goes without saying that this is not the last word of the Bible on the city and the arts but it is its first word, just as the prohibition against eating of the tree of knowledge is, one may say, its first word simply, and the revelation of the Torah—i.e., the highest kind of knowledge of good and evil that is vouchsafed to men—is its last word. The account of the race of Cain culminates in the song of Lamech who boasted to his wives of his slaying of men, of his being superior to God as an avenger. The (antediluvian) race of Seth cannot boast of a single inventor; its only distinguished members were Enoch, who walked with God, and Noah, who was a righteous man and walked with God: civilization and piety are two very different things.
By the time of Noah the wickedness of man had become so great that God repented of His creation of man and all other earthly creatures, Noah alone excepted; so He brought on the flood. Generally speaking, prior to the flood, man’s lifespan was much longer than after it. Man’s antediluvian longevity was a relic of his original condition. Man originally lived in the garden of Eden where he could have eaten of the tree of life and thus become immortal. The longevity of antediluvian man reflects this lost chance. To this extent the transition from antediluvian to postdiluvian man is a decline. This impression is confirmed by the fact that before the flood rather than after it the sons of God consorted with the daughters of man and thus generated the mighty men of old, the men of renown. On the other hand, the fall of our first parents made possible or necessary in due time God’s revelation of His Torah, and this was decisively prepared, as we shall see, by the flood. In this respect, the transition from antediluvian to postdiluvian mankind is a progress. The ambiguity regarding the Fall—the fact that it was a sin and hence avoidable and that it was at the same time inevitable—is reflected in the ambiguity regarding the status of antediluvian mankind.
The link between antediluvian mankind and the revelation of the Torah is supplied by the first covenant between God and men, the covenant following the flood. The flood was the proper punishment for the extreme and well-nigh universal wickedness of antediluvian men. Prior to the flood, mankind lived, so to speak, without restraint, without law. While our first parents were still in the garden of Eden, they were not forbidden anything except to eat of the tree of knowledge. The vegetarianism of antediluvian men was not due to an explicit prohibition (Gen. 1:29); rather, their abstention from meat belongs together with their abstention from wine (cf. 9:20); both were relics of man’s original simplicity. After the expulsion from the garden of Eden, God did not punish men, apart from the relatively mild punishment which He inflicted on Cain. Nor did He establish human judges. God experimented, as it were, for the instruction of mankind, with the possibility of mankind’s living free of the law. The experiment, just like the experiment of having men remain like innocent children, ended in failure. Fallen or awake man needs restraint, must live under law. But this law must not be simply imposed. It must form part of a covenant in which God and man are equally, though not equal, partners. Such a partnership was established only after the flood; it did not exist in antediluvian times either before or after the fall.
The inequality regarding the covenant is shown especially by the fact that when God undertook never again to destroy almost all life on earth as long as the earth lasts, He did not do so on the condition that all or almost all men obey the laws promulgated by God after the flood: God makes His promise despite, or because of, His knowing that the devisings of man’s heart are evil from his youth. Noah is the ancestor of all later men just as Adam was; the purgation of the earth through the flood is to some extent a restoration of mankind to its original state; it is a kind of second creation. Within the limits indicated, the condition of postdiluvian men is superior to that of antediluvian men. One point requires special emphasis: in the legislation following the flood, murder is expressly forbidden and made punishable by death on the ground that man was created in the image of God (9:6). The first covenant brought an increase in hope and at the same time an increase in punishment. Not until after the flood was man’s rule over the beasts, ordained or established from the beginning, to be accompanied by the beasts’ fear and dread of man (cf. 9:2 with 1:26-30 and 2:15).
The covenant following the flood prepares the covenant with Abraham. The Bible singles out three events that took place between the covenant after the flood and God’s calling of Abraham: Noah’s curse of Canaan, a son of Ham; the achievement of excellence by Nimrod, a grandson of Ham; and men’s attempt to prevent their dispersal over the earth by building a city which had a tower that reached to the heavens. Canaan, whose land came to be the promised land, was cursed because Ham saw the nakedness of his father, Noah—because Ham transgressed a most sacred, if unpromulgated, law; the curse of Canaan was accompanied by the blessing of Shem and Japheth who turned their eyes away from the nakedness of their father. Here we have the first and the most fundamental division of mankind, at any rate of postdiluvian mankind, the division into “cursed” and “blessed.” Nimrod was the first to be a mighty man on earth—a mighty hunter before the Lord; his kingdom included Babel (big kingdoms are attempts to overcome by force the division of mankind, conquest and hunting being akin to each other). The city that men built in order to remain together and thus to make a name for themselves was Babel; God scattered them by confounding their speech, by bringing about the division of mankind into groups that could not understand one another: into nations, i.e., groups united not only by descent but also by language. The division of mankind into nations may be described as a milder alternative to the flood.
The three events that took place between God’s covenant with mankind after the flood and His calling of Abraham point to God’s way of dealing with men who know good and evil and devise evil from their youth. Well-nigh universal wickedness will no longer be punished with well-nigh universal destruction, but will be prevented through the division of mankind into nations. Mankind will be divided, not into the cursed and the blessed (the curses and blessings were Noah’s, not God’s), but into a chosen nation and into nations that are not chosen. The emergence of nations made it possible to replace Noah’s Ark—which floated alone on the waters covering the entire earth—by a whole, numerous nation living in the midst of the nations covering the earth. The election of the holy nation begins with the election of Abraham. Noah was distinguished from his contemporaries by his righteousness; Abraham separates himself from his contemporaries and in particular from his country and kindred at God’s command—a command accompanied by God’s promise to make of him a great nation. The Bible does not say that this primary election of Abraham was preceded by the fact of Abraham’s righteousness. However this may be, Abraham shows his righteousness by obeying God’s command at once, by trusting in God’s promise whose fulfillment he could not possibly live to see, given the short lifespan of postdiluvian man: only after Abraham’s offspring would have become a great nation would the land of Canaan be given to them forever.
The fulfillment of the promise required that Abraham not remain childless, and he was already quite old. Accordingly, God promised him that he would have issue. It was Abraham’s trust in God’s promise that, above everything else, made him righteous in the eyes of the Lord. It was God’s intention that His promise be fulfilled through the offspring of Abraham and his wife Sarah. But this promise seemed laughable to Abraham, to say nothing of Sarah: Abraham was one hundred years old and Sarah, ninety. Yet nothing is too wondrous for the Lord. The laughable announcement became a joyous one. It was followed immediately by God’s announcement to Abraham of His concern with the wickedness of the people of Sodom and Gomorrah. God did not yet know whether those people were as wicked as they were said to be. But they might be; they might deserve total destruction as much as did the generation of the flood. Noah had accepted the destruction of his generation without any questioning. Abraham, however, who had a deeper trust in God, in God’s righteousness, and a deeper awareness of his being only dust and ashes, presumed in fear and trembling to appeal to God’s righteousness lest He, the judge of the whole earth, destroy the righteous along with the wicked. In response to Abraham’s insistent pleading, God, as it were, promised to Abraham that He would not destroy Sodom if ten righteous men could be found in the city: He would save the city for the sake of the ten righteous men within it. Abraham acted as the mortal partner in God’s righteousness; he acted as if he had some share in the responsibility for God’s acting righteously. No wonder God’s covenant with Abraham was incomparably more incisive than His covenant immediately following the flood.
Abraham’s trust in God thus appears to be the trust that God in His righteousness will not do anything incompatible with His righteousness and that while, or because, nothing is too wondrous for the Lord, there are firm boundaries set to Him by His own righteousness, by Himself. This awareness is deepened and therewith modified by the last and severest test of Abraham’s trust: God’s command to him to sacrifice Isaac, his only son by Sarah. Abraham’s supreme test presupposes the wondrous character of Isaac’s birth: the very son who was to be the sole link between Abraham and the chosen people and who was born against all reasonable expectations, was to be sacrificed by his father. This command contradicted not only the divine promise, but also the divine prohibition against the shedding of innocent blood. Yet Abraham did not argue with God as he had done in the case of Sodom’s destruction. In the case of Sodom, Abraham was not confronted with a divine command to do a certain thing and more particularly he was not confronted with a command to surrender to God what was dearest to him: Abraham did not argue with God for the preservation of Isaac because he loved God—not himself or his most cherished hope—with all his heart, with all his soul, and with all his might. The same concern with God’s righteousness that had induced him to plead with God for the preservation of Sodom if ten just men could be found in that city, induced him not to plead for the preservation of Isaac, for God rightfully demands that He alone be loved unqualifiedly. The fact that the command to sacrifice Isaac contradicted the prohibition against the shedding of innocent blood must be understood in the light of the difference between human justice and divine justice: God alone is unqualifiedly, if unfathomably, just.
God promised Abraham that He would spare Sodom if ten righteous men could be found in it, and Abraham was satisfied with this promise; He did not promise that He would spare the city if nine righteous men were found in it; would those nine be destroyed together with the wicked? And even if all Sodomites were wicked and hence justly destroyed, did their infants who were destroyed with them deserve their destruction? The apparent contradiction between the command to sacrifice Isaac and the divine promise to the descendants of Isaac is disposed of by the consideration that nothing is too wondrous for the Lord. Abraham’s supreme trust in God, his simple, singleminded, childlike faith was rewarded although, or because, it presupposed his entire unconcern with any reward, for Abraham was willing to forgo, to destroy, to kill the only reward with which he was concerned: God prevented the sacrifice of Isaac. Abraham’s intended action needed a reward although he was not concerned with a reward because his intended action cannot be said to have been intrinsically rewarding. The preservation of Isaac is as wondrous as his birth. These two wonders illustrate more clearly than anything else the origin of the holy nation.
The God Who created heaven and earth, Who is the only God, Whose only image is man, Who forbade man to eat of the tree of knowledge of good and evil, Who made a covenant with mankind after the flood and thereafter a covenant with Abraham which became His covenant with Abraham, Isaac, and Jacob—what kind of God is He? Or, to speak more reverently and more adequately, what is His name? This question was addressed to God Himself by Moses when he was sent by Him to the sons of Israel. God replied: “Ehyeh-Asher-Ehyeh,” which is most often translated: “I am That (Who) I am.” I believe, however, that we ought to render this statement, “I shall be What I shall be,” thus preserving the connection between God’s name and the fact that He makes covenants with men, i.e., that He reveals Himself to men above all by His commandments and by His promises and His fulfillment of those promises. “I shall be What I shall be” is, as it were, explained in the verse (Ex. 33:19), “I shall be gracious to whom I shall be gracious and I shall show mercy to whom I shall show mercy.” God’s actions cannot be predicted, unless He Himself has predicted them, i.e., promised them. But as is shown precisely by the account of Abraham’s binding of Isaac, the way in which He fulfills His promises cannot be known in advance. The biblical God is a mysterious God: He comes in a thick cloud (Ex. 19:9); He cannot be seen; His presence can be sensed but not always and everywhere; what is known of Him is only what He chose to communicate by His word through His chosen servants. The rest of the chosen people knows His word—apart from the Ten Commandments (Deut. 4:12 and 5:4-5)—only mediately and does not wish to know it immediately (Ex. 20:19 and 21, 24:1-2; Deut. 18:15-18; Amos 3:7). For almost all purposes the word of God as revealed to His prophets and especially to Moses became the source of knowledge of good and evil, the true tree of knowledge which is at the same time the tree of life.
Having said this much about the beginning of the Bible and what it entails, let us now cast a glance at some Greek counterparts to the beginning of the Bible—to begin with, at Hesiod’s Theogony as well as the remains of Parmenides’s and Empedocles’s works. They are all the works of known authors. This does not mean that they are, or present themselves as being, merely human. Hesiod sings what the Muses, the daughters of Zeus who is the father of gods and men, taught him or commanded him to sing. One could say that the Muses vouch for the truth of Hesiod’s song, were it not for the fact that they sometimes speak lies which resemble what is true. Parmenides transmits the teaching of a goddess, and so does Empedocles. Yet these men composed their books; their songs or speeches are books. The Bible, on the other hand, is not a book. The most one could say is that it is a collection of books. The author of a book, in the strict sense of the term, excludes everything that is not necessary, that does not fulfill a function necessary to the purpose his book is meant to fulfill. The compilers of the Bible as a whole and of the Torah in particular seem to have followed an entirely different rule. Confronted with a variety of preexisting holy speeches, which as such had to be treated with the utmost respect, they excluded only what could not by any stretch of the imagination be rendered compatible with the fundamental and authoritative teaching; their very piety, aroused and fostered by the pre-existing holy speeches, led them to make such changes in those holy speeches as they did make. Their work may then abound in contradictions and repetitions that no one ever intended as such, whereas in a book in the strict sense there is nothing that is not intended by the author.
Hesiod’s Theogony sings of the generation or begetting of the gods; the gods were not “made” by anybody. Far from having been created by a god, earth and heaven are the ancestors of the immortal gods. More precisely, according to Hesiod everything that is has come to be. First there arose Chaos, Gaia (Earth), and Eros. Gaia gave birth first to Ouranos (Heaven) and then, mating with Ouranos, she brought forth Kronos and his brothers and sisters. Ouranos hated his children and did not wish them to come to life. At the wish and advice of Gaia, Kronos deprived his father of his generative power and thus unintentionally brought about the emergence of Aphrodite; Kronos became the king of the gods. Kronos’s evil deed was avenged by his son Zeus whom he had generated by mating with Rheia and whom he had planned to destroy; Zeus dethroned his father and thus became the king of the gods, the father of gods and men, the mightiest of all gods. Given his ancestors it is not surprising that while he is the father of men and belongs to the gods who are the givers of good things, he is far from being kind to men. Mating with Mnemosyne, the daughter of Gaia and Ouranos, Zeus generated the nine Muses. The Muses give sweet and gentle eloquence and understanding to the kings whom they wish to honor. Through the Muses there are singers on earth, just as through Zeus there are kings. While kingship and song may go together, there is a profound difference between the two—a difference that, guided by Hesiod, one may compare to that between the hawk and the nightingale. Surely Metis (Wisdom), while being Zeus’s first spouse and having become inseparable from him, is not identical with him; the relation of Zeus and Metis may remind one of the relation of God and wisdom in the Bible.
Hesiod speaks of the creation or making of men not in the Theogony but in his Works and Days, i.e., in the context of his speeches regarding how man should live, regarding man’s right life, which includes the teaching regarding the right seasons (the “days”); the question of the right life does not arise regarding the gods. The right life for man is the just life, the life devoted to working, especially to tilling the soil. Work thus understood is a blessing ordained by Zeus who blesses the just and crushes the proud: often even a whole city is destroyed for the deeds of a single bad man. Yet Zeus takes cognizance of men’s justice and injustice only if he so wills. Accordingly, work appears to be not a blessing but a curse: men must work because the gods keep hidden from them the means of life, and they do this in order to punish them for Prometheus’s theft of fire—a theft inspired by philanthropy. But was not Prometheus’s action itself prompted by the fact that men were not properly provided for by the gods and in particular by Zeus? Be this as it may, Zeus did not deprive men of the fire that Prometheus had stolen for them; he punished them by sending them Pandora and her box, which was filled with countless evils such as hard labor. The evils with which human life is beset cannot be traced to human sin. Hesiod conveys the same message by his story of the five races of men which came into being successively. The first of these, the golden race, was made by the gods while Kronos was still ruling in heaven. These men lived without toil or grief; they had all good things in abundance because the earth by itself gave them abundant fruit. Yet the men made by father Zeus lack this bliss. Hesiod does not make clear whether this is due to Zeus’s ill-will or to his lack of power; he gives us no reason to think that it is due to man’s sin.
He creates the impression that human life becomes ever more miserable as one race of men succeeds another: there is no divine promise, supported by the fulfillment of earlier divine promises, that permits one to trust and to hope.
The most striking difference between the poet Hesiod and the philosophers Parmenides and Empedocles is that according to the philosophers, not everything has come into being: that which truly is, has not come into being and does not perish. This does not necessarily mean that what always exists is a god or gods. For if Empedocles calls one of the eternal four elements Zeus, this Zeus has hardly anything in common with what Hesiod, or the people generally, understood by Zeus. At any rate, according to both philosophers, the gods as ordinarily understood have come into being, just like heaven and earth, and will therefore perish again.
At the time when the opposition between Jerusalem and Athens reached the level of what one may call its classical struggle, in the 12th and 13th centuries, philosophy was represented by Aristotle. The Aristotelian god, like the biblical God, is a thinking being, but in opposition to the biblical God he is only a thinking being, pure thought: pure thought that thinks itself and only itself. Only by thinking himself and nothing but himself does he rule the world. He surely does not rule by giving orders and laws. Hence he is not a creator-god: the world is as eternal as god. Man is not his image: man is much lower in rank than other parts of the world. For Aristotle it is almost a blasphemy to ascribe justice to his god; he is above justice as well as injustice.
It has often been said that the philosopher who comes closest to the Bible is Plato. This was said not least during the classical struggle between Jerusalem and Athens in the Middle Ages. Both Platonic philosophy and biblical piety are animated by the concern with purity and purification: “pure reason” in Plato’s sense is closer to the Bible than “pure reason” in Kant’s sense or for that matter in Anaxagoras’s and Aristotle’s sense. Plato teaches, just as the Bible does, that heaven and earth were created or made by an invisible God whom he calls the Father, who is eternal, who is good, and hence whose creation is good. The coming-into-being and the preservation of the world that he has created depend on the will of its maker. What Plato himself calls theology consists of two teachings: (1) God is good and hence in no way the cause of evil; (2) God is simple and hence unchangeable. On the question of divine concern with men’s justice and injustice, Platonic teaching is in fundamental agreement with biblical teaching; it even culminates in a statement that agrees almost literally with biblical statements. Yet the differences between the Platonic and biblical teachings are no less striking than the similarities. The Platonic teaching on creation does not claim to be more than a likely tale. The Platonic God is a creator also of gods, of visible living beings, i.e., of the stars; the created gods rather than the creator God create the mortal living beings and in particular man; heaven is a blessed god. The Platonic God does not create the world by his word; he creates it after having looked to the eternal ideas which therefore are higher than he.
In accordance with this, Plato’s explicit theology is presented within the context of the first discussion of education in the Republic, within the context of what one may call the discussion of elementary education; in the second and final discussion of education—the education of philosophers—theology is replaced by the doctrine of ideas. As for the thematic discussion of providence in the Laws, it may suffice here to say that it occurs within the context of the discussion of penal law.
In his likely tale of how God created the visible whole, Plato makes a distinction between two kinds of gods, the visible cosmic gods and the traditional gods—between the gods who revolve manifestly, i.e., who manifest themselves regularly, and the gods who manifest themselves so far as they will. The least one would have to say is that according to Plato the cosmic gods are of much higher rank than the traditional gods, the Greek gods. Inasmuch as the cosmic gods are accessible to man as man—to his observations and calculations—whereas the Greek gods are accessible only to the Greeks through Greek tradition, one may, in comic exaggeration, ascribe the worship of the cosmic gods to barbarians. This ascription is made in a manner and with an intention altogether non-comic in the Bible: Israel is forbidden to worship the sun and the moon and the stars which the Lord has allotted to the other peoples everywhere under heaven. This implies that the worship of the cosmic gods by other peoples, the barbarians, is not due to a natural or rational cause, to the fact that those gods are accessible to man as man, but to an act of God’s will. It goes without saying that according to the Bible the God Who manifests Himself as far as He wills, Who is not universally worshipped as such, is the only true God. The Platonic statement taken in conjunction with the biblical statement brings out the fundamental opposition of Athens at its peak to Jerusalem: the opposition of the God or gods of the philosophers to the God of Abraham, Isaac, and Jacob, the opposition of reason and revelation.
II. On Socrates and the Prophets
Fifty years ago, in the middle of World War I, Hermann Cohen, the greatest representative of, and spokesman for, German Jewry, the most powerful figure among the German professors of philosophy of his time, stated his view on Jerusalem and Athens in a lecture entitled “The Social Ideal in Plato and the Prophets.” He repeated that lecture shortly before his death, and we may regard it as stating his final view on Jerusalem and Athens and therewith on the truth. For, as Cohen says right at the beginning, “Plato and the prophets are the two most important sources of modern culture.” Being concerned with “the social ideal,” he does not say a single word about Christianity in the whole lecture.
Cohen’s view may be restated as follows. The truth is the synthesis of the teachings of Plato and the prophets. What we owe to Plato is the insight that the truth is in the first place the truth of science but that science must be supplemented, overarched, by the idea of the good which to Cohen means, not God, but rational, scientific ethics. The ethical truth must not only be compatible with the scientific truth; the ethical truth needs the scientific truth. The prophets are very much concerned with knowledge: with the knowledge of God. But this knowledge, as the prophets understood it, has no connection whatever with scientific knowledge; it is knowledge only in a metaphorical sense. It is perhaps with a view to this fact that Cohen speaks once of the divine Plato but never of the divine prophets. Why then can he not leave matters at Platonic philosophy? What is the fundamental defect of Platonic philosophy that is remedied by the prophets and only by the prophets? According to Plato, the cessation of evil requires the rule of the philosophers, of the men who possess the highest kind of human knowledge, i.e., of science in the broadest sense of the term. But this kind of knowledge like, to some extent, all scientific knowledge, is, according to Plato, the preserve of a small minority: of the men who possess a certain nature and certain gifts that most men lack. Plato presupposes that there is an unchangeable human nature and, as a consequence, a fundamental structure of the good human society which is unchangeable. This leads him to assert or to assume that there will be wars as long as there will be human beings, that there ought to be a class of warriors and that the class ought to be higher in rank and honor than the class of producers and exchangers. 
These defects in Plato’s system are remedied by the prophets precisely because they lack the idea of science and hence the idea of nature, and therefore they can believe that men’s conduct toward one another can undergo a change much more radical than any change ever dreamed of by Plato.
Cohen brought out very well the antagonism between Plato and the prophets. Nevertheless we cannot leave matters at his view of that antagonism. Cohen’s thought belongs to the world preceding World War I, and accordingly reflects a greater faith in the power of modern Western culture to mold the fate of mankind than seems to be warranted now. The worst things experienced by Cohen were the Dreyfus scandal and the pogroms instigated by Tsarist Russia: he did not experience Communist Russia and Hitler Germany. More disillusioned than he regarding modern culture, we wonder whether the two separate ingredients of modern culture, of the modern synthesis, are not more solid than the synthesis itself. Catastrophes and horrors of a magnitude hitherto unknown, which we have seen and through which we have lived, were better provided for, or made intelligible, by both Plato and the prophets than by the modern belief in progress. Since we are less certain than Cohen was that the modern synthesis is superior to its pre-modern ingredients, and since the two ingredients are in fundamental opposition to each other, we are ultimately confronted by a problem rather than by a solution.
More particularly, Cohen understood Plato in the light of the opposition between Plato and Aristotle—an opposition that he understood in turn in the light of the opposition between Kant and Hegel. We, however, are more impressed than Cohen was by the kinship between Plato and Aristotle on the one hand and the kinship between Kant and Hegel on the other. In other words, the quarrel between the ancients and the moderns seems to us to be more fundamental than either the quarrel between Plato and Aristotle or that between Kant and Hegel.
We, moreover, prefer to speak of Socrates and the prophets rather than of Plato and the prophets, and for the following reasons. We are no longer as sure as Cohen was that we can draw a clear line between Socrates and Plato. There is traditional support for drawing such a clear line, above all in Aristotle; but Aristotle’s statements on this kind of subject no longer possess for us the authority that they formerly possessed, and this is due partly to Cohen himself. The clear distinction between Socrates and Plato is based not only on tradition, but on the results of modern historical criticism; yet these results are in the decisive respect hypothetical. The decisive fact for us is that Plato points, as it were, away from himself to Socrates. If we wish to understand Plato, we must take him seriously; we must take seriously in particular his deference to Socrates. Plato points not only to Socrates’s speeches but to his whole life, and to his fate as well. Hence Plato’s life and fate do not have the symbolic character of Socrates’s life and fate. Socrates, as presented by Plato, had a mission; Plato did not claim to have a mission. It is in the first place this fact—the fact that Socrates had a mission—that induces us to consider, not Plato and the prophets, but Socrates and the prophets.
I cannot speak in my own words of the mission of the prophets. Let me, however, remind the reader of some prophetic utterances of singular force and grandeur. Isaiah 6:
In the year that King Uzziah died I saw also the Lord sitting upon a throne, high and lifted up, and his train filled the temple. Above it stood the seraphim: each one had six wings; with twain he covered his face, and with twain he covered his feet, and with twain he did fly. And one cried unto another, and said, Holy, holy, holy is the Lord of hosts: the whole world is full of his glory. . . . Then I said, Woe is me! for I am undone; because I am a man of unclean lips, and I dwell in the midst of a people of unclean lips. . . . Then flew one of the seraphim unto me, having a live coal in his hand, which he had taken with the tongs from off the altar: And he laid it upon my mouth, and said, Lo, this hath touched thy lips; and thine iniquity is taken away, and thy sin purged. Also I heard the voice of the Lord, saying, Whom shall I send, and who will go for us? Then said I, Here am I; send me.
Isaiah, it seems, volunteered for his mission. Could he not have remained silent? Could he have refused to volunteer? When the word of the Lord came unto Jonah, “Arise, go to Nineveh, that great city, and cry against it; for their wickedness is come up before me,” “Jonah rose up to flee unto Tarshish from the presence of the Lord”; Jonah ran away from his mission; but God did not allow him to run away; He compelled him to fulfill it. Of this compulsion we hear in different ways from Amos and Jeremiah. Amos 3:7-8: “Surely the Lord God will do nothing but he revealeth his secret unto his servants the prophets. The lion hath roared, who will not fear? The Lord God hath spoken; who will not prophesy?” The prophets, overpowered by the majesty of the Lord, bring the message of His wrath and His mercy. Jeremiah 1:4-10:
Then the word of the Lord came unto me, saying, Before I formed thee in the belly I knew thee and before thou camest out of the womb I sanctified thee, and I ordained thee a prophet unto the nations. Then said I, Ah, Lord God! behold, I cannot speak; for I am a child. But the Lord said unto me, Say not, I am a child; for thou shalt go to all that I shall send thee, and whatsoever I command thee thou shalt speak. . . . Then the Lord put forth his hand, and touched my mouth. And the Lord said unto me, Behold I have put my words in thy mouth. See, I have this day set thee over the nations and over the kingdoms, to root out, and to pull down, and to destroy, and to throw down, to build, and to plant.
To be sure, the claim to have been sent by God was raised also by men who were not truly prophets but prophets of falsehood, false prophets. Many or most hearers were therefore uncertain as to which kinds of claimants to prophecy were to be trusted or believed. According to the Bible, the false prophets simply lied in saying that they were sent by God. The false prophets tell the people what the people like to hear; hence they are much more popular than the true prophets. The false prophets are “prophets of the deceit of their own heart” (ibid. 26); they tell the people what they themselves imagined (consciously or unconsciously) because they wished it or their hearers wished it. But: “Is not my word like as a fire? saith the Lord, and like a hammer that breaketh rock in pieces?” (ibid. 29). Or, as Jeremiah put it when opposing the false prophet, Hananiah: “The prophets that have been before me and before thee of old prophesied both against many countries, and against great kingdoms, of war, and of evil, and of pestilence” (28:8). This does not mean that a prophet is true only if he is a prophet of doom; the true prophets are also prophets of ultimate salvation. We understand the difference between the true and the false prophets if we listen to and meditate on these words of Jeremiah: “Thus saith the Lord; Cursed is the man, that trusteth in man, and maketh flesh his arm, and whose heart departeth from the Lord. . . . Blessed is the man that trusteth in the Lord, and whose hope the Lord is.” The false prophets trust in flesh, even if that flesh is the temple in Jerusalem, the promised land, the chosen people itself, or even God’s promise to the chosen people (if that promise is taken to be an unconditional promise and not as a part of a covenant). The true prophets, regardless of whether they predict doom or salvation, predict the unexpected, the humanly unforeseeable—what would not occur to men, left to themselves, to fear or to hope.
The true prophets speak and act by the spirit and in the spirit of Ehyeh-asher-ehyeh. For the false prophets, on the other hand, there cannot be the wholly unexpected, whether bad or good.
Of Socrates’s mission we know only through Plato’s Apology of Socrates, which presents itself as the speech delivered by Socrates when he defended himself against the charge that he did not believe in the existence of the gods worshipped by the city of Athens and that he corrupted the young. In that speech he denies possessing any more than human wisdom. This denial was understood by Judah Halevi among others as follows: “Socrates said to the people: ‘I do not deny your divine wisdom, but I say that I do not understand it; I am wise only in human wisdom.’”5 While this interpretation points in the right direction, it goes somewhat too far. Socrates, at least, immediately after having denied possessing anything more than human wisdom, refers to the speech that originated his mission, and of this speech he says that it is not his, but he seems to ascribe to it a divine origin. He does trace what he says to a speaker who is worthy of the Athenians’ credence. But it is probable that he means by that speaker his companion, Chairephon, who is more worthy of credence than Socrates because he was attached to the democratic regime. This Chairephon, having once come to Delphi, asked Apollo’s oracle whether there was anyone wiser than Socrates. The Pythia replied that no one was wiser. This reply originated Socrates’s mission. We see at once that Socrates’s mission originated in human initiative, in the initiative of one of Socrates’s companions. Socrates, on the other hand, takes it for granted that the reply given by the Pythia was given by the god Apollo himself. Yet this does not induce him to take it for granted that the god’s reply is true. He does take it for granted that it is not meet for the god to lie. Yet this does not make the god’s reply convincing to him. In fact he tries to refute that reply by discovering men who are wiser than he.
Engaging in this quest, he finds out that the god spoke the truth: Socrates is wiser than other men because he knows nothing, i.e., nothing about the most important things, whereas the others believe that they know the truth about the most important things. Thus his attempt to refute the oracle turns into a vindication of the oracle. Without intending it, he comes to the assistance of the god; he serves the god; he obeys the god’s command. Although no god had ever spoken to him, he is satisfied that the god had commanded him to examine himself and the others, i.e., to philosophize, or to exhort everyone he meets to the practice of virtue: he has been given by the god to the city of Athens as a gadfly.
While Socrates does not claim to have heard the speech of a god, he claims that a voice—something divine and demonic—speaks to him from time to time, his daimonion. This daimonion, however, has no connection with Socrates’s mission, for it never urges him forward but only keeps him back. While the Delphic oracle urged him forward toward philosophizing, toward examining his fellow men, and thus made him generally hated and thus brought him into mortal danger, his daimonion kept him back from political activity and thus saved him from mortal danger.
The fact that both Socrates and the prophets have a divine mission means, or at any rate implies, that both Socrates and the prophets are concerned with justice or righteousness, with the perfectly just society which, as such, would be free of all evils. To this extent Socrates’s figuring out of the best social order and the prophets’ vision of the messianic age are in agreement. Yet whereas the prophets predict the coming of the messianic age, Socrates merely holds that the perfect society is possible: whether it will ever be actual depends on an unlikely, although not impossible, coincidence, the coincidence of philosophy and political power. For, according to Socrates, the coming-into-being of the best political order is not due to divine intervention; human nature will remain as it always has been; the decisive difference between the best political order and all other societies is that in the former the philosophers will be kings or the natural potentiality of the philosophers will reach its utmost perfection. In the most perfect social order, as Socrates sees it, knowledge of the most important things will remain, as it always was, the preserve of the philosophers, i.e., of a very small part of the population. According to the prophets, however, in the messianic age “the earth shall be full of the knowledge of the Lord, as the waters cover the sea” (Isaiah 11:9), and this will be brought about by God Himself. As a consequence, the messianic age will be the age of universal peace: all nations shall come to the mountain of the Lord, to the house of the God of Jacob, “and they shall beat their swords into plowshares, and their spears into pruning hooks: nation shall not lift up sword against nation, neither shall they learn war any more” (Isaiah 2:2-4). The best regime, however, as Socrates envisages it, will animate a single city which, as a matter of course, will become embroiled in wars with other cities.
The cessation of evils that Socrates expects from the establishment of the best regime will not include the cessation of war.
Finally, the perfectly just man, the man who is as just as is humanly possible, is, according to Socrates, the philosopher; according to the prophets, he is the faithful servant of the Lord. The philosopher is the man who dedicates his life to the quest for knowledge of the good, of the idea of the good; what we would call moral virtue is only the condition or by-product of that quest. According to the prophets, however, there is no need for the quest for knowledge of the good: God “hath shewed thee, O man, what is good; and what doth the Lord require of thee, but to do justly, and to love mercy, and to walk humbly with thy God” (Micah 6:8).
1 Cf. U. Cassuto, A Commentary on the Book of Genesis, Part I, Jerusalem, 1961, p. 42.
2 Cf. the characterization of the plants as ἔγγεια (“in or of the earth”) in Plato’s Republic, 491 d 1. Cf. Empedocles A 70.
3 Cf. the distinction between the two kinds of “other gods” in Deut. 4:15-19, between the idols on the one hand and sun, moon, and stars on the other.
4 Compare Plato’s Laws 905 a 4-b 2 with Amos 9:1-3 and Psalm 139:7-10.
5 Kuzari IV, 13 and V, 14.
Jerusalem and Athens: Some Introductory Reflections
Exactly one week later, a Star Wars cantina of the American extremist right featuring everyone from David Duke to a white-nationalist Twitter personality named “Baked Alaska” gathered in Charlottesville, Virginia, to protest the removal of a statue honoring the Confederate general Robert E. Lee. A video promoting the gathering railed against “the international Jewish system, the capitalist system, and the forces of globalism.” Amid sporadic street battles between far-right and “antifa” (anti-fascist) activists, a neo-Nazi drove a car into a crowd of peaceful counterprotestors, killing a 32-year-old woman.
Here, in the time span of just seven days, was the dual nature of contemporary American anti-Semitism laid bare. The most glaring difference between these two displays of hate lies not so much in their substance—both adhere to similar conspiracy theories articulating nefarious, world-altering Jewish power—but rather their self-characterization. The animosity expressed toward Jews in Charlottesville was open and unambiguous, with demonstrators proudly confessing their hatred in the familiar language of Nazis and European fascists.
The socialists in Chicago, meanwhile, though calling for a literal second Holocaust on the shores of the Mediterranean, would fervently and indignantly deny they are anti-Semitic. On the contrary, they claim the mantle of “anti-fascism” and insist that this identity naturally makes them allies of the Jewish people. As for those Jews who might oppose their often violent tactics, they are at best bystanders to fascism, at worst collaborators in “white supremacy.”
So, whereas white nationalists explicitly embrace a tribalism that excludes Jews regardless of their skin color, the progressives of the DSA and the broader “woke” community conceive of themselves as universalists—though their universalism is one that conspicuously excludes the national longings of Jews and Jews alone. And whereas the extreme right-wingers are sincere in their anti-Semitism, the socialists who called for the elimination of Israel are just as sincere in their belief that they are not anti-Semitic, even though anti-Semitism is the inevitable consequence of their rhetoric and worldview.
The sheer bluntness of far-right anti-Semitism makes it easier to identify and stigmatize as beyond the pale; individuals like David Duke and the hosts of the “Daily Shoah” podcast make no pretense of residing within the mainstream of American political debate. But the humanist appeals of the far left, whose every libel against the Jewish state is paired with a righteous invocation of “justice” for the Palestinian people, invariably trigger repetitive and esoteric debates over whether this or that article, allusion, allegory, statement, policy, or political initiative is anti-Semitic or just critical of Israel. What this difference in self-definition means is that there is rarely, if ever, any argument about the substantive nature of right-wing anti-Semitism (despicable, reprehensible, wicked, choose your adjective), while the very existence of left-wing anti-Semitism is widely doubted and almost always indignantly denied by those accused of practicing it.

To be sure, these recent manifestations of anti-Semitism occur on the left and right extremes. And statistics tell a rather comforting story about the state of anti-Semitism in America. Since the Anti-Defamation League began tracking it in 1979, anti-Jewish hate crime is at an historic low; indeed, it has been declining since a recent peak of 1,554 incidents in 2006. America, for the most part, remains a very philo-Semitic country, one of the safest, most welcoming countries for Jews on earth. A recent Pew poll found Jews to be the most admired religious group in the United States.1 If American Jews have anything to dread, it’s less anti-Semitism than the loss of Jewish peoplehood through assimilation, that is, being “loved to death” by Gentiles.2 Few American Jews can say that anti-Semitism has a seriously deleterious impact on their lives, that it has denied them educational or employment opportunities, or that they fear for the physical safety of themselves or their families because of their Jewish identity.
The question is whether the extremes are beginning to move in on the center. In the past year alone, the DSA’s rolls tripled from 8,000 to 25,000 dues-paying members, who have established a conspicuous presence on social media reaching far beyond what their relatively minuscule numbers would suggest. The DSA has been the subject of widespread media coverage, ranging from the curious to the adulatory. The white supremacists, meanwhile, found themselves understandably heartened by the strange difficulty President Donald Trump had in disavowing them. He claimed, in fact, that there had been some “very fine people” among their ranks. “Thank you President Trump for your honesty & courage to tell the truth about #Charlottesville,” tweeted David Duke, while the white-nationalist Richard Spencer said, “I’m proud of him for speaking the truth.”
Indeed, among the more troubling aspects of our highly troubling political predicament—and one that, from a Jewish perspective, provokes not a small amount of angst—is that so many ideas, individuals, and movements that could once reliably be categorized as “extreme,” in the literal sense of articulating the views of a very small minority, are no longer so easily dismissed. The DSA is part of a much broader revival of the socialist idea in America, as exemplified by the growing readership of journals like Jacobin and Current Affairs, the popularity of the leftist Chapo Trap House podcast, and the insurgent presidential campaign of self-described democratic socialist Bernie Sanders—who, according to a Harvard-Harris poll, is now the most popular politician in the United States. Since 2015, the average age of a DSA member dropped from 64 to 30, and a 2016 Harvard poll found a majority of Millennials do not support capitalism.
Meanwhile, the Republican Party of Donald Trump offers “nativism and culture war wedges without the Reaganomics,” according to Nicholas Grossman, a lecturer in political science at the University of Illinois. A party that was once reliably internationalist and assertive against Russian aggression now supports a president who often preaches isolationism and never has even a mildly critical thing to say about the KGB thug ruling over the world’s largest nuclear arsenal.
Like ripping the bandage off an ugly and oozing wound, Trump’s presidential campaign unleashed a bevy of unpleasant social forces that at the very least have an indirect bearing on Jewish welfare. The most unpleasant of those forces has been the so-called alternative right, or “alt-right,” a highly race-conscious political movement whose adherents are divided on the “JQ” (Jewish Question). Throughout last year’s campaign, Jewish journalists (this author included) were hit with a barrage of luridly anti-Semitic Twitter messages from self-described members of the alt-right. The tamer missives instructed us to leave America for Israel; others superimposed our faces onto the bodies of concentration camp victims.3
I do not believe Donald Trump is himself an anti-Semite, if only because anti-Semitism is mainly a preoccupation—as distinct from a prejudice—and Trump is too narcissistic to indulge any preoccupation other than himself. And there is no evidence to suggest that he subscribes to the anti-Semitic conspiracy theories favored by his alt-right supporters. But his casual resort to populism, nativism, and conspiracy theory creates a narrative environment highly favorable to anti-Semites.
Nativism, of which Trump was an early and active practitioner, is never good for the Jews, no matter how affluent or comfortable they may be and notwithstanding whether they are even the target of its particular wrath. Racial divisions, which by any measure have grown significantly worse in the year since Trump was elected, hurt all Americans, obviously, but they have a distinct impact on Jews, who are left in a precarious position as racial identities calcify. Not only are the newly emboldened white supremacists of the alt-right invariably anti-Semites, but in the increasingly racialist taxonomy of the progressive left—which more and more mainstream liberals are beginning to parrot—Jews are considered possessors of “white privilege” and, thus, members of the class to be divested of its “power” once the revolution comes. In the racially stratified society that both extremes envision, Jews lose out, simultaneously perceived (by the far right) as wily allies and manipulators of ethnic minorities in a plot to mongrelize America and (by the far left) as opportunistic “Zionists” ingratiating themselves with a racist and exploitative “white” establishment that keeps minorities down.

This politics is bad for all Americans, and for Jewish Americans in particular. More and more, one sees the racialized language of the American left being applied to the Middle East conflict, wherein Israel (which is, in point of fact, one of the most racially diverse countries in the world) is referred to as a “white supremacist” state no different from that of apartheid South Africa.
In a book just published by MIT Press, ornamented with a foreword by Cornel West and entitled “Whites, Jews, and Us,” a French-Algerian political activist named Houria Bouteldja asks, “What can we offer white people in exchange for their decline and for the wars that will ensue?” Drawing the Jews into her race war, Bouteldja, according to the book’s jacket copy, “challenges widespread assumptions among the left in the United States and Europe—that anti-Semitism plays any role in Arab–Israeli conflicts, for example, or that philo-Semitism doesn’t in itself embody an oppressive position.” Jew-hatred is virtuous, and appreciation of the Jews is racism.
Few political activists of late have done more to racialize the Arab–Israeli conflict—and, through insidious extension of the American racial hierarchy, designate American Jews as oppressors—than the Brooklyn-born Arab activist Linda Sarsour. An organizer of the Women’s March, Sarsour has seamlessly insinuated herself into a variety of high-profile progressive campaigns, a somewhat incongruous position given her reactionary views on topics like women’s rights in Saudi Arabia. (“10 weeks of PAID maternity leave in Saudi Arabia,” she tweets. “Yes PAID. And ur worrying about women driving. Puts us to shame.”) Sarsour, who is of Palestinian descent, claims that one cannot simultaneously be a feminist and a Zionist, when it is the exact opposite that is true: No genuine believer in female equality can deny the right of Israel to exist. The Jewish state respects the rights of women more than do any of its neighbors. In an April 2017 interview, Sarsour said that she had become a high-school teacher for the purpose of “inspiring young people of color like me.” Just three months earlier, however, in a video for Vox, Sarsour confessed, “When I wasn’t wearing hijab I was just some ordinary white girl from New York City.” The donning of Muslim garb, then, confers a racial caste of “color,” which in turn confers virtue, which in turn confers a claim on political power.
This attempt to describe the Israeli–Arab conflict in American racial vernacular marks Jews as white (a perverse mirror of Nazi biological racism) and thus implicates them as beneficiaries of “structural racism,” “white privilege,” and the whole litany of benefits afforded to white people at birth in the form of—to use Ta-Nehisi Coates’s abstruse phrase—the “glowing amulet” of “whiteness.” “It’s time to admit that Arthur Balfour was a white supremacist and an anti-Semite,” reads the headline of a recent piece in—where else?—the Forward, incriminating Jewish nationalism as uniquely perfidious by dint of the fact that, like most men of his time, a (non-Jewish) British official who endorsed the Zionist idea a century ago held views that would today be considered racist. Reading figures like Bouteldja and Sarsour brings to mind the French philosopher Pascal Bruckner’s observation that “the racialization of the world has to be the most unexpected result of the antidiscrimination battle of the last half-century; it has ensured that the battle continuously re-creates the curse from which it is trying to break free.”
If Jews are white, and if white people—as a group—enjoy tangible and enduring advantages over everyone else, then this racially essentialist rhetoric ends up with Jews accused of abetting white supremacy, if not being white supremacists themselves. This is one of the overlooked ways in which the term “white supremacy” has become devoid of meaning in the age of Donald Trump, with everyone and everything from David Duke to James Comey to the American Civil Liberties Union accused of upholding it. Take the case of Ben Shapiro, the Jewish conservative polemicist. At the start of the school year, Shapiro was scheduled to give a talk at UC Berkeley, his alma mater. In advance, various left-wing groups put out a call for protest in which they labeled Shapiro—an Orthodox Jew—a “fascist thug” and “white supremacist.” An inconvenient fact ignored by Shapiro’s detractors is that, according to the ADL, he was the top target of online abuse from actual white supremacists during the 2016 presidential election. (Berkeley ultimately had to spend $600,000 protecting the event from leftist rioters.)
A more pernicious form of this discourse is practiced by left-wing writers who, insincerely claiming to have the interests of Jews at heart, scold them and their communal organizations for not doing enough in the fight against anti-Semitism. Criticizing Jews for not fully signing up with the “Resistance” (which in form and function is fast becoming the 21st-century version of the interwar Popular Front), they then slyly indict Jews for being complicit in not only their own victimization but that of the entire country at the hands of Donald Trump. The first and foremost practitioner of this bullying and rather artful form of anti-Semitism is Jeet Heer, a Canadian comic-book critic who has achieved some repute on the American left due to his frenetic Twitter activity and availability when the New Republic needed to replace its staff that had quit en masse in 2014. Last year, when Heer came across a video of a Donald Trump supporter chanting “JEW-S-A” at a rally, he declared on Twitter: “We really need to see more comment from official Jewish groups like ADL on way Trump campaign has energized anti-Semitism.”
But of course “Jewish groups” have had plenty to say about the anti-Semitism expressed by some Trump supporters—too much, in the view of their critics. Just two weeks earlier, the ADL had released a report analyzing over 2 million anti-Semitic tweets targeting Jewish journalists over the previous year. This wasn’t the first time the ADL raised its voice against Trump and the alt-right movement he emboldened, nor would it be the last. Indeed, two minutes’ worth of Googling would have shown Heer that his pronouncements about organizational Jewish apathy were wholly without foundation.4
It’s tempting to dismiss Heer’s observation as mere “concern trolling,” a form of Internet discourse characterized by insincere expressions of worry. But what he did was nastier. Immediately presented with evidence for the inaccuracy of his claims, he sneered back with a bit of wisdom from the Jewish sage Hillel the Elder, yet cast as a mild threat: “If I am not for myself, who will be for me?” In other words: How can you Jews expect anyone to care about your kind if you don’t sufficiently oppose—as arbitrarily judged by moi, Jeet Heer—Donald Trump?
If this sort of critique were coming from a Jewish donor upset that his preferred organization wasn’t doing enough to combat anti-Semitism, or a Gentile with a proven record of concern for Jewish causes, it wouldn’t have turned the stomach. What made Heer’s interjection revolting is that, to put it mildly, he’s not exactly known for being sympathetic toward the Jewish plight. In 2015, Heer put his name to a petition calling upon an international comic-book festival to drop the Israeli company SodaStream as a co-sponsor because the Jewish state is “built on the mass ethnic cleansing of Palestinian communities and sustained through racism and discrimination.” Heer’s name appeared alongside that of Carlos Latuff, a Brazilian cartoonist who won second place in the Iranian government’s 2006 International Holocaust Cartoon Competition. For his writings on Israel, Heer has been praised as being “very good on the conflict” by none other than Philip Weiss, proprietor of the anti-Semitic hate site Mondoweiss.
In light of this track record, Heer’s newfound concern about anti-Semitism appeared rather dubious. Indeed, the bizarre way in which he expressed this concern—as, ultimately, a critique not of anti-Semitism per se but of the country’s foremost Jewish civil-rights organization—suggests he cares about anti-Semitism insofar as its existence can be used as a weapon to beat his political adversaries. And since the incorrigibly Zionist American Jewish establishment ranks high on that list (just below that of Donald Trump and his supporters), Heer found a way to blame it for anti-Semitism. And what does that tell you? It tells you that—presented with a 16-second video of a man chanting “JEW-S-A” at a Donald Trump rally—Heer’s first impulse was to condemn not the anti-Semite but the Jews.
Heer isn’t the only leftist (or New Republic writer) to assume this rhetorical cudgel. In a piece entitled “The Dismal Failure of Jewish Groups to Confront Trump,” one Stephen Lurie attacked the ADL for advising its members to stay away from the Charlottesville “Unite the Right Rally” and let police handle any provocations from neo-Nazis. “We do not have a Jewish organizational home for the fight against fascism,” he quotes a far-left Jewish activist, who apparently thinks that we live in the Weimar Republic and not a stable democracy in which law-enforcement officers and not the balaclava-wearing thugs of antifa maintain the peace. Like Jewish Communists of yore, Lurie wants to bully Jews into abandoning liberalism for the extreme left, under the pretext that mainstream organizations just won’t cut it in the fight against “white supremacy.” Indeed, Lurie writes, some “Jewish institutions and power players…have defended and enabled white supremacy.” The main group he fingers with this outrageous slander is the Republican Jewish Coalition, the implication being that this explicitly partisan Republican organization’s discreet support for the Republican president “enables white supremacy.”
It is impossible to imagine Heer, Lurie, or other progressive writers similarly taking the NAACP to task for its perceived lack of concern about racism, or castigating the Human Rights Campaign for insufficiently combating homophobia. No, it is only the cowardice of Jews that is condemned—condemned for supposedly ignoring a form of bigotry that, when expressed on the left, these writers themselves ignore or even defend. The logical gymnastics of these two New Republic writers is what happens when, at base, one fundamentally resents Jews: You end up blaming them for anti-Semitism. Blaming Jews for not caring enough about anti-Semitism is emotionally the same as claiming that Jews are to blame for anti-Semitism. Both signal an envy and resentment of Jews predicated upon a belief that they have some kind of authority that the claimant doesn’t and therefore needs to undermine.

This past election, one could not help but notice how the media seemingly discovered anti-Semitism when it emanated from the right, and then only when its targets were Jews on the left. It was enough to make one ask where they had been when left-wing anti-Semitism had been a more serious and pervasive problem. From at least 1996 (the year Pat Buchanan made his last serious attempt at securing the GOP presidential nomination) to 2016 (when the Republican presidential nominee did more to earn the support of white supremacists and neo-Nazis than any of his predecessors), anti-Semitism was primarily a preserve of the American left. In that two-decade period—spanning the collapse of the Oslo Accords and rise of the Second Intifada to the rancorous debate over the Iraq War and obsession with “neocons” to the presidency of Barack Obama and the 2015 Iran nuclear deal—anti-Israel attitudes and anti-Semitic conspiracy made unprecedented inroads into respectable precincts of the American academy, the liberal intelligentsia, and the Democratic Party.
The main form that left-wing anti-Semitism takes in the United States today is unhinged obsession with the wrongs, real or perceived, of the state of Israel, and the belief that its Jewish supporters in the United States exercise a nefarious control over the levers of American foreign policy. In this respect, contemporary left-wing anti-Semitism is not altogether different from that of the far right, though it usually lacks the biological component deeming Jews a distinct and inferior race. (Consider the left-wing anti-Semite’s eagerness to identify and promote Jewish “dissidents” who can attest to their co-religionists’ craftiness and deceit.) The unholy synergy of left and right anti-Semitism was recently epitomized by former CIA agent and liberal stalwart Valerie Plame’s hearty endorsement, on Twitter, of an article written for an extreme right-wing website by a fellow former CIA officer named Philip Giraldi: “America’s Jews Are Driving America’s Wars.” Plame eventually apologized for sharing the article with her 50,000 followers, but not before insisting that “many neocon hawks are Jewish” and that “just FYI, I am of Jewish descent.”
The main forum in which left-wing anti-Semitism appears is academia. According to the ADL, anti-Semitic incidents on college campuses doubled from 2014 to 2015, the most recent year for which data are available. Writing in National Affairs, Ruth Wisse observes that “not since the war in Vietnam has there been a campus crusade as dynamic as the movement of Boycott, Divestment, and Sanctions against Israel.” Every academic year, a seeming surfeit of controversies erupts on campuses across the country involving the harassment of pro-Israel students and organizations, the disruption of events involving Israeli speakers (even ones who identify as left-wing), and blatantly anti-Semitic outbursts by professors and student activists. There was the Oberlin professor of rhetoric, Joy Karega, who posted statements on social media claiming that Israel had created ISIS and had orchestrated the murderous attack on Charlie Hebdo in Paris. There is the Rutgers associate professor of women’s and gender studies, Jasbir Puar, who popularized the ludicrous term “pinkwashing” to defame Israel’s LGBT acceptance as a massive conspiracy to obscure its oppression of Palestinians. Her latest book, The Right to Maim, academically peer-reviewed and published by Duke University Press, attacks Israel for sparing the lives of Palestinian civilians, accusing its military of “shooting to maim rather than to kill” so that it may keep “Palestinian populations as perpetually debilitated, and yet alive, in order to control them.”
One could go on and on about such affronts not only to Jews and supporters of Israel but to common sense, basic justice, and anyone who believes in the prudent use of taxpayer dollars. That several organizations exist solely to monitor anti-Israel and anti-Semitic agitation on American campuses attests to the pervasiveness of the problem. But it’s unclear just how representative these isolated examples really are of the college experience. A 2017 Stanford study purporting to examine the issue interviewed 66 Jewish students at five California campuses noted for “being particularly fertile for anti-Semitism and for having an active presence of student groups critical of Israel and Zionism.” It concluded that “contrary to widely shared impressions, we found a picture of campus life that is neither threatening nor alarmist…students reported feeling comfortable on their campuses, and, more specifically, comfortable as Jews on their campuses.” To the extent that Jewish students do feel pressured, the report attempted to spread the blame around, indicting pro-Israel activists alongside those agitating against the Jewish state: “[Survey respondents] fear that entering political debate, especially when they feel the social pressures of both Jewish and non-Jewish activist communities, will carry social costs that they are unwilling to bear.”
Yet by its own admission, the report “only engaged students who were either unengaged or minimally engaged in organized Jewish life on their campuses.” Researchers made a study of anti-Semitism, then, by interviewing the Jews least likely to experience it. “Most people don’t really think I’m Jewish because I look very Latina…it doesn’t come up in conversation,” one such student said in an interview. Ultimately, the report revealed more about the attitudes of unengaged (and, thus, uninformed) Jews than about the state of anti-Semitism on college campuses. That may certainly be useful in its own right as a means of understanding how unaffiliated Jews view debates over Israel, but it is not an accurate marker of developments on college campuses more broadly.
A more extensive 2016 Brandeis study of Jewish students at 50 schools found that 34 percent agreed at least “somewhat” that their campus has a hostile environment toward Israel. Yet the variation was wide; at some schools, only 3 percent agreed, while at others, 70 percent did. Only 15 percent reported a hostile environment toward Jews. Anti-Semitism was found to be more prevalent at public universities than private ones, with the determinative factor being the presence of a Students for Justice in Palestine chapter on campus. Important context often lost in conversations about campus anti-Semitism, and reassuring to those concerned about it, is that it is simply not the most important issue roiling higher education. “At most schools,” the report found, “fewer than 10 percent of Jewish students listed issues pertaining to either Jews or Israel as among the most pressing on campus.”

For generations, American Jews have depended on anti-Semitism’s remaining within a moral quarantine, a cordon sanitaire, and America has reliably kept this societal virus contained. While there are no major signs that this barricade is breaking down in the immediate future, there are worrying indications on the political horizon.
Surveying the situation at the international level, the declining global position of the United States—both in terms of its hard military and economic power relative to rising challengers and its standing as a credible beacon of liberal democratic values—does not portend well for Jews, American or otherwise. American leadership of the free world has, in addition to ensuring Israel’s security, underwritten the postwar liberal world order. And it is the constituent members of that order, the liberal democratic states, that have served as the best guarantors of Jewish life and safety in the Jews’ long history. Were America’s global leadership role to diminish or evaporate, it would not only facilitate the rise of authoritarian states like Iran and terrorist movements such as al-Qaeda, committed to the destruction of Israel and the murder of Jews, but inexorably lead to a worldwide rollback of liberal democracy, an outcome that would inevitably redound to the detriment of Jews.
Domestically, political polarization and the collapse of public trust in every American institution save the military are demolishing what little confidence Americans have left in their system and governing elites, not to mention preparing the ground for some ominous political scenarios. Widely cited survey data reveal that the percentage of American Millennials who believe it “essential” to live in a liberal democracy hovers at just over 25 percent. If Trump is impeached or loses the next election, a good 40 percent of the country will be outraged and susceptible to belief in a stab-in-the-back theory accounting for his defeat. Whom will they blame? Perhaps the “neoconservatives,” who disproportionately make up the ranks of Trump’s harshest critics on the right?
Ultimately, the degree to which anti-Semitism becomes a problem in America hinges on the strength of the antibodies in the country’s communal DNA protecting its pluralistic and liberal values. But even if this resistance to tribalism and the cult of personality is strong, it may not be enough to arrest the rise of an intellectual and societal disease that, throughout history, has thrived on economic distress, xenophobia, political uncertainty, ethnic chauvinism, conspiracy theory, and weakening democratic norms.
1 Somewhat paradoxically, according to FBI crime statistics, the majority of religiously based hate crimes target Jews, more than double the number targeting Muslims. This reflects the commitment of the country’s relatively small number of hard-core anti-Semites more than it does pervasive anti-Semitism.
4 The ADL has had to maintain a delicate balancing act in the age of Trump, coming under fire from many conservative Jews for a perceived partisan tilt against the right. This makes Heer’s complaint all the more ignorant — and unhelpful.
Review of 'The Once and Future Liberal' by Mark Lilla
Lilla, a professor at Columbia University, tells us that “the story of how a successful liberal politics of solidarity became a failed pseudo-politics of identity is not a simple one.” And about this, he’s right. Lilla quotes from the feminist authors of the 1977 Combahee River Collective Manifesto: “The most profound and potentially most radical politics come directly out of our own identity, as opposed to working to end somebody else’s oppression.” Feminists sought to act on the “radical” and electrifying insistence that “the personal is political.” The phrase, argues Lilla, was generally understood in “a somewhat Marxist fashion to mean that everything that seems personal is in fact political.”
The upshot was fragmentation. White feminists were deemed racist by black feminists—and both were found wanting by lesbians, who also had black and white contingents. “What all these groups wanted,” explains Lilla, “was more than social justice and an end to the [Vietnam] war. They also wanted there to be no space between what they felt inside and what they saw and did in the world.” He goes on: “The more obsessed with personal identity liberals become, the less willing they become to engage in reasoned political debate.” In the end, those on the left came to a realization: “You can win a debate by claiming the greatest degree of victimization and thus the greatest outrage at being subjected to questioning.”
But Lilla’s insights into the emotional underpinnings of political correctness are undercut by an inadequate, almost bizarre sense of history. He appears to be referring to the 1970s when, zigzagging through history, he writes that “no recognition of personal or group identity was coming from the Democratic Party, which at the time was dominated by racist Dixiecrats and white union officials of questionable rectitude.”
What is he talking about? Is Lilla referring to the Democratic Party of Lyndon Johnson, Hubert Humphrey, and George McGovern? Is he referring obliquely to George Wallace? If so, why is Wallace never mentioned? Lilla seems not to know that it was the 1972 McGovern Democratic Convention that introduced convention seats set aside for blacks and women.
At only 140 pages, this is a short book. But even so, Lilla could have devoted a few pages to Frankfurt ideologist Herbert Marcuse and his influence on the left. In the 1960s, Marcuse argued that leftists and liberals were entitled to restrain centrist and conservative speech on the grounds that the universities had to act as a counterweight to society at large. But this was not just rhetoric; in the campus disruption of the early 1970s at schools such as Yale, Cornell, and Amherst, Marcuse’s ideals were pushed to the fore.
If Lilla’s argument comes off as flaccid, perhaps that’s because the aim of The Once and Future Liberal is more practical than principled. “The only way” to protect our rights, he tells the reader, “is to elect liberal Democratic governors and state legislators who’ll appoint liberal state attorneys.” According to Lilla, “the paradox of identity liberalism” is that it undercuts “the things it professes to want,” namely political power. He insists, rightly, that politics has to be about persuasion but then contradicts himself in arguing that “politics is about seizing power to defend the truth.” In other words, Lilla wants a better path to total victory.
Given what Lilla, descending into hysteria, describes as “the Republican rage for destruction,” liberals and Democrats have to win elections lest the civil rights of blacks, women, and gays are rolled back. As proof of the ever-looming danger, he notes that when the “crisis of the mid-1970s threatened…the country turned not against corporations and banks, but against liberalism.” Yet he gives no hint of the trail of liberal failures that led to the crisis of the mid-’70s. You’d never know reading Lilla, for example, that the Black Power movement intensified racial hostilities that were then further exacerbated by affirmative action and busing. And you’d have no idea that, at considerable cost, the poverty programs of the Great Society failed to bring poorer African Americans into the economic mainstream. Nor does Lilla deal with the devotion to Keynesianism that produced inflation without economic growth during the Carter presidency.
Despite his discursive ambling through the recent history of American political life, Lilla has a one-word explanation for identity politics: Reaganism. “Identity,” he writes, is “Reaganism for lefties.” What’s crucial in combating Reaganism, he argues, is to concentrate on our “shared political” status as citizens. “Citizenship is a crucial weapon in the battle against Reaganite dogma because it brings home the fact that we are part of a legitimate common enterprise.” But then he asserts that the “American right uses the term citizenship today as a means of exclusion.” The passage might lead the reader to think that Lilla would take up the question of immigration and borders. But he doesn’t, and the closing passages of the book dribble off into characteristic zigzags. Lilla tells us that “Black Lives Matter is a textbook example of how not to build solidarity” but then goes on, without evidence, to assert the accuracy of the Black Lives Matter claim that African-Americans have been singled out for police mistreatment.
It would be nice to argue that The Once and Future Liberal is a near miss, a book that might have had enduring importance if only it went that extra step. But Lilla’s passing insights on the perils of a politically correct identity politics drown in the rhetoric of conventional bromides that fill most of the pages of this disappointing book.
In Athens several years ago, I had dinner with a man running for the national parliament. I asked him whether he thought he had a shot at winning. He was sure of victory, he told me. “I have hired a very famous political consultant from Washington,” he said. “He is the man who elected Reagan. Expensive. But the best.”
The political genius he then described was a minor political flunky I had met in Washington long ago, a more-or-less anonymous member of the Republican National Committee before he faded from view at the end of Ronald Reagan’s second term. Mutual acquaintances told me he still lived in a nice neighborhood in Northern Virginia, but they never could figure out what the hell he did to earn his money. (This is a recurring mystery throughout the capital.) I had to come to Greece to find the answer.
It is one of the dark arts of Washington, this practice of American political hacks traveling to faraway lands and suckering foreign politicians into paying vast sums for splashy, state-of-the-art, essentially worthless “services.” And it’s perfectly legal. Paul Manafort, who briefly managed Donald Trump’s campaign last summer, was known as a pioneer of the globe-trotting racket. If he hadn’t, as it were, veered out of his gutter into the slightly higher lane of U.S. presidential politics, he likely could have hoovered cash from the patch pockets of clueless clients from Ouagadougou to Zagreb for the rest of his natural life and nobody in Washington would have noticed.
But he veered, and now he and a colleague find themselves indicted by Robert Mueller, the Inspector Javert of the Russian-collusion scandal. When those indictments landed, they instantly set in motion the familiar scramble. Trump fans announced that the indictments were proof that there was no collusion between the Trump campaign and the Russians—or, in the crisp, emphatic phrasing of a tweet by the world’s Number One Trump Fan, Donald Trump: “NO COLLUSION!!!!” The Russian-scandal fetishists in the press corps replied in chorus: It’s still early! Javert required more time, and so will Mueller, and so will they.
A good Washington scandal requires a few essential elements. One is a superabundance of information. From these data points, conspiracy-minded reporters can begin to trace associations, warranted or not, and from the associations, they can infer motives and objectives with which, stretched together, they can limn a full-blown conspiracy theory. The Manafort indictment released a flood of new information, and at once reporters were pawing for nuggets that might eventually form a compelling case for collusion.
They failed to find any because Manafort’s indictment, in essence, involved his efforts to launder his profits from his international political work, not his work for the Trump campaign. Fortunately for the obsessives, another element is required for a good scandal: a colorful cast. The various Clinton scandals brought us Asian money-launderers and ChiCom bankers, along with an entire Faulkner-novel’s worth of bumpkins, sharpies, and backwoods swindlers, plus that intern in the thong. Watergate, the mother lode of Washington scandals, featured a host of implausible characters, from the central-casting villain G. Gordon Liddy to Sam Ervin, a lifelong segregationist and racist who became a hero to liberals everywhere.
Here, at last, is one area where the Russian scandal has begun to show promise. Manafort and his business partner seem too banal to hold the interest of anyone but a scandal obsessive. Beneath the pile of paper Mueller dumped on them, however, another creature could be seen peeking out shyly. This would be the diminutive figure of George Papadopoulos. An unpaid campaign adviser to Trump, Papadopoulos pled guilty to lying to the FBI about the timing of his conversations with Russian agents. He is quickly becoming the stuff of legend.
Papadopoulos is an exemplar of a type long known to American politics. He is the nebbish bedazzled by the big time—achingly ambitious, though lacking the skill, or the cunning, to climb the greasy pole. So he remains at the periphery of the action, ever eager to serve. Papadopoulos’s résumé, for a man under 30, is impressively padded. He said he served as the U.S. representative to the Model United Nations in 2012, though nobody recalls seeing him there. He boasted of a four-year career at the Hudson Institute, though in fact he spent one year there as an unpaid intern and three doing contract research for one of Hudson’s scholars. On his LinkedIn page, he listed himself as a keynote speaker at a Greek American conference in 2008, but in fact he participated only in a panel discussion. The real keynoter was Michael Dukakis.
With this hunger for achievement, real or imagined, Papadopoulos could not let a presidential campaign go by without climbing aboard. In late 2015, he somehow attached himself to Ben Carson’s campaign. He was never paid and lasted four months. His presence went largely unnoticed. “If there was any work product, I never saw it,” Carson’s campaign manager told Time. The deputy campaign manager couldn’t even recall his name. Then suddenly, in April 2016, Papadopoulos appeared on a list of “foreign-policy advisers” to Donald Trump—and, according to Mueller’s court filings, resolved to make his mark by acting as a liaison between Trump’s campaign and the Russian government.
While Mueller tells the story of Papadopoulos’s adventures in the dry, Joe Friday prose of a legal document, it could easily be the script for a Peter Sellers movie from the Cold War era. The young man’s résumé is enough to impress the campaign’s impressionable officials as they scavenge for foreign-policy advisers: “Hey, Corey! This dude was in the Model United Nations!”
Papadopoulos (played by Sellers) sets about his mission. A few weeks after signing on to the campaign, he travels to Europe, where he meets a mysterious “Professor” (Peter Ustinov). “Initially the Professor seemed uninterested in Papadopoulos,” says Mueller’s indictment. A likely story! Yet when Papadopoulos lets drop that he’s an adviser to Trump, the Professor suddenly “appeared to take great interest” in him. They arrange a meeting in London to which the Professor invites a “female Russian national” (Elke Sommer). Without much effort, the femme fatale convinces Papadopoulos that she is Vladimir Putin’s niece. (“I weel tell z’American I em niece of Great Leader! Zat idjut belief ennytink!”) Over the next several months our hero sends many emails to campaign officials and to the Professor, trying to arrange a meeting between them. As far as we know from the indictment, nothing came of his mighty efforts.
And there matters lay until January 2017, when the FBI came calling. Agents asked Papadopoulos about his interactions with the Russians. Even though he must have known that hundreds of his emails on the subject would soon be available to the FBI, he lied and told the agents that the contacts had occurred many months before he joined the campaign. History will record Papadopoulos as the man who forgot that emails carry dates on them. After the FBI interview, according to the indictment, he tried to destroy evidence with the same competence he has brought to his other endeavors. He closed his Facebook account, on which several communications with the Russians had taken place. He threw out his old cellphone. (That should do it!) After that, he began wearing a blindfold, on the theory that if he couldn’t see the FBI, the FBI couldn’t see him.
I made that last one up, obviously. For now, the great hope of scandal hobbyists is that Papadopoulos was wearing a wire between the time he secretly pled guilty and the time his plea was made public. This would have allowed him to gather all kinds of incriminating dirt in conversations with former colleagues. And the dirt is there, all right, as the Manafort indictment proves. Unfortunately for our scandal fetishists, so far none of it shows what their hearts most desire: active collusion between Russia and the Trump campaign.
An Affair to Remember
All this changed with the release in 1967 of Arthur Penn’s Bonnie and Clyde and Mike Nichols’s The Graduate. These two films, made in nouveau European style, treated familiar subjects—a pair of Depression-era bank robbers and a college graduate in search of a place in the adult world—in an unmistakably modern manner. Both films were commercial successes that catapulted their makers and stars into the top echelon of what came to be known as “the new Hollywood.”
Bonnie and Clyde inaugurated a new era in which violence on screen simultaneously became bloodier and more aestheticized, and it has had enduring impact as a result. But it was The Graduate that altered the direction of American moviemaking with its specific appeal to younger and hipper moviegoers who had turned their backs on more traditional cinematic fare. When it opened in New York in December, the movie critic Hollis Alpert reported with bemusement that young people were lining up in below-freezing weather to see it, and that they showed no signs of being dismayed by the cold: “It was as though they all knew they were going to see something good, something made for them.”
The Graduate, whose aimless post-collegiate title character is seduced by the glamorous but neurotic wife of his father’s business partner, is part of the common stock of American reference. Now, a half-century later, it has become the subject of a book-length study, Beverly Gray’s Seduced by Mrs. Robinson: How The Graduate Became the Touchstone of a Generation.1 As is so often the case with pop-culture books, Seduced by Mrs. Robinson is almost as much about its self-absorbed Baby Boomer author (“The Graduate taught me to dance to the beat of my own drums”) as its subject. It has the further disadvantage of following in the footsteps of Mark Harris’s magisterial Pictures at a Revolution: Five Movies and the Birth of the New Hollywood (2008), in which the film is placed in the context of Hollywood’s mid-’60s cultural flux. But Gray’s book offers us a chance to revisit this seminal motion picture and consider just why it was that The Graduate spoke to Baby Boomers in a distinctively personal way.

The Graduate began life in 1963 as a novella of the same name by Charles Webb, a California-born writer who saw his book not as a comic novel but as a serious artistic statement about America’s increasingly disaffected youth. It found its way into the hands of a producer named Lawrence Turman, who saw The Graduate as an opportunity to make the cinematic equivalent of Salinger’s The Catcher in the Rye. Turman optioned the book, then sent it to Mike Nichols, who in 1963 was still best known for his comic partnership with Elaine May but had just made his directorial debut with the original Broadway production of Barefoot in the Park.
Both men saw that The Graduate posed a problem to anyone seeking to put it on the screen. In Turman’s words, “In the book the character of Benjamin Braddock is sort of a whiny pain in the fanny [whom] you want to shake or spank.” To solve that problem, they turned to Buck Henry, who had co-created the popular TV comedy Get Smart with Mel Brooks, to write a screenplay that would retain much of Webb’s dryly witty dialogue (“I think you’re the most attractive of all my parents’ friends”) while making Benjamin less priggish.
Nichols’s first major act was casting Dustin Hoffman, an obscure New York stage actor pushing 30, for the title role. No one but Nichols seems to have thought him suitable in any way. Not only was Hoffman short and nondescript-looking, but he was unmistakably Jewish, whereas Benjamin is supposedly the scion of a newly monied WASP family from southern California. Nevertheless, Nichols decided he wanted “a short, dark, Jewish, anomalous presence, which is how I experience myself,” in order to underline Benjamin’s alienation from the world of his parents.
Nichols filled the other roles in equally unexpected ways. He hired the Oscar winner Anne Bancroft, only six years Hoffman’s senior, to play the unbalanced temptress who lures Benjamin into her bed, then responds with volcanic rage when he falls in love with her beautiful daughter Elaine. He and Henry also steered clear of on-screen references to the campus protests that had only recently started to convulse America. Instead, he set The Graduate in a timeless upper-middle-class milieu inhabited by people more interested in social climbing than self-actualization—the same milieu from which Benjamin is so alienated that he is reduced to near-speechlessness whenever his family and their friends ask him what he plans to do now that he has graduated.
The film’s only explicit allusion to its cultural moment is the use on the soundtrack of Simon & Garfunkel’s “The Sound of Silence,” the painfully earnest anthem of youthful angst that is for all intents and purposes the theme song of The Graduate. Nevertheless, Henry’s screenplay leaves little doubt that the film was in every way a work of its time and place. As he later explained to Mark Harris, it is a study of “the disaffection of young people for an environment that they don’t seem to be in sync with.…Nobody had made a film specifically about that.”
This aspect of The Graduate is made explicit in a speech by Benjamin that has no direct counterpart in the novel: “It’s like I was playing some kind of game, but the rules don’t make any sense to me. They’re being made up by all the wrong people. I mean, no one makes them up. They seem to make themselves up.”
The Graduate was Nichols’s second film, following his wildly successful movie version of Edward Albee’s Who’s Afraid of Virginia Woolf? Albee’s play was a snarling critique of the American dream, which he believed to be a snare and a delusion. The Graduate had the same skeptical view of postwar America, but its pessimism was played for laughs. When Benjamin is assured by a businessman in the opening scene that the secret to success in America is “plastics,” we are meant to laugh contemptuously at the smugness of so blinkered a view of life. Moreover, the contempt is as real as the laughter: The Graduate has it both ways. For the same reason, the farcical quality of the climactic scene (in which Benjamin breaks up Elaine’s marriage to a handsome young WASP and carts her off to an unknown fate) is played without musical underscoring, a signal that what Benjamin is doing is really no laughing matter.
The youth-oriented message of The Graduate came through loud and clear to its intended audience, which paid no heed to the mixed reviews from middle-aged reviewers unable to grasp what Nichols and Henry were up to. Not so Roger Ebert, the newly appointed 25-year-old movie critic of the Chicago Sun-Times, who called The Graduate “the funniest American comedy of the year…because it has a point of view. That is to say, it is against something.”
Even more revealing was the response of David Brinkley, then the co-anchor of NBC’s nightly newscast, who dismissed The Graduate as “frantic nonsense” but added that his college-age son and his classmates “liked it because it said about the parents and others what they would have said about us if they had made the movie—that we are self-centered and materialistic, that we are licentious and deeply hypocritical about it, that we try to make them into walking advertisements for our own affluence.”
A year after the release of The Graduate, a film-industry report cited in Pictures at a Revolution revealed that “48 percent of all movie tickets in America were now being sold to filmgoers under the age of 24.” A very high percentage of those tickets were to The Graduate and Bonnie and Clyde. At long last, Hollywood had figured out what the Baby Boomers wanted to see.

And how does The Graduate look a half-century later? To begin with, it now appears to have been Mike Nichols’s creative “road not taken.” In later years, Nichols became less an auteur than a Hollywood director who thought like a Broadway director, choosing vehicles of solid middlebrow-liberal appeal and serving them faithfully without imposing a strong creative vision of his own. In The Graduate, by contrast, he revealed himself to be powerfully aware of the same European filmmaking trends that shaped Bonnie and Clyde. Within a naturalistic framework, he deployed non-naturalistic “new wave” cinematographic techniques with prodigious assurance—and he was willing to end The Graduate on an ambiguous note instead of wrapping it up neatly and pleasingly, letting the camera linger on the unsure faces of Hoffman and Ross as they ride off into an unsettling future.
It is this ambiguity, coupled with Nichols’s prescient decision not to allow The Graduate to become a literal portrayal of American campus life in the troubled mid-’60s, that has kept the film fresh. But The Graduate is fresh in a very particular way: It is a young person’s movie, the tale of a boy-man terrified by the prospect of growing up to be like his parents. Therein lay the source of its appeal to young audiences. The Graduate showed them what they, too, feared most, and hinted at a possible escape route.
In the words of Beverly Gray, who saw The Graduate when it first came out in 1967: “The Graduate appeared in movie houses just as we young Americans were discovering how badly we wanted to distance ourselves from the world of our parents….That polite young high achiever, those loving but smothering parents, those comfortable but slightly bland surroundings: They combined to form an only slightly exaggerated version of my own cozy West L.A. world.”
Yet to watch The Graduate today—especially if you first saw it when much younger—is also to be struck by the extreme unattractiveness of its central character. Hoffman plays Benjamin not as the comically ineffectual nebbish of Jewish tradition but as a near-catatonic robot who speaks by turns in a flat monotone and a frightened nasal whine. It is impossible to understand why Mrs. Robinson would want to go to bed with such a mousy creature, much less why Elaine would run off with him—an impression that has lately acquired an overlay of retrospective irony in the wake of accusations that Hoffman has sexually harassed female colleagues on more than one occasion. Precisely because Benjamin is so unlikable, it is harder for modern-day viewers to identify with him in the same way as did Gray and her fellow Boomers. To watch a Graduate-influenced film like Noah Baumbach’s Kicking and Screaming (1995), a poignant romantic comedy about a group of Gen-X college graduates who deliberately choose not to get on with their lives, is to see a closely similar dilemma dramatized in an infinitely more “relatable” way, one in which the crippling anxiety of the principal characters is presented as both understandable and pitiable, thus making it funnier.
Be that as it may, The Graduate is a still-vivid snapshot of a turning point in American cultural history. Before Benjamin Braddock, American films typically portrayed men who were not overgrown, smooth-faced children but full-grown adults, sometimes misguided but incontestably mature. After him, permanent immaturity became the default position of Hollywood-style masculinity.
For this reason, it will be interesting to see what the Millennials, so many of whom demand to be shielded from the “triggering” realities of adult life, make of The Graduate if and when they come to view it. I have a feeling that it will speak to a fair number of them far more persuasively than it did to those of us who—unlike Benjamin Braddock—longed when young to climb the high hill of adulthood and see for ourselves what awaited us on the far side.
1 Algonquin, 278 pages
“I think that’s best left to states and locales to decide,” DeVos replied. “If the underlying question is . . .”
Murphy interrupted. “You can’t say definitively today that guns shouldn’t be in schools?”
“Well, I will refer back to Senator Enzi and the school that he was talking about in Wapiti, Wyoming, I think probably there, I would imagine that there’s probably a gun in the school to protect from potential grizzlies.”
Murphy continued his line of questioning unfazed. “If President Trump moves forward with his plan to ban gun-free school zones, will you support that proposal?”
“I will support what the president-elect does,” DeVos replied. “But, senator, if the question is around gun violence and the results of that, please know that my heart bleeds and is broken for those families that have lost any individual due to gun violence.”
Because all this happened several million outrage cycles ago, you may have forgotten what happened next. Rather than mention DeVos’s sympathy for the victims of gun violence, or her support for federalism, or even her deference to the president, the media elite fixated on her hypothetical aside about grizzly bears.
“Betsy DeVos Cites Grizzly Bears During Guns-in-Schools Debate,” read the NBC News headline. “Citing grizzlies, education nominee says states should determine school gun policies,” reported CNN. “Sorry, Betsy DeVos,” read a headline at the Atlantic, “Guns Aren’t a Bear Necessity in Schools.”
DeVos never said that they were, of course. Nor did she “cite” the bear threat in any definitive way. What she did was decline the opportunity to make a blanket judgment about guns and schools because, in a continent-spanning nation of more than 300 million people, one standard might not apply to every circumstance.
After all, there might be—there are—cases when guns are necessary for security. Earlier this year, Virginia Governor Terry McAuliffe signed into law a bill authorizing some retired police officers to carry firearms while working as school guards. McAuliffe is a Democrat.
In her answer to Murphy, DeVos referred to a private meeting with Senator Enzi, who had told her of a school in Wyoming that has a fence to keep away grizzly bears. And maybe, she reasoned aloud, the school might have a gun on the premises in case the fence doesn’t work.
As it turns out, the school in Wapiti is gun-free. But we know that only because the Washington Post treated DeVos’s offhand remark as though it were the equivalent of Alexander Butterfield’s revealing the existence of the secret White House tapes. “Betsy DeVos said there’s probably a gun at a Wyoming school to ward off grizzlies,” read the Post headline. “There isn’t.” Oh, snap!
The article, like the one by NBC News, ended with a snarky tweet. The Post quoted user “Adam B.,” who wrote, “‘We need guns in schools because of grizzly bears.’ You know what else stops bears? Doors.” Clever.
And telling. It becomes more difficult every day to distinguish between once-storied journalistic institutions and the jabbering of anonymous egg-avatar Twitter accounts. The eagerness with which the press misinterprets and misconstrues Trump officials is something to behold. The “context” the best and brightest in media are always eager to provide us suddenly goes poof when the opportunity arises to mock, impugn, or castigate the president and his crew. This tendency is especially pronounced when the alleged gaffe fits neatly into a prefabricated media stereotype: that DeVos is unqualified, say, or that Rick Perry is, well, Rick Perry.
On November 2, the secretary of energy appeared at an event sponsored by Axios.com and NBC News. He described a recent trip to Africa:
It’s going to take fossil fuels to push power out to those villages in Africa, where a young girl told me to my face, “One of the reasons that electricity is so important to me is not only because I won’t have to try to read by the light of a fire, and have those fumes literally killing people, but also from the standpoint of sexual assault.” When the lights are on, when you have light, it shines the righteousness, if you will, on those types of acts. So from the standpoint of how you really affect people’s lives, fossil fuels is going to play a role in that.
This heartfelt story of the impact of electrification on rural communities was immediately distorted into a metaphor for Republican ignorance and cruelty.
“Energy Secretary Rick Perry Just Made a Bizarre Claim About Sexual Assault and Fossil Fuels,” read the Buzzfeed headline. “Energy Secretary Rick Perry Says Fossil Fuels Can Prevent Sexual Assault,” read the headline from NBC News. “Rick Perry Says the Best Way to Prevent Rape Is Oil, Glorious Oil,” said the Daily Beast.
“Oh, that Rick Perry,” wrote Gail Collins in a New York Times column. “Whenever the word ‘oil’ is mentioned, Perry responds like a dog on the scent of a hamburger.” You will note that the word “oil” is not mentioned at all in Perry’s remarks.
You will note, too, that what Perry said was entirely commonsensical. While the precise relation between public lighting and public safety is unknown, who can doubt that brightly lit areas feel safer than dark ones—and that, as things stand today, cities and towns are most likely to be powered by fossil fuels? “The value of bright street lights for dispirited gray areas rises from the reassurance they offer to some people who need to go out on the sidewalk, or would like to, but lacking the good light would not do so,” wrote Jane Jacobs in The Death and Life of Great American Cities. “Thus the lights induce these people to contribute their own eyes to the upkeep of the street.” But c’mon, what did Jane Jacobs know?
No member of the Trump administration so rankles the press as the president himself. On the November morning I began this column, I awoke to outrage that President Trump had supposedly violated diplomatic protocol while visiting Japan and its prime minister, Shinzo Abe. “President Trump feeds fish, winds up pouring entire box of food into koi pond,” read the CNN headline. An article on CBSNews.com headlined “Trump empties box of fish food into Japanese koi pond” began: “President Donald Trump’s visit to Japan briefly took a turn from formal to fishy.” A Bloomberg reporter traveling with the president tweeted, “Trump and Abe spooning fish food into a pond. (Toward the end, @potus decided to just dump the whole box in for the fish).”
Except that’s not what Trump “decided.” In fact, Trump had done exactly what Abe had done a few seconds before. That fact was buried in write-ups of the viral video of Trump and the fish. “President Trump was criticized for throwing an entire box of fish food into a koi pond during his visit to Japan,” read a tweet from the New York Daily News, linking to a report on phony criticism Trump received because of erroneous reporting from outlets like the News.
There’s an endless, circular, Möbius-strip-like quality to all this nonsense. Journalists are so eager to catch the president and his subordinates doing wrong that they routinely traduce the very canons of journalism they are supposed to hold dear. Partisan and personal animus, laziness, cynicism, and the oversharing culture of social media are a toxic mix. The press in 2017 is a lot like those Japanese koi fish: frenzied, overstimulated, and utterly mindless.
Review of 'Lessons in Hope' by George Weigel
Standing before the eternal flame, a frail John Paul shed silent tears for 6 million victims, including some of his own childhood friends from Krakow. Then, after reciting verses from Psalm 31, he began: “In this place of memories, the mind and heart and soul feel an extreme need for silence. … Silence, because there are no words strong enough to deplore the terrible tragedy of the Shoah.” Parkinson’s disease strained his voice, but it was clear that the pope’s irrepressible humanity and spiritual strength had once more stood him in good stead.
George Weigel watched the address from NBC’s Jerusalem studios, where he was providing live analysis for the network. As he recalls in Lessons in Hope, his touching and insightful memoir of his time as the pope’s biographer, “Our newsroom felt the impact of those words, spoken with the weight of history bearing down on John Paul and all who heard him: normally a place of bedlam, the newsroom fell completely silent.” The pope, he writes, had “invited the world to look, hard, at the stuff of its redemption.”
Weigel, a senior fellow at the Ethics and Public Policy Center, published his biography of John Paul in two volumes, Witness to Hope (1999) and The End and the Beginning (2010). His new book completes a John Paul triptych, and it paints a more informal, behind-the-scenes portrait. Readers, Catholic and otherwise, will finish the book feeling almost as though they knew the 264th successor of Peter. Lessons in Hope is also full of clerical gossip. Yet Weigel never loses sight of his main purpose: to illuminate the character and mind of the “emblematic figure of the second half of the twentieth century.”
The book’s most important contribution comes in its restatement of John Paul’s profound political thought at a time when it is sorely needed. Throughout, Weigel reminds us of the pope’s defense of the freedom of conscience; his emphasis on culture as the primary engine of history; and his strong support for democracy and the free economy.
When the Soviet Union collapsed, the pope continued to promote these ideas in such encyclicals as Centesimus Annus. The 1991 document reiterated the Church’s opposition to socialist regimes that reduce man to “a molecule within the social organism” and trample his right to earn “a living through his own initiative.” Centesimus Annus also took aim at welfare states for usurping the role of civil society and draining “human energies.” The pope went on to explain the benefits, material and moral, of free enterprise within a democratic, rule-of-law framework.
Yet a libertarian manifesto Centesimus Annus was not. It took note of free societies’ tendency to breed spiritual poverty, materialism, and social incohesion, which in turn could lead to soft totalitarianism. John Paul called on state, civil society, and people of God to supply the “robust public moral culture” (in Weigel’s words) that would curb these excesses and ensure that free-market democracies are ordered to the common good.
When Weigel emerged as America’s preeminent interpreter of John Paul, in the 1980s and ’90s, these ideas were ascendant among Catholic thinkers. In addition to Weigel, proponents included the philosopher Michael Novak and Father Richard John Neuhaus of First Things magazine (both now dead). These were faithful Catholics (in Neuhaus’s case, a relatively late convert) nevertheless at peace with the free society, especially the American model. They had many qualms about secular modernity, to be sure. But for them, there was no question that free societies and markets are preferable to unfree ones.
How things have changed. Today all the energy in those Catholic intellectual circles is generated by writers and thinkers who see modernity as beyond redemption and freedom itself as the problem. For them, the main question is no longer how to correct the free society’s course (by shoring up moral foundations, through evangelization, etc.). That ship has sailed or perhaps sunk, according to this view. The challenges now are to protect the Church against progressivism’s blows and to see beyond the free society as a political horizon.
Certainly the trends that worried John Paul in Centesimus Annus have accelerated since the encyclical was issued. “The claim that agnosticism and skeptical relativism are the philosophy and the basic attitude which correspond to democratic forms of political life” has become even more hegemonic than it was in 1991. “Those who are convinced that they know the truth and firmly adhere to it” increasingly get treated as ideological lepers. And with the weakening of transcendent truths, ideas are “easily manipulated for reasons of power.”
Thus a once-orthodox believer finds himself or herself compelled to proclaim that there is no biological basis to gender; that men can menstruate and become pregnant; that there are dozens of family forms, all as valuable and deserving of recognition as the conjugal union of a man and a woman; and that speaking of the West’s Judeo-Christian patrimony is tantamount to espousing white supremacy. John Paul’s warnings read like a description of the present.
The new illiberal Catholics—a label many of these thinkers embrace—argue that these developments aren’t a distortion of the idea of the free society but represent its very essence. This is a mistake. Basic to the free society is the freedom of conscience, a principle enshrined in democratic constitutions across the West and, I might add, in the Catholic Church’s post–Vatican II magisterium. Under John Paul, religious liberty became Rome’s watchword in the fight against Communist totalitarianism, and today it is the Church’s best weapon against the encroachments of secular progressivism. The battle is far from lost, moreover. There is pushback in the courts, at the ballot box, and online. Sometimes it takes demagogic forms that should discomfit people of faith. Then again, there is a reason such pushback is called “reaction.”
A bigger challenge for Catholics prepared to part ways with the free society as an ideal is this: What should Christian politics stand for in the 21st century? Setting aside dreams of reuniting throne and altar and similar nostalgia, the most cogent answer offered by Catholic illiberalism is that the Church should be agnostic with respect to regimes. As Harvard’s Adrian Vermeule has recently written, Christians should be ready to jettison all “ultimate allegiances,” including to the Constitution, while allying with any party or regime when necessary.
What at first glance looks like an uncompromising Christian politics—cunning, tactical, and committed to nothing but the interests of the Church—is actually a rather passive vision. For a Christianity that is “radically flexible” in politics is one that doesn’t transform modernity from within. In practice, it could easily look like the Vatican Ostpolitik diplomacy that sought to appease Moscow before John Paul was elected.
Karol Wojtyła discarded Ostpolitik as soon as he took the Petrine office. Instead, he preached freedom and democracy—and meant it. Already as archbishop of Krakow under Communism, he had created free spaces where religious and nonreligious dissidents could engage in dialogue. As pope, he expressed genuine admiration for the classically liberal and decidedly secular Vaclav Havel. He hailed the U.S. Constitution as the source of “ordered freedom.” And when, in 1987, the Chilean dictator Augusto Pinochet asked him why he kept fussing about democracy, seeing as “one system of government is as good as another,” the pope responded: No, “the people have a right to their liberties, even if they make mistakes in exercising them.”
The most heroic and politically effective Christian figure of the 20th century, in other words, didn’t follow the path of radical flexibility. His Polish experience had taught him that there are differences between regimes—that some are bound to uphold conscience and human dignity, even if they sometimes fall short of these commitments, while others trample rights by design. The very worst of the latter kind could even whisk one’s boyhood friends away to extermination camps. There could be no radical Christian flexibility after the Holocaust.