In the 20th century, the ordered universe broke apart into atoms, and man found himself alone. William Butler Yeats expressed the new sense of breakdown in a famous phrase, “the center cannot hold” (from his poem “The Second Coming”); it is at the heart as well of T.S. Eliot’s “The Waste Land,” surely the best known modern poem, with its no less famous ending, “These fragments I have shored against my ruins.” The American novelist Walker Percy, alluding to Yeats, recently summed up this conception, which has dominated the modern imagination right up to the present, in the following words:
I think it fair to begin with the assumption, which seems fairly obvious, that, as the poet said, the center is not holding; that the consensus, while it might not have disappeared, is at least seriously called into question. Indeed, to judge from a good many contemporary novels, films, and plays, it often appears that the only consensus possible is a documentation of the fragmentation.
The lost consensus to which Walker Percy refers is one which stretched back, changes and permutations notwithstanding, for centuries. The Renaissance scholar E.M.W. Tillyard, in The Elizabethan World Picture (1943), describes the form it took in Shakespeare’s time. As epitomized in Ulysses’s speech on degree in Troilus and Cressida, a great chain of being was imagined, extending vertically in logical, hierarchical steps from the highest to the lowest spheres of the universe. On the earth, rulers held sway over their subjects, fathers over their families, and man dominated the lower animals. The essential order of things was expressed in the harmonies of music and the rhythm and rhyme of poetry.
It was precisely the loss of this sense of order and authority that Eliot and Yeats mourned, a loss that found formal expression in the early 20th century in musical disharmony and free verse. Yet if the 20th century as a whole has been characterized by the conviction that we are living in a fragmented universe, the truth is that a coherent and ordered picture of the world has nevertheless gradually emerged and in the last decades has taken on definable form. The new picture is, in fact, modeled exactly on the old idea of an ordered hierarchical universe, except that the terms have been systematically reversed. Where once man was at the top of this order—only a little lower than the angels—now he is somewhere near the bottom; and where once his authority was taken to be divinely inspired and ordained, now it has come to be seen as a kind of usurpation.
The traditional justification for man’s dominion over the earth, namely his superiority over other creatures, has now been challenged. After all, it is argued, animals, too, communicate and reason, they have emotions, and what is more they are on the whole less destructive than man. And as with animals, so with primitive man and with societies less developed than our own: both are closer to the sources of natural wisdom, and both wreak less damage upon the ecosystem and biosphere than does Western man.
By analogous reasoning, the earliest human societies, with their pre-capitalist forms of organization, are seen as the best. Here, unlike in earlier attempts to celebrate the past over the present, which tended to favor the high achievements of classical civilization, the contemporary world view elevates the simplicities of past times. As for the future, that ultimate expression of the present, it is seen in the new perspective not as a hoped-for culmination but as a looming epoch of decline, a nightmare of overpopulation, cancer, famine, ecocide, nuclear winter.
The work of hierarchy reversal proceeds on many different fronts, and is undeterred even by such seeming obstacles as the Darwinian theory of biological adaptation, with its suggestion of progress upward to an apex that is man. “From a strict evolutionary point of view,” writes one contemporary biologist, Bernard Rollin,
there is no top of the ladder, there is only a branching tree. If there is a “top,” it has to do with adaptability and species durability and reproductive success. In that case, the rat shares billing with us and we both lose to the cockroach.
The unfavorable comparison of man with cockroach has become a familiar staple of biological discussion—although Stephen Jay Gould is ready to take a stunning step even further down the scale:
Evolution is a copiously branching network, not a ladder, and I do not see how we, the titular spokesmen for a few thousand mammalian species, can claim superiority over three quarters of a million species of insects who will surely outlive us all, not to mention the bacteria, who have shown remarkable staying power for more than three billion years.
Here we have very nearly the ultimate demotion of man, the inferior not only of primitive peoples, other mammals, and the cockroach, but even of bacteria.
But there is still more—or, rather, less. It has remained for contemporary poetry to accomplish a fully imagined reduction of man beneath the status of cockroaches and bacteria. In his 1978 survey, “How to Read the Contemporary Poem,” the critic Paul Breslin points out that “the most popular key word in the new poetry” is “stone,” standing for the “furthest things from the human—the least conscious, the simplest, the most innocent.” What stones represent in the contemporary world picture may be gathered from an interview, cited by Breslin, with the Pulitzer prize-winning poet Galway Kinnell. “If you could go even deeper” on the contemporary scale, says Kinnell,
you’d not be a person, you’d be an animal; and if you went deeper still, you’d be a blade of grass, eventually a stone. If a stone could speak, your poem would be its words.
Beyond this, surely, it is impossible to go, though if one considers the matter carefully enough, it is evident that there must be a reverse hierarchy among stones themselves, from the least valuable (diamonds, surely) to the most (perhaps the friable, vulnerable sandstone).
The strength of a world picture may be said to lie in its capacity to sustain itself in the face of contradictions. Thus, some lines written by John Donne a few years after the death of Queen Elizabeth indicate how it must have felt when the world picture of the time came under threat. Alluding to the philosophical repercussions of the idea that the universe might not be earth-centered, Donne observed that the
. . . new Philosophy calls all in doubt
. . . . . . .
‘Tis all in peeces, all cohaerence gone;
All just supply, and all Relation:
Prince, Subject, Father, Sonne, are things forgot.
Something like the confusion registered by Donne manifests itself (though not always with Donne’s frankness) whenever our own contemporary world picture is threatened.
Of course, where post-Elizabethans were disturbed by evidence that the universe and the earth had not evolved in a harmonious fashion, those who hold the new view feel threatened by evidence of the opposite kind, and specifically evidence militating against the theory that the universe originated in a giant explosion—a big bang. This theory, suggesting chance and randomness as the underlying principles of the universe, supplies nothing that can be taken to ennoble man, and therefore offers satisfactions that are fully the equivalents of those the earth-centered universe of Ptolemy offered to the medieval and early-Renaissance minds. When evidence appears that would seem to contradict randomness—and in some fields, such evidence has been accumulating—it is routinely dismissed. In this respect, at least, the contemporary world picture is as efficacious as any that preceded it.
When it comes to geological history, chance and discontinuity have been wrung from the record with missionary zeal. Starting in the early 1970’s, the mysterious disappearance of the dinosaurs came to be explained by a theory with negative implications for man in two ways. Purportedly, the dust from the crash of a meteorite, a sort of earthly big bang, blocked out the sun for a number of years, thereby depriving the dinosaurs of food. It could thus be concluded that man’s survival as a species was due not so much to his adaptability as to the chance escape from catastrophe of his mammalian predecessors. At the same time, the prehistoric catastrophe served as a warning: through a similar blockage of the sun—this time in a nuclear winter brought on by the use of atomic weapons—man could bring about his self-destruction.
In 1985, however, paleontologists marshaled geological evidence that the extinction of the dinosaurs occurred over too long a period of time to be reconciled with a scenario of catastrophe. Then it emerged that most paleontologists—some 96 percent—had never credited the catastrophe theory in the first place, but had felt constrained to remain silent in the face of their opponents’ moral fervor concerning the threat of nuclear war. To such an extent can the need to adhere to a world picture compromise scientific objectivity.
That the demeaning of man in the new world picture rests on a rather tenuous basis in observation has been repeatedly demonstrated by anthropological and ethnographic contretemps similar in their outlines to the fate of dinosaur catastrophism. The deciphering of the ancient Mayan hieroglyphs, for example, has made it possible to describe the life of that people, and incidentally to interpret the symbolic drawings accompanying their hieroglyphic script. The drawings turn out to depict a range of cruel tortures and self-lacerations. “Before going to war, for example,” says a report in the New York Times, “the king would puncture his penis with a stingray spine, while his wife drew a thorn-barbed rope through her tongue.” The Mayans were frequently at war,
in large part to capture aristocrats for torture and sacrifice. If the Maya sacrificed humans in lesser numbers than the Aztecs, against whom they have often been held up as superior, they tortured their victims more viciously. Ancient ball games, like Roman gladiatorial contests, pitted captives against one another for their lives; the heads of losers were sometimes used for balls.
These discoveries are of interest both in themselves and for the light they throw on the scholars who have been surprised by them. As the Times reports, “Evidence of these darker practices has been available for decades in the Maya’s stone reliefs and paintings,” but scholars had explained it all away, interpreting the reliefs and paintings in such a manner as to place the Maya “on a mist-shrouded pedestal as austere, peaceful. . . students of the stars and the calendar.” The pictures, in other words, functioned as virtual Rorschach tests, onto which their interpreters projected a collective wish to find in Meso-American prehistory an indigenous people at once gentle and mystical and accomplished in science and the arts.
Now that a rather different portrait of the Maya has emerged, a scramble is occurring to keep the bigger world picture in place. If, as the Times reports, the “new image” of the Mayans “is less romantic, it is also more human, scholars are quick to assert.” Thus Professor David Friedel, an anthropologist at Southern Methodist University, believes that the self-laceration practiced by kings in advance of going to war in search of torture victims “indicates a cooperative, sacred relationship between the elites and the commoners.” In other words, if the evidence shows a society’s aristocrats obsessed with self-mutilation and torture, a bit of interpretation will help us see beneath the surface to the class solidarity so characteristic of pre-Columbian America and so lamentably missing from the modern world.
To the north of the Mayans, the American Indians inhabiting what is now the United States fit into the new world picture as well—not so much exemplars of a superior civilization as object lessons demonstrating the defects of our own. The displacing of the Indians has long been a source of shame to enlightened consciences; modern historical accounts commonly include a broad indictment of the early American colonists and their descendants on this score. In the latest perspective, the actions of the colonists are often cited as examples of “exploitation” and “commodity capitalism,” while Indians are portrayed as having lived in benign relationship with the land until introduced to anti-ecological practices by the whites. Such is the thesis of William Cronon’s Changes in the Land, a recent, much-praised ecological history of colonial New England that has won the Francis Parkman prize of the Society of American Historians.
Yet here, too, the evidence has proved to be shaky. At least one scholarly reviewer of Cronon’s book felt bound to point to “a body of evidence” showing that colonial agricultural practices did not rape and deplete the land, as Cronon has it, but rather were “carried on without plows, on fields that were fallowed far more frequently than he concedes.” As for the Indians, “they burned the woods, sometimes usefully but at other times more harmfully, and before moving on registered a heavy impact on the land on which their fields and villages were situated.”
More interesting than this refutation, however, is the spirit in which it is presented. For the reviewer judges the book as a whole to be one “of impressive originality” and “penetrating scholarship,” and he is at pains to assert that the factual refutations in his review in no way damage the book’s thesis. After all, the “ultimate effect” of American settlement “was very much as Cronon has so brilliantly and provocatively described it.” Thus, it does not matter whether an author is right or wrong concerning the period he has chosen for study, so long as he holds the right view of modern civilization.
The effects of a scholarly orthodoxy on those who dissent from it can be seen in the case of Alden Vaughan’s 1965 book, New England Frontier: Puritans and Indians 1620-1675. Even before Vaughan’s book appeared, a consensus had grown according to which, in Vaughan’s words, the Puritans “appropriated the natives’ land, bargained unscrupulously for their furs, and abused individual Indians with impunity.” But Vaughan, starting out in perfect accord with this view, and wanting only to offer a more detailed account than any that existed for the earliest period of New England settlement, was surprised to find that “the evidence suggested a more humane and equitable treatment of the natives.” The Puritan colonists had “sought peaceful and equitable methods of acquiring land and furs, administering justice, and recruiting converts”: they had tried “to deal justly and peacefully with their Indian neighbors.”
Although Vaughan’s book was favorably received on its initial appearance, it soon enough came under attack as its conclusions were increasingly seen to violate the orthodoxy of the time. In one disputed instance, that of the war between the Puritans and the Pequots, Vaughan’s critics found not just that he was wrong in assigning blame to the Pequots but that the war was an act of Puritan “genocide.” No matter that the Pequot tribe was not annihilated. No matter that here and elsewhere the historical record failed to support the charge of genocide. The Puritans, the new scholars declared, must have destroyed the relevant documents.
In the face of this assault, Vaughan was forced into retreat. In 1979 he brought out a revised version of his book in which he muted both his defense of the Puritan settlers and his exposures of Indian atrocities. Vaughan writes in his 1979 preface that on reflection he had overstated the case for the Puritans. And as for the Pequot war, “I am less sure than I was fifteen years ago that the Pequots deserve the burden of blame.” Still, a careful reader will note that Vaughan never really yields on the substantive issues, and in his footnotes he makes it clear that the new historians have built their case on a misuse of scholarship. In the end, Puritan “genocide” turns out to be another catastrophe that never occurred—but also another scholarly article of faith to doubt which can be damaging to one’s reputation.
Doubt of any kind has hardly affected the new world picture of primitive peoples and their virtues. In the early 1970’s, for example, two primitive societies were discovered and presented to the world: the gentle Tasaday of the Philippines and the benighted Ik of Uganda. Both were much exploited in the service of demonstrating the degeneracy of civilization.
In 1971 a dozen natives and their children were discovered living in the rain forest on the island of Mindanao in the Philippines. In The Gentle Tasaday, John Nance, the reporter who first publicized the discovery, wrote in personal terms of the lesson these near-naked people teach to civilized man. As against “the popular image of Stone Age man, hulking and grunting, crude and dull,” the Tasaday knew no cruelty or evil. They were ignorant of “killing, murder, war”; they did not so much as punish their children. When asked to describe evil, they replied: “We can’t think of anything that is not good.”
Charles Lindbergh, appropriately enough a member with Nance of the large, media-event expedition that quickly made its way to the Tasaday, expounded further on their significance for modern man. “The rise of intellect has coincided with the decline of natural life,” Lindbergh explained. Technology, produced by intellect, has “taken us to a stage where life itself is endangered, man included. We are destroying our environment through pollution.” For both him and Nance, the answer to this danger lay in the countercultural ideal of “touch and tenderness,” epitomized by the Tasaday with their “simple beauty, their mysterious purity.”
As it happens, the enthusiastic reception accorded to Nance’s book revealed far more about the self-image of contemporary civilized man than was ever revealed about the two dozen Tasaday, who were only briefly visited before being cut off from the outside world. Nance, in fact, had himself originally set out to expose the early reports of the Tasaday as a hoax, perpetrated (as he then thought) by a flamboyant official of the Marcos government named Manuel Elizalde. Nance had been taken to the Tasaday under suspicious circumstances. Elizalde let him see the group only after he himself had gone ahead to meet with them, by his own admission arranging for them to remove their clothing and don orchid leaves, which he claimed to have been their authentic dress up until the time they were discovered. He then seated them on a log and finally presented them to Nance, who found their pose “too perfect.” But at precisely this moment, Nance, flooded with awe, was converted from skeptic to apologist.
In retrospect, it becomes evident how much common sense had to be sacrificed in the service of this conversion. One notices now, for instance, that Nance’s report of a “lack of serious health problems” among the Tasaday is belied in his book by accounts of disease. And recently it has been said once again that the whole story of the discovery of the Tasaday may be fraudulent. Yet the wish to believe remains as strong as ever. Thus, when the New York Times, breaking a month-long silence, finally carried a report of the new charges of fraud, it restricted itself to the information that recent visitors, the first in ten years, had found the Tasaday wearing clothing and in possession of manufactured tools and other objects. “In interviews,” the Times went on,
two anthropologists who recently revisited the tribe said these new possessions, which had aroused the skepticism of Swiss and German reporters who saw them recently, were an expectable product of the tribe’s first contacts with outsiders in the early 1970’s.
But why, if the Tasaday had come into extended contact with civilization in the early 1970’s, should the Germans and Swiss have been surprised at the clothing? The answer, though one would not have learned it from the Times, is that the Germans and Swiss were not surprised at all. Instead, they reported being told by the Tasaday that Elizalde had “forced us to live in the caves so we could be called cave men. Before then, we had worn clothing, even though it was rather torn.” The Times not only omitted this information, which was available to it as a subscriber to the Reuters news service, but also failed to mention that the two anthropologists interviewed for its own story were members of the original Nance expedition and deeply committed to the Tasaday story.
The “discovery” of the Tasaday, whether or not a hoax, seemed tailor-made to buttress the new world picture; by contrast, the mere existence of the Ik seemed calculated to undermine it, and to redress the balance in favor of civilization. The Ik were studied by Colin Turnbull, a professional anthropologist whose book on them, The Mountain People, appeared in 1972. As against the Tasaday, the life in nature of this isolated and primitive African group was one of unexampled Hobbesian viciousness. The individual Ik, whether adult or child, foraged strictly for himself. The group had failed even to establish a sufficient degree of community to arrange for the disposal of human waste, and as a result it lived in the unrelieved stench of human feces. Children were barely tolerated by their parents, and then only until they were three, at which point they were expelled from their homes. “Men would watch a child with eager anticipation as it crawled toward the fire, then burst into gay and happy laughter as it plunged a skinny hand into the coals.” Virtually universal among the Ik were adultery, deception, cruelty to others and to one another. Far from protecting one another, “anyone falling down was good for a laugh . . . particularly if he was old or weak, or blind.”
The implications of such a people for the romanticized conception of the primitive would seem all too clear. Yet Turnbull himself defended the Ik—on theoretical grounds. He concluded that they had probably once held to higher values—and “very likely” with greater fidelity than Western civilized men—but had had to abandon them “for the very good reason that in their context these militated against survival.” The “context” Turnbull refers to is the severely depleted region the Ik inhabited, which itself had degenerated “with the advent of civilization to Africa.” In other words, the conditions that produced the Ik were “a part of that phenomenon we so blandly and unthinkingly refer to as progress.”
Turnbull goes even farther, arguing that the Ik, “if we are honest,” are not “greatly different from ourselves in terms of behavior.” After all, starting with our kindergartens, and “reaching on through school and summer camps,” we “effectively divorce” ourselves from our children. In fact, “there is not all that much difference” between us and the animals. “Technologically,” Turnbull grants, “we are superior in some respects,” but of course our technology is evil. And if it is true that we can speak—or as Turnbull grudgingly phrases it, if “we do seem to have developed the art of verbal communication to a point that gives us an enormous potential advantage over other animals”—here too “it can readily be shown that both speech and writing, misused, have led to many of the disasters peculiar to humanity.” Much, indeed, can be readily shown, especially when the contemporary world picture is threatened as radically as it is by the horrors of the Ik.
Not surprisingly, Turnbull later was one of those who sprang to the defense of Margaret Mead when the methods and conclusions of her famous 1928 book, Coming of Age in Samoa, were challenged in 1983 by the Australian anthropologist Derek Freeman. Mead, it will be recalled, had depicted a Tasaday-like society which, thanks to unrepressed sexuality during adolescence, was virtually immune from such ills of modernity as rape and suicide. On the basis of his own investigations Freeman found to the contrary that the Samoans were not sexually unrepressed during adolescence but placed a high value on female virginity. Using court records and other evidence, he also showed that Samoan society displayed greater amounts of hostility than Western society, and had higher rather than lower rates of both suicide and rape. Turnbull, in his defense of Mead, avoided the scientific issue and instead praised Mead for her bravery and personal sacrifice—and her admirable criticisms of Western civilization.
An analogous defense has been mounted on behalf of those who, by attempting to teach sign language to primates, have challenged man on the grounds of his verbal uniqueness. In Nim, an account of one such attempt, Herbert Terrace writes that “until recently, humans could take comfort in the assurance that our language made us unique.” Setting out to challenge that assurance, Terrace trained a chimpanzee named Nim for two years, teaching it to recognize numerous signs as well as to signal its wants.
As Terrace sat down to review his written and videotaped records, he had reason to believe that he had achieved a breakthrough. To his dismay, however, he discovered that neither Nim nor any other trained primate had used combinations of signs in genuinely sentence-like ways. Terrace came to this conclusion “reluctantly,” and no wonder, given the hopes with which he had begun. Nevertheless, his was a rare victory for scientific objectivity.
The responses to that victory are another matter. The well-known science writer Martin Gardner, who accepts Terrace’s conclusions, has pointed out that the desires of researchers have colored experiments to such a degree that an animal’s failure to talk is often explained away as a joke or a lie. He cites Francine Patterson on Koko (made famous in the film documentary, Koko, A Talking Gorilla):
She asks Koko to sign drink. Koko touches her ear. Koko is joking. She asks Koko to put a toy under a bag. Koko raises it to the ceiling. Koko is teasing. She asks Koko what rhymes with sweet. Koko makes the sign for red, a gesture similar to the one for sweet. Koko is making a gestural pun. She asks Koko to smile. Koko frowns. Koko is displaying a “grasp of opposites.” Penny [Francine Patterson] points to a photograph of Koko and asks, “Who gorilla?” Koko signs “Bird.” Koko is being “bratty.”
Though Gardner could not be more unequivocal in dismissing talking-animal claims, he himself, it is worth noting, declares a certain uneasiness at being put thereby in the company of the likes of Mortimer J. Adler (The Difference of Man and the Difference It Makes). Still, Gardner’s uneasiness is mild compared with that of others, including Terrace’s publisher. As Gardner observes, “nowhere on the jacket of Nim or in the book’s advertising does the publisher so much as hint that the book severely criticizes practically all earlier work with talking apes.” The publisher’s silence was well calculated; in the event, one lone researcher has accepted Terrace’s conclusions.
Today, Ursula LeGuin remains positively outraged at the “bad faith” of those who deny animal speech. She attributes their position to “speciesism,” to “a need to believe in the unquestionability of human uniqueness, human supremacy.” (Actually, the proclivities of Terrace and other critics lie exactly in the opposite direction.) Miss LeGuin holds in favor of animal speech despite several bizarre revelations that she herself recounts of eccentric, erratic behavior on the part of ape researchers. For example, Francine Patterson now keeps Koko “jealously and zealously guarded,” and her results “have been so selectively released that even the most sympathetic scientists have trouble defending Miss Patterson’s work.” Another animal and its trainer “are keeping a low profile in the Pacific Northwest.” Finally, almost unbelievably,
Janis Carter, who worked in language experiments with the chimpanzee Lucy (raised as a “baby” by a couple who gave her to the training center), took her to an island in an African river, where the woman must live in a cage while two groups of chimpanzees roam free. Trying to free Lucy of her human dependence, Miss Carter will not use sign language with her.
Similar contortions in the service of the theory that animals are at least the equals if not the superiors of man have been undergone by Jane Goodall, the lady who lived “Among the Wild Chimps,” as a National Geographic TV special had it. Goodall spent most of the years between 1960 and 1984 in close proximity to chimpanzees in the wild. She has concluded that they share essential traits with humans, including altruistic behavior, and in addition they possess a number of admirable and endearing qualities of their own. After fourteen years of study, however, Goodall observed something never before suspected of chimpanzees: they had begun to murder and devour their own kind. A mother and daughter became adept at seizing and eating the infants of other chimpanzees. There was no question of a food shortage, or of any resentment at work.
Goodall could offer no explanation for this unanticipated behavior—though a conclusion never mentioned by her or those reporting on her work is one that would have instantly sprung to the minds of scientists and amateurs alike in any period preceding our own, namely, that animals in their way, just as primitive peoples in theirs, exhibit behavior that is considered brutal by civilized norms. But to concede this would be tantamount to conceding the superiority of civilized man, an obvious impossibility. To anyone still foolish enough to hold such an idea, the usual retort is to point triumphantly to acts of torture, infanticide, cannibalism, and genocide committed by civilized societies themselves (while ignoring the fact that such violations of civilized norms have always been condemned as aberrations by the civilized world at large).
To remind us of our place in the scheme of things, the curators of the Bronx Zoological Garden in New York have thoughtfully positioned a mirror for visitors to the Great Ape House. Above the mirror is the legend, “The Most Dangerous Animal in the World.” Under it is written:
This animal, increasing at the rate of 190,000 every 24 hours, is the only creature that has killed off entire species of other animals. Now it has achieved the power to wipe out all life on earth.
The curators, one understands, do not have in mind the American Indians—who joined civilized men in hunting the ivory-billed woodpecker until the bird was extinct on the North American continent. Nor are they thinking of the Maori, a South Pacific people who hunted the six-foot-tall land bird, the Moa, to extinction in New Zealand. No, the “animal” being referred to is the one from whose ranks came those volunteers in California who recently took turns supporting beached whales in their arms to enable them to breathe, not to mention the marchers in the 1984 All-Species parade in San Francisco who costumed themselves as crustaceans, birds, and trees to express their solidarity with forms of life ordinarily looked down on by man.
Most writers on these subjects are so committed to the relativist view that they routinely put quotation marks around the key terms of the discussion, “civilization” and “primitivism.” These indicate that the writer rejects any tendency to elevate the one or to fail in appreciation of the other. “Cannibalism,” too, regularly appears in quotation marks, both to indicate skepticism that it is as widespread as claimed and to allude to the view that “civilization” brings harms as great as or greater than primitivism. A note of irony often hovers about such quotation marks, as when Colin Turnbull distinguishes between “‘advanced’” societies, which show signs of violent collapse, and “‘backward’” societies, to which “this new violence has not yet come.” Similarly, when he writes that the Ik have “‘progressed’” to their present condition, he really implies that they have been the casualties of modern progress.
The vocal equivalent of such quotation marks can be heard in the tones of narration employed in television documentaries about animals and primitive peoples. The title of one of these, “Testament to the Bushmen,” may stand for all, with their invariable presumption of a widespread and deplorable tendency toward feelings of civilized human superiority. His voice dripping with sanctimony, the narrator at once gently castigates the audience and renders homage to the rare, underappreciated values to be found among the peoples and species who have been photographed.
Interestingly enough, however, what we see on the screen is often a picture of destitution, disease, or blank despair. As a narrator intones the virtues of Masai tribal life, for example, the camera may be revealing naked children covered with sores and beset by cattle flies so persistent that the children have ceased to pay attention as they crawl over their open eyes. Or the Maori will be extolled for their devotion to art and respect for nature while the camera roams over ancient carvings depicting a people living in fortified villages and constantly at war.
In a film about Nigeria, a solemn respect is accorded to “traditional healers,” long spurned by “orthodox [Western] medicine”; it is wrong, we are told, to think that “one is better than another,” when what is needed between the two is “mutual respect.” Without irony, the camera shows a traditional bone setter, who helps heal his patient by breaking a cock’s leg; when the cock heals, the patient’s leg will heal as well. With the mentally disturbed, we are informed, the native way in Nigeria is to keep them in village surroundings rather than exiling them to the indifferent confines of institutions. In the village, they are cared for by a traditional healer, who uses incantations and herbal concoctions. The camera now takes the viewer to the dirt area behind the healer’s hut. Here sit approximately a dozen mental sufferers in a line (the camera shoots only in close-up, making it impossible to view conditions in the dirt enclosure or to see how many are held in it). Some of them are shackled, some drugged; all straddle a ditch; none moves.
At least one public-television documentary has taken an anthropological look at modern man using images compiled from a year of photographing at and near a single, busy corner in Manhattan. People are shown crossing the street, often hurrying. Some are abstracted, few are animated. In one sequence, the corner is viewed from a rooftop high above, so that the people resemble insects in a hive. The camera lingers over those passers-by who seem upset, disoriented, even frantic. The narration? Modern man is lonely, unhappy. He lacks a sense of community, suffers from anomie. Not revealed is that the disturbed faces are those of released mental patients who mingle with the crowds in this particular neighborhood. When it comes to civilization, the accepting, embracing eye—the eye that looks benignly on the disposition of the mentally ill in Nigeria—turns suddenly censorious and judgmental.
This is not to say that there are no discriminations to be made within Western civilization; the new world picture has as many gradations as any previous world picture. But once the principle of reversal is understood, it is easy to arrive at the contemporary view on any number of matters. Simply elevate students with low grades and poor test scores, for example, over high achievers (the former are talented, the latter unimaginative). Find virtue in the criminal, stodginess in the law-abiding citizen. For the highest accolades, move quickly to what used to be thought of as the bottom of society: children and the insane.
Better yet, find a Caliban. In Shakespeare’s The Tempest, the wise magician Prospero is accorded sway over the lower creatures and the lower mortal Caliban—or so the play has always been understood. As Richard Levin has shown, however, contemporary literary criticism has systematically turned the meaning of this play upside down. The first step was to argue that Shakespeare actually intended Prospero to be “equated” with Caliban in evil. Next, Caliban himself began to be viewed favorably. By the 1980’s he “represented any group that felt itself oppressed.” In one production of the play in New York, “he appeared as a punk rocker, complete with cropped hair, sunglasses, and Cockney accent.” A feminist critic has imagined a soliloquy for Prospero’s daughter, Miranda, whom Caliban once tried to rape, in which she concludes, “I need to join forces with Caliban.”
In the Elizabethan world picture, man was constantly admonished to remember that his place was both high and low in the scale of things: above the animals but below the angels, possessed of reason but subject to the passions. “What a piece of work is a man!” Hamlet exclaims, but in the same speech calls him “this quintessence of dust.” And Prospero himself, as Tillyard points out, calls Caliban “this thing of darkness . . . mine,” as a reminder that his limitation as a human being is to be linked forever with the bestial. That this link should become a point of pride, that Caliban should be apotheosized rather than seen as a cautionary example—this is a development that could hardly have been predicted.
In the 1980’s signs typical of a disintegrating world picture have appeared. Those clinging to the still-dominant, low view of man increasingly have resort to the unverifiable to sustain their case, positing eons-old explosions in space, fanciful interpretations of prehistoric hieroglyphs, putatively destroyed documents, conjectures about the mind processes of inarticulate primates. When their slenderly supported metaphysical constructs are refuted, they respond with recriminations against the bearers of bad tidings. Still, some, like Herbert Terrace, have found that their first allegiance is to the scientific method, and in this there is hope. It now remains, just as it did, mutatis mutandis, at the time of the breakup of the Elizabethan world picture, for the implications of evidence favorable to man to be assimilated. The contemporary world awaits its John Donne: a poet capable of abandoning the comfort of stones in favor of the rigors of self-respect.
The Demotion of Man
A monstrous regime's rational statecraft
One of the more improbable geostrategic surprises of recent years has been the revival of the North Korean economy under the direction of Kim Jong Un. Just to be clear, that economy remains pitiably decrepit, horribly distorted, and desperately dependent on outside support. Recent estimates suggest that its annual merchandise exports do not reach even 1 percent of the level generated by its nemesis, South Korea. Even so, the economic comeback on Kim Jong Un’s watch has been sufficiently strong to permit a dramatic ramp-up in the tempo of his nation’s race to amass a credible nuclear arsenal and develop functional intercontinental ballistic missiles capable of striking the U.S. mainland. That is, of course, the express and stated objective of the program. Pyongyang today appears to be perilously close to achieving its aim—much closer now, indeed, than complacent Western intelligence assessments had presumed would be possible by this juncture. But then, North Korea is full of surprises for foreign observers.
The difficulty with analyzing the country’s weaknesses and strengths comes from the fact that the North Korean system—which is made up of the Kim dynasty, the North Korean state, and the economy constructed to maintain them both—is unlike any other on earth. By now, its brand of totalitarianism (“our own style of Socialism,” as Pyongyang calls it) is sufficiently distinctive that children of the Soviet or Maoist tradition also commonly find themselves at a loss to apprehend its logic and rhythms.
North Korea is no longer even a Communist state, if that term is to have any meaning. The once-prominent statues of Marx and Lenin in Kim Il Sung Square were removed some years ago. Mention of Marxism-Leninism has reportedly been excised from the updated but still currently unpublished Charter of the Korean Workers’ Party. The 2016 version of its constitution excises all references to Communism, extolling instead only the goal of “socialism”—and its two “geniuses of ideology and theory,” Kim Il Sung and Kim Jong Il (the grandfather and father of the current dictator). Small wonder that the world routinely misjudges—and very often, underestimates—the North Korean state and its capabilities.1
Despite its suffocating ideology, for example, North Korea is capable of highly pragmatic adaptation and economic innovation. Notwithstanding its proclaimed “self-reliance” and its seeming isolation, it is constantly finding new sources of foreign cash through ingenious and often remarkably entrepreneurial schemes overseas. And despite all the international sanctions, Kim Jong Un really has overseen a North Korean economic upswing of sorts since assuming power in 2011, the signal fact that best helps explain the acceleration in Pyongyang’s push for a credible nuclear and ballistic arsenal. Thanks to these and other apparent paradoxes, an economy seemingly always on the knife edge of disaster somehow manages to stay on course, methodically amassing the military might for what it promises will be an eventual nuclear face-off with the world’s sole superpower.
Though the hour is late, given all the progress that North Korea has been permitted over the past generation, it nevertheless looks as if there may still be time left to prevent Pyongyang from completing and perfecting its nuke and missile projects through “non-kinetic means”—that is to say, through international economic pressure as opposed to military action. For such an approach to work, however, we will need an informed and robust strategy—not the feckless, episodic, and intellectually shoddy interventions we have mainly witnessed up to now.
Indispensable to such a strategy must be an understanding of the North Korean economy—the instrument that makes the North Korean threat possible. In particular, we need to understand 1) how that economy functions, and to what ends; 2) how the “Dear Respected Comrade” Kim Jong Un brought to it a limited but critical measure of economic revival; and 3) how America and others might use the considerable financial and commercial options at their disposal to impair the North Korean regime’s designs, before Pyongyang wins what is now a race against time.
Despite the information blackout that North Korean leadership has striven to enforce for generations, we already know much more about all these things today than the Kim family regime could possibly want—more than enough to begin purposely defanging the North Korean menace.
One: The Economy of Command
Given its longstanding reputation as a basket case, it may startle readers to learn that there was actually a time when North Korea was regarded as a dynamic and rapidly advancing economy. Back in 1965, the eminent British economist Joan Robinson wrote that North Korea’s achievements put “all the economic miracles of postwar development…in the shade.”2
In those days, if Western intellectuals happened to talk about the “Korean miracle,” they weren’t discussing anything going on in the South. And it wasn’t just dreamy academics and well-hosted foreign visitors who seemed to hold North Korea’s economic prospects in high esteem. Between the late 1950s and the early 1960s, Japan witnessed an exodus of ethnic Korean residents—in all, roughly 80,000 people—who packed up and steamed off of their own free will to the North, voting with their feet to join the Korean state they deemed to offer the greater promise.
Despite the devastation North Korea suffered during the war it launched against the South in 1950, and despite the blazing economic takeoff in South Korea that commenced in the early 1960s under the Park Chung-Hee junta, North Korea may have been ahead of South Korea in per capita output for two full decades after the 1953 armistice. A CIA study in the late 1970s, for example, concluded that South Korea did not catch up with North Korea until 1975.3 Contemporaries at South Korea’s Korean Central Intelligence Agency (KCIA) concurred that the North was well ahead of the South on a per capita basis throughout the 1950s and 1960s, though they argued that the South caught up with the North a few years earlier than the CIA believed.
In retrospect, the wonder is that North Korea’s economy worked as well as it did for as long as it did. For from its 1948 founding onward, North Korea was not just another Cold War Soviet-type economy: It was a Stalin-style war economy on steroids.
As fate had it, the Japanese colonial overlords who controlled Korea from 1910 until 1945 constructed a heavy industrial base in its northern half—a forward supply zone to support their own greater Asian war efforts. Unlike the South, the North had major deposits of coal, iron, and other minerals, along with plenty of natural hydropower. “Great Leader” Kim Il Sung—the onetime guerrilla fighter and later Red Army officer who started North Korea’s Kim family dynasty—inherited this infrastructure when he took over the northern part of the divided peninsula in 1945 and used it as a base camp from which he directed an upward climb toward the summit to which he aspired: an economy set on permanent total-war footing.
Kim Il Sung came perilously close to consummating his vision. By the mid-1970s, the Great Leader would observe that “of all the Socialist countries, ours bears the heaviest military burden.”4 Even by comparison with places like the Soviet Union and East Germany, his North Korea was a garrison state. By the late 1980s, this country of barely 20 million was fielding an army of more than 1.2 million—a ratio comparable to America’s in the middle of World War II. Those military-manpower estimates, by the way, are derived not from U.S. or South Korean intelligence, but rather from unpublished population figures Pyongyang transmitted to the UN in 1989 (data that inadvertently revealed the size of the country’s non-civilian male population).5
Today, two Kims later, the International Institute for Strategic Studies reports that North Korea currently maintains the world’s fourth-largest standing army in terms of sheer manpower—ahead of Russia and behind only the globe’s demographic giants (China, India, the United States). For more than half a century—since 1962, the year Kim Il Sung decreed the “all-fortress-ization” of the nation—North Korea has been the most exceptionally and unwaveringly militarized country on the face of the planet.
But why? What possessed North Korean leadership to commit their country, decade after decade, to such an extraordinarily expensive and irrational economic posture? There was a method to this seeming madness. Kim Il Sung’s grand design for unending super-mobilization served many logical purposes, given the first premises of his North Korean state.
Enforcing permanent war-economy discipline comported nicely with perfecting the domestic totalitarian order the Great Leader desired. Further, given the unhappy realities of geography and 20th-century Korean history, having the might to stand up to any and all foreign powers—including his nominal Communist allies in Moscow and Beijing—may also have seemed an imperative. But above all else, North Korea’s immense military economy reflected Kim’s overarching obsession with unifying the divided Korea, and doing so unconditionally—that is to say, to finishing up that Korean War he had started in 1950, and finishing it up on his own terms this time.
In the eyes of North Korea’s rulers, the South Korean state was (and still is) a corrupt, illegitimate, and inherently unstable monstrosity, surviving only because of the American bayonets propping it up. The Great Leader wanted to be able (when the right opening presented itself) to strike a knockout punch against the regime in Seoul and wipe it off the face of the earth—“independent reunification,” in North Korean code language. This he could not do without overwhelming military force—and without an economic system straining constantly to provide that muscle.
As early as 1970, the Great Leader was warning that “the increase in our national defense capability has been obtained at a very great price.”6 And by the late 1980s, Kim Il Sung’s “economic miracle” was all but dead in the water. Decades of crushing military burden and systemic suppression of consumer demand had taken their predictable toll. And North Korean planners had compounded these difficulties with additional unforced errors of their own.
Their idiosyncratic application of the Great Leader’s Juche (“self-reliance”) ideology, for example, included a general injunction against importing new foreign machinery and equipment. This ensured that the country would have to maintain a high-cost, low-productivity industrial infrastructure. Juche also apparently meant never having to pay your foreign debts, whether to fraternal socialist states or to “imperialist” creditors in Western countries foolish enough to lend money to Pyongyang. By the 1980s, global financial markets had caught on to the game, and North Korea found itself almost completely cut off from international capital. And the longstanding “statistical blackout” North Korean leadership enforced to facilitate international strategic deception also inadvertently impaired economic performance by blinding domestic decisionmakers and requiring them to “plan without facts.”
But it was the ending of the Cold War that pushed the North Korean economy out of stagnation, and into disaster. Juche ideology notwithstanding, North Korea had never been self-reliant; sustaining its severely deformed economy required constant inflows of concessionary resources from abroad. Pyongyang was (and remains) consummately imaginative in devising schemes for extracting aid and tribute from overseas. In the 1960s, 1970s, and 1980s, Kim Il Sung procured the equivalent of tens of billions of dollars in support from Beijing, Moscow, and the Kremlin’s Warsaw Pact satellites, expertly playing the Kremlin off against China, gaming aid out of each while aligning with neither.
In 1984, Kim Il Sung made a fateful error: He leaned decisively toward Moscow, a tilt signaled by his unprecedented six-week state visit to the USSR and Eastern Europe that same year. The gamble paid off initially: Between 1985 and 1989, the Kremlin transferred around $7 billion to Pyongyang, twice as much as the amount transferred over the entire previous 25 years, much of it in military matériel. In 1988, North Korea relied on the Soviet bloc not only for almost all its net concessionary foreign-resources transfers, but also for roughly two-thirds of its international trade, most of it arranged on political, highly subsidized, terms.
Then came the Soviet bloc’s collapse. By 1992—the year after the dissolution of the USSR—both trade and aid from the erstwhile Soviet bloc had plummeted by nearly 90 percent. North Korea’s overall supplies of merchandise from all foreign sources consequently plunged by more than half over those same years.
These sudden devastating shocks sent North Korea’s economy into a catastrophic free fall from which it would not manage to recover for decades. The socialist planning system essentially collapsed. Famine was just around the corner.
Two: A Man-Made Horror and Its Surprising Aftermath
The North Korean famine of the 1990s was a catastrophe of historic proportions. No one outside North Korea’s leadership knows just how many people died in that completely avoidable man-made tragedy, but the toll was certainly in the hundreds of thousands and could possibly have exceeded a million.7 It arguably qualifies as the single worst economic failure of the 20th century. It was the only time in history that people have starved en masse in an urbanized, literate society during peacetime.
It is noteworthy that the famine—usually dated from 1995 to 1998—did not commence until after the death of the Great Leader and the ascension of his son and heir, “Dear Leader” Kim Jong Il. This was no coincidence. Economic failure was the Dear Leader’s stock-in-trade. His political rise almost perfectly corresponds to the decline and fall of the North Korean economy. It happens that the Dear Leader did succeed in what was arguably his primary political objective: to die of natural causes, still safely and securely in power. But economic progress worthy of the name would not be possible in North Korea so long as he was its supreme ruler.
Though both father and son were totalitarian tyrants enthused with their hereditary total-war machine, the differences in their economic inclinations and impulses were nonetheless striking. Dogmatic as he was, the Great Leader still possessed a peasant’s sense of practicality. Proof of his pragmatism is the singular fact that North Korea, alone among all Asian Communist states (and including Russia in this roster), avoided famine during its 1955–57 collectivization of agriculture.
On the other hand, the Dear Leader, from his sheltered Red Palace upbringing onward, was every bit the paranoid, secluded ideologue. He not only disapproved of any concessions to economic pragmatism but feared these as positively counterrevolutionary and potentially lethal to his rule. He likewise regarded ordinary commercial interactions with the world economy as “honey-coated poison” for the North Korean system. At home, he wanted total mobilization but without any material incentives; from abroad, he sought a steady inflow of funds unconstrained by any reciprocal obligations. Kim Jong Il’s preferred economic model, in short, was to enforce Stakhanovite fervor at home through propaganda and terror while financing his war-economy state through military extortion abroad. He called this approach “military-first politics.”
Unwilling as he was to address the country’s newly dire economic circumstances with reforms—in his view, there was nothing to reform—Kim Jong Il’s North Korea was trapped in deepening depression for most of the 1990s. We will know how close the place came to total economic collapse—to the sort of breakdown of the national division of labor that Germany and Japan suffered at the very end of World War II—only when the archives in Pyongyang are finally opened. Throughout the 1990s, in any case, heavy industry was largely shut down, with inescapable consequences for conventional military forces. The death spiral for the war-making sector redoubled the importance to the regime of the nuke and missile programs, both as an insurance policy for regime survival and as the last viable military instruments for forcing the South into capitulation in some future unconditional unification.
In retrospect, it is clear that Pyongyang had no intention of desisting from its quest for nuclear weapons and ballistic missiles, even as it played Washington and her allies for aid for years by pretending its nuclear program might be negotiable. Yet also in retrospect, the slow tempo of nuke and missile development under Kim Jong Il’s rule has to be considered a surprise. Any serious weapons program requires testing to advance—yet Pyongyang managed just one long-range missile launch in the 1990s and only three during his 17-year reign. The Dear Leader also oversaw two nuclear tests before his death in 2011—but only toward the end of his tenure, in the years 2006 and 2009.
Why this hesitant tempo if nukes and missiles were a central priority for the North Korean war economy? Although other possible explanations come to mind, the obvious one has to do with financial and economic constraints. Ironically, despite his vaunted “military-first politics,” North Korea’s nuke and missile programs may also have been inadvertent casualties of Kim Jong Il’s gift for stupendous economic mismanagement. (True, North Korea could undertake expensive nuclear projects internationally, such as the undeclared plutonium reactor in Syria that was nearing completion when the Israelis leveled it in 2007—but that was apparently a cash-and-carry operation, bankrolled by the Dear Leader’s friendly customers in Iran.)
There is considerable evidence that the North Korean economy hit bottom around 1997 or 1998. That bottom was very low indeed: Rough estimates suggest that, by 1998, North Korea’s real per capita commercial merchandise exports were barely a third their level of just a decade earlier, while real per capita imports, including supplies indispensable to the performance of key sectors of the domestic economy, were down by about 75 percent.
North Korea appears to have turned the economic corner not on the strength of new or better domestic economic policies, but rather on breakthroughs in international aid procurement. Pyongyang figured out how to work the West’s international food-aid system: Between 1997 and 2005, the year before its first nuclear test, it was bringing in an average of over a million tons of free foreign cereal each year, ending the food crisis. It is tempting to regard this as “military-first politics” in action, for military menace played an important role in the international community’s solicitude. It is impossible to imagine a helpless and stricken sub-Saharan population obtaining “temporary emergency humanitarian aid” on such a scale, for such an extended duration and with so very few conditions attached.
Central to this upswing in food aid and other freebies from abroad was the fact that North Korea got lucky with the alignment of governments in Seoul, Washington, and Tokyo. For a while, the leaders of this consortium of states were commonly willing to underwrite an exploratory policy of “sunshine” or “engagement” with the Dear Leader by offering him subventions and financial transfers. To secure his June 2000 Pyongyang Summit with the Dear Leader, for example, South Korea’s then-president had hundreds of millions of dollars secretly wired to special North Korean accounts—thereby committing crimes under South Korean law (for which he later issued pardons).
In the event, the “sunshine”-aid influx that may have rescued North Korea at its darkest moment would wane after its clandestine uranium-processing project surfaced in 2002—but the nuclear crisis that revelation triggered also made possible the next big round of North Korean international aid-harvesting.
After the 2003 U.S. invasion of Iraq, Beijing—alarmed by the possibility that the U.S. might also engage in a similar military confrontation with neighboring North Korea—organized and convened a “six-party talks” diplomatic process, ostensibly for deliberations over North Korean denuclearization, to cool things down. While the subsequent years of talking quite predictably led nowhere, North Korea’s price of attendance was apparently a steep increase in economic support from China. Between 2002 and 2008, China’s annual net balance of shipments of goods to North Korea—its exports to Pyongyang minus corresponding imports—more than quintupled, rocketing upward from less than $300 million to more than $1.5 billion. By then, North Korea had become just as economically dependent on Chinese largesse as Pyongyang had been on Soviet-bloc blandishments two decades earlier—but these inflows, and the politically subsidized trade they came with, were evidently sufficient to help at least partially revive the Dear Leader’s broken economy. From Chinese trade statistics, for example, we can infer that Chinese investments were instrumental in a resuscitation of North Korea’s mining and metallurgy sectors in the last years of Kim Jong Il’s life. (We must rely on inference here since Beijing to this day remains almost totally opaque about its economic relations with Pyongyang.)
All in all, Kim Jong Il’s North Korea took in more than $1 billion from its enemies in Washington, and nearly $4 billion from the “puppet regime” in Seoul (not including the South’s additional expenditures on “off-the-books” transfers and special economic or tourist zones in the North). And from China, North Korea scored more than $12 billion of net merchandise inflows under the Dear Leader—a sum that would look even greater if valued in today’s dollars. All the while, North Korea was also earning invisible revenues from a whole network of highly enterprising if generally illicit overseas endeavors: its “nuke-and-missile homework club” with Iran; à la carte weapons sales and military services provided to a host of dictatorships and terror groups; counterfeiting of U.S. currency; drug racketeering; insurance frauds perpetrated against firms in the City of London; and more. The Dear Leader was extensively involved in the world economy, after all—just in a Bizarro World, Legion of Super-Villains sort of way.
Thanks to highly skilled aid-wheedling, international shakedowns, and financial gangsterism, Kim Jong Il’s North Korea clawed its way back from famine to a low but acceptable new economic normal—all the while forswearing domestic economic reforms or genuinely commercial contacts with the outside world. North Korea did not completely avoid potentially fraught economic changes under Kim Jong Il, of course—that was beyond the powers even of the Dear Leader. Domestic cellphone use began during the Dear Leader’s reign, for example, as did a tentative marketization of private consumption (about which more in a moment). But these and other analogous economic changes during the Kim Jong Il era are best understood as “transition without reform,” to borrow an apt term from North Korea watcher Justin Hastings.8
The economy’s “new normal” in the Dear Leader’s final days was still at a miserable level. Although North Korean scientists could launch long-range missiles and test atomic weapons, and although North Korea’s population had reportedly achieved a fairly high level of educational attainment (higher than China’s, if North Korean figures are believed), the country’s international economic profile was Fourth World. According to the World Trade Organization, North Korea’s per capita merchandise trade levels in 2010 approximated Mali’s. Its share of world merchandise trade that same year was roughly the same as that of Zimbabwe, a country with half of North Korea’s population—and despite its measure of recovery after 1998, North Korea’s global trade share fell by more than two-thirds between 1990 and 2010, even more than Zimbabwe’s under Mugabe’s misrule in that same period.
The world is a moving target and, generally, an improving one—so national stagnation also means continuing relative decline. Although the Dear Leader bequeathed his son Kim Jong Un a system that had avoided total collapse, there was little else that could be said to commend his economic legacy.
Three: The Economic Upturn
Dear Respected Comrade Kim Jong Un faced formidable odds when he took over in late 2011. The twentysomething was a novice manager at the time of his father’s demise. Unlike the Great Leader, who had groomed his son to rule from an early age, Kim Jong Il himself put off the whole business of naming a successor for as long as he possibly could, designating the child of one of his mistresses as the next Supreme Leader only after an incapacitating stroke made the naming of an heir an unavoidable matter of state.
As Kim Jong Un took office, the planned economy was no longer functioning, and to make matters worse, North Korea’s limited market sector was beset by galloping and seemingly unstoppable inflation. His father had experimented with a limited monetization of North Korea’s tiny consumer sector in 2002 but botched it—and only made matters worse with a surprise 2009 “currency reform” that effectively confiscated private holdings above $100, drastically degrading the already low credibility of the won.
From this unpromising beginning, Kim Jong Un has proved a relative success in delivering economic results in North Korea. There is evidence that the North Korean economy has enjoyed some measure of growth, macroeconomic stabilization, and even development under his aegis.
Pyongyang, “the shrine of Juche,” may be a Potemkin showpiece—but is showpiece-ier today than in the past. Construction cranes are whirring, and whole new sections of the city have risen up. Traffic jams now sometimes clog “Pyonghattan’s” vast, previously empty boulevards. Expensive restaurants and shops purveying luxury goods increasingly dot the capital, and their customers are mainly locals, not foreigners. The upsurge in prosperity and living standards evident in Pyongyang is reportedly reflected, albeit to a more modest degree, in other urban centers as well.
Furthermore, in sharp contrast to previous North Korean trends, or to other Soviet-type economies of an earlier era, the country today displays not only considerable marketization but also considerable market stability. This much is demonstrated by cereal prices and foreign-exchange rates in informal markets across North Korea. Over the decade between mid-2002 and mid-2012, North Korea’s won depreciated against the U.S. dollar in such markets by a factor of more than 5,000 (no, that is not a typo). But that depreciation abruptly stopped a little over five years ago, and since then the won has traded around 8,000 to the dollar (fluctuating within a band around that average). In other words, North Korea now has a stable currency that is convertible into hard currencies. Likewise, the domestic price of rice in North Korean markets suddenly stopped soaring five years ago and has been in the vicinity of 5,000 won per kilogram ever since. Whatever else one may say of these new domestic price signals from Kim Jong Un’s North Korea, they are not what one would expect to see from an economy in mounting crisis and disarray.
Finally and by no means least important: In the military realm, nuke and missile testing has accelerated. In the 13 years between Kim Jong Il’s first Taepo Dong test and his death, North Korea launched three long-range rockets and detonated two atomic devices. Kim Jong Un has been in power just over six years; his regime has already set off four nuclear tests and shot off more than a dozen long-range missiles. Some of the speed-up could reflect long-term strategic choices and might in part be affected by improvements in efficiency (cost reduction) within the WMD industrial sector. All other things being equal, though, this sharp acceleration would seem to betoken a major new infusion of resources into programs already long accorded a top priority by the North Korean state. Without a bigger economic pie and substantially greater funding sources, it is hard to see how Pyongyang could have pulled this off.
All this said, North Korea is still shockingly unproductive, still punching far below its weight, still nowhere near self-sustaining growth. Kim Jong Un’s boundless self-indulgence is manifest in costly vanity projects like a spanking-new “ski lift to nowhere” resort, Masikryong, a venture otherwise inexplicable save perhaps for the memories of childhood days in Switzerland that it might elicit.
But by distancing himself from his father’s most economically destructive policies and practices, and navigating into previously uncharted waters of economic pragmatism, Kim Jong Un has opened up heretofore ungraspable opportunities for raising living standards and building military power at one and the same time. Thus the name of his signature policy: byungjin, or “simultaneous pursuit.”
In short order after his ascension, Kim Jong Un demoted—or killed—most of the Dear Leader’s closest cadres and confidants. And less than five months after assuming power—at a ceremony commemorating his grandfather’s 100th birthday in April 2012—he made an astounding declaration, coming as it did from North Korea’s supreme ruler: “It is our party’s resolute determination to let our people…not tighten their belts again.” Translation: This is no longer your father’s dictatorship; aspiration for personal betterment is no longer a counterrevolutionary act of treason.
Dear Respected has deliberately and steadily reshaped the economy under his command. The fundamental strategic difference between Kim 2 and Kim 3 was this: Whereas the Dear Leader saw “reform” and “opening” as deadly “ideological and cultural poison” pure and simple, Dear Respected believes that North Korea can withstand a bit of that poison—actually, quite a bit—and even end up stronger for taking it.
Pyongyang’s new policy directives have been informed by this insight. In agriculture, Kim Jong Un promulgated the “June 28 Instructions” (2012), which permitted family-level work units and allowed farmers to keep 30 percent of their surplus—a bonanza compared with all previous official rules. For enterprises and industry, there were the “May 30 Measures” (2014), which allowed managers to hire and fire workers, pay them according to their productivity, and keep a portion of any profits they earned. People were, increasingly, paid with money for their work—and it was real money, as in, money that could buy things people wanted. The gradual marketization and monetization of North Korea’s civilian economy over the past two decades is a major transformation, and one critical to understanding the country today.
By the late 1980s, North Korean leadership had fashioned a consumer sector that would have turned Stalin green with envy. No country on the planet had so tiny a share of total national output flowing to personal consumption as late Cold War North Korea—and no country had so low a fraction of its personal consumption accruing to citizens on the basis of their own market choices. Indeed, North Korean planners had come closer to completely demonetizing their economy than any modern polity this side of the Khmer Rouge. Most goods, services, and supplies that North Korean families consumed were provisioned to them directly by the state, with no “interference” by actual consumer preferences. North Korean planners wished to cede as little control over their command economy as humanly possible.
Pyongyang’s near-total control of the consumption basket, however, presupposed that the state would be supplying its subjects with their daily necessities in the first place. That collapsed in the mid-1990s when the Public Distribution System simply stopped providing the full promised daily food rations to most of the population—and stopped supplying any food at all to some of the population. A terrible number of those who trusted the government to take care of them ended up perishing. To survive the famine, North Koreans had to learn to buy and sell in informal markets that began to spring up—even though such activity was against the law, and some “economic crimes” were punishable by death. The Kim Jong Il government loathed these new private markets, but it needed them to forestall wholesale calamity. Thus commenced the two-steps-forward-one-step-back dialectic of marketization that lasted the rest of the Dear Leader’s life—and after his death, marketization and monetization of the civilian economy gained further steam.
Today it is all but impossible to get by in North Korea on state-supplied provisions alone—and a wide array of goods and services, both foreign and domestic, are available for money in North Korean markets. Although formally prohibited, even real estate is for sale throughout the country, with a vibrant market for private flats in Pyongyang. And a wealthy marketeering caste has arisen: donju, or “money masters,” stereotypically a well-connected official and his enterprising wife, who use political influence as well as entrepreneurial savvy to enter this nouveau riche North Korean elite.
In case you were wondering: Yes, corruption is rife in North Korean markets. It is the necessary lubricant for all North Korean private commerce. In addition, the government expects a big cut, and such funds have been integral to the recovery of the North Korean state.
The marketization and monetization of its consumer economy, in conjunction with new agricultural and commercial incentives and a more tolerant official attitude toward informal activity, laid the groundwork for a domestic-production upswing in North Korea (and a veritable boom in private consumption, although from a very low starting point).
Unlike Asia’s “reform socialism” states, China and Vietnam, North Korea has never made a serious effort to attract private investment from real live capitalists abroad. Pyongyang prefers large-scale foreign projects that are political in nature. Such projects are bankrolled by governments indifferent to profit, which is to say by the foreign taxpayers who can ultimately be left holding the bag. Examples include the ill-fated Kaesong Industrial Complex paid for by South Korea, as well as its doomed Kumgang Tourist Resort. For international trade and finance, the overwhelming bulk of North Korean activity still falls into two categories: 1) politically predetermined, highly subsidized economic relationships, or 2) what we might call “guerrilla warfare” or “outlaw” finance.
Four: North Korea’s Friends
Preferential trade ties with China are pretty much the only game in town for Pyongyang these days. With the virtual shutdown of South Korea’s politically subsidized inter-Korean trade in 2016 following accusations that money from the Kaesong project was being used to fund the North’s missile program, China may now account for close to 90 percent of North Korea’s international commercial-merchandise trade turnover. And North Korea always receives much more than it gives in its arrangement with China, year after year.
There is, to be sure, an element of harsh capitalist bargaining within this overall relationship—but most of that is in the “people to people” bartering and petty trading at the border, largely for consumer goods. At the national level, judging by Chinese customs statistics, North Korea raked in well over a billion dollars a year in net merchandise shipments from China from 2008 through 2014—with no transparency on Beijing’s part about the mechanisms by which this ongoing transfer is financed, much less about the Chinese government’s objectives and intentions in extending this lavish lifeline.
Since 2015, official Chinese numbers suggest that Beijing’s de facto aid is down—but these look like figures deliberately fudged in the face of mounting international demands for sanctions against North Korea. It is at the very least possible that important aspects of Chinese support for the North Korean economy or its defense industries have not yet come to light. Given what is already known, though, it is indisputable that deals with China under the two latest Kims have been key to reviving North Korea’s heavy industrial sector. (For the year 2016, China reported shipping over three-quarters of a billion dollars of machinery and transport equipment to North Korea, 10 times the volume in 2003, when the six-party talks commenced.)
Vital as Chinese support may be to North Korea’s survival and economic revival, North Korea evidences no gratitude for Beijing’s largesse. Pyongyang does not “do” gratitude. Moreover, the leadership in Pyongyang knows very well a bitter truth about Chinese aid that it can never utter: namely, that capricious cutbacks in free food from China in the year 1994 were the trigger for the Great North Korean Famine, which became impossible to conceal by 1995.
Apart from its Chinese lifeline, North Korea’s other main sources of international support come from “outlaw” forays into the world economy—including activities tantamount to state-sponsored organized-crime operations. These shady dealings typically attempt to generate revenues for the state that avoid international detection, often relying on the special protections and prerogatives of a sovereign state for cover.
One cannot help but be struck by the industry, ingenuity, and sophistication that have generally kept such schemes one step ahead of international authorities. Koreans in the North can be world-class innovators, too—it’s just that their chosen fields of excellence happen to be in smuggling, drug-running, money-laundering, and the like.
Some of these inventive schemes have been in the news. In recent years, for example, Pyongyang has made unknown millions abroad from what we might call its own style of human trafficking: profiting off the tens of thousands of workers in labor gangs it has sent to China, Russia, the Middle East, and even parts of Europe. No less inventive has been Pyongyang’s apparent monetization of its growing capacity for cyberwarfare through international bank robbery. In 2016, “unknown” hackers relieved the Central Bank of Bangladesh of $81 million in a spectacular heist; in late 2017, similar cyber-fingerprints were detected in a theft of $60 million from a bank in Taiwan. These are just two of many “hit and runs” orchestrated by the Kim Jong Un crime family. And as the WannaCry ransomware attack last year demonstrated by infecting hundreds of thousands of computers the world over, vastly greater dividends from cybercrime may lie just over the horizon.
Then there is North Korea’s signature global service industry: WMD proliferation. For obvious reasons, most of this work never makes the news. No one outside Kim Jong Un’s court probably knows just how much this nefarious business is bringing in these days. These unobservable flows, however, may be consequential. Consider this: Barely weeks after Tehran inked its September 2012 Scientific Cooperation Agreement with Pyongyang, the won suddenly ended its decade-long freefall and finally achieved exchange-rate stability. North Korea may have had additional, still concealed, operations that were also paying off at the same time as that Iranian deal, of course. But either way, the deal clearly marked a turning point in North Korea’s macroeconomic fortunes, and the stabilization of exchange rates and domestic cereal prices probably could not have occurred without an open spigot of foreign cash.
In sum, the hallmarks of Jong-Un-omics would appear to be new revenues from foreign sources, along with new flows of funds derived from privatization and growth at home. These monies have apparently sufficed not only to stabilize North Korea’s previously toxic currency, and to bring an end to runaway inflation in North Korea’s key private markets, but also to abet Pyongyang’s nuclear and ballistic ambitions. This, at least, would seem to be the most plausible reconstruction of the limited but meaningful evidence from the jigsaw puzzle that is the North Korean economy today.
To repeat: While we should recognize the existence of this economic upswing, we should also keep its scale in perspective. All one need do is consider the sad, stunning space photos of North Korea at night—the satellite shots revealing a territory almost pitch-black, while the rest of Northeast Asia is glowing with light. They attest better than any available statistics to the limits of economic recovery under Kim Jong Un.
That space imagery carries a further implication: The North simply does not have the pocketbook for both a wholesale modernization of its conventional army and a nuke-missile program. For now at least, most of the military’s equipment, apart from critical nuclear-related pockets like submarine production, remains outdated and ill-suited for the tasks originally assigned. Today, Kim Jong Un cannot credibly threaten to roll in and occupy South Korea. But Kim Jong Un is on track to manufacture enough nuclear matches to burn the place down, with Tokyo and Washington thrown in for good measure, in the foreseeable future.
Five: How to Put Pressure on Pyongyang
Given what we know about the North Korean economy, can America and the world community keep Pyongyang from reaching its ultimate nuclear objectives through a real economic-pressure campaign?
We do not know just how close North Korea is to perfecting its weaponization of ballistic missiles, or how many nuclear weapons the North currently possesses. We also do not know as much as we need to about North Korea’s strategic inventories and reserves. Even if Pyongyang were stopped in its tracks today, keeping its nuclear and missile work frozen would require unwavering vigilance and far-reaching containment for the remaining life of the regime. That said, a serious international campaign of trade and financial sanctions—led by America, ruthlessly executed, and starting immediately—could very significantly slow the pace of Pyongyang’s ongoing nuclear-ballistic march. And if we are in it for the long haul, a serious sanctions campaign could eventually promise the effective suffocation of the entire North Korean military economy.
An international economic campaign of this sort won’t be easy (though America has many more cards in her hand than many now appreciate). It probably won’t be pretty, either. But in any case, it is the world’s last chance to thwart North Korea’s nuclear ambitions by nonmilitary means.
Let’s start with the unpleasant truths. We must recognize that economic pressure will not alter the intentions of the Kim family regime—ever. We must dispense with the fantasy, still inexplicably maintained in various esteemed diplomatic circles and Western universities, that Pyongyang can somehow be pressured—or bribed—at this late stage into changing its mind about its multi-decade march to a credible nuke and missile arsenal. There is no “bringing North Korea back to the table” that ends with CVID—comprehensive, verifiable, irreversible denuclearization. Period.
So much for the bad news. The rest of the news about the outlook for sanctions against North Korea, fortunately, is better than we usually hear.
Many authoritative voices seem to think sanctions have little chance of influencing North Korea’s nuclear trajectory. Economic historians note that the record for coercive economic diplomacy is poor and has been for centuries. Policy wonks and foreign-affairs experts add that successive rounds of UN and international economic sanctions seem to have had no real bite so far against North Korea. These pessimistic assessments, however, misread the prospects for international economic pressure against North Korea on two important counts.
As poor as the general record of coercive economic diplomacy may be, North Korea is not exactly a typical economy. It is an outlier—it’s world-class dysfunctional, recent changes under Dear Respected notwithstanding. The economy is incapable of growth (or, for that matter, even stagnation) without steady inflows of financial support from abroad to keep it on its feet. Remember: When net aid from abroad sharply dropped (but did not end) in the 1990s, that was enough to send North Korea spiraling downward into paralysis and mass famine. The North Korean regime, in short, is a prime target for a successful international campaign of economic strangulation. Despite Pyongyang’s nonsense about “self-reliance,” it is uniquely vulnerable to the cutoff of foreign money and subvention.
Kim Jong Un has not yet faced anything even remotely resembling an international campaign of “maximum economic pressure.” The continuing stability of North Korea’s foreign-exchange rate and domestic food prices pointedly suggests that international sanctions have not yet greatly impacted North Korea. But few foreign-policy experts, and even fewer general readers, seem aware of how flimsy the array of sanctions imposed on North Korea by the UN and U.S. during the George W. Bush and Obama years actually was.
Consider first the successive rounds of UN Security Council sanctions lodged against the regime since its first atomic test in 2006. China and Russia flagrantly and routinely violate the very sanctions their own Security Council representatives voted to impose. Most countries around the world still ignore them, too. In early 2017, the UN’s Panel of Experts on the sanctions reported that 116 of the UN’s 193 members had not yet bothered even to file implementation reports on the then-latest round (UNSC 2270, levied in response to Pyongyang’s fifth nuclear blast). The previous year, the Panel noted that 90 countries had never reported on any of the sanction resolutions against North Korea (eight at that time, the first of them ratified a decade before that report). And filing a report on these sanctions resolutions is not the same thing as enforcing them. Several countries with whom Washington enjoys ostensibly friendly relations have turned a blind eye to illicit North Korean activities on their soil for many years (Malaysia, Singapore, and some of the Gulf States being among the more egregious examples).
When it comes to Washington’s own economic measures, furthermore, North Korea is still far from being “sanctioned out,” no matter the received wisdom. In the final year of the Obama administration, according to Anthony Ruggiero of the Foundation for Defense of Democracies, fewer entities and individuals from North Korea were under U.S. Treasury Department sanction than those from seven other countries, including Zimbabwe and Sudan. While the Trump administration has been much more serious about sanctioning North Korea, Ruggiero testified that as of late summer 2017, North Korea nonetheless remained less sanctioned than either Syria or Iran. For some mystifying reason, moreover, North Korea was not put back on the State Department’s list of “state sponsors of terrorism” until the end of 2017, after enjoying a nearly decade-long holiday off that roster.
As 2018 commences, three big changes augur well for the prospect of devastating “shock and awe” sanctions against the North Korean system. First: At the end of 2017, the Security Council endorsed a broad new writ and scope for sanctions against North Korea, dispensing with the earlier “marksman” approach of picking off particular military-related firms or individuals and embracing instead the “blockbuster” approach of crippling North Korea’s entire military-industrial complex. The new sanctions, among other things, ban all industrial imports by North Korea, severely cut permitted energy imports, and require UN member governments to “seize, inspect, and freeze” vessels violating some of the new restrictions.
Second: In late 2017, the U.S. Treasury announced new and much more sweeping authority for North Korea sanctions, granting U.S. officials wide discretion to impose what are known as “secondary sanctions.” Henceforth any business or person engaging in any kind of commercial or financial transactions with North Korea could be severely penalized, with punishments including fines, seizure or forfeiture of assets, prohibition against any commerce in or with the U.S., and being cut off from the worldwide clearing system for dollar-based financial settlements.
Finally, and by no means unrelated to these other changes, is the third change: the advent of the Trump administration. Under President Trump and his team, there appears to be a qualitative change in America’s North Korea policy—one that accords the North Korean threat a higher priority, and more unblinking attention, than it has been granted by any of Trump’s predecessors. The White House calls this new approach to North Korea a policy of “maximum pressure.”
Six: The American Role
Trump’s address before South Korea’s National Assembly last November on the North Korea problem was the most incisive, and moving, statement on the topic ever delivered by an American president. Whatever else may be said of him, Trump is keenly aware that the North Korean threat he inherited was allowed to fester and worsen under each of the four men in the Oval Office immediately before him. He appears to have no intention of continuing that tradition.
The Achilles’ heel of the North Korean economy—and thus, of Pyongyang’s nuclear and missile programs—is its existential dependence on foreign aid and outside money. The fortress-prison country is an operation that cannot be sustained on its own. To date, North Korea has skillfully extracted wherewithal and extorted financial concessions out of a largely unfriendly world. To jam the gears of the North Korean war machine, the international community must recognize, and finally begin systematically exploiting, Pyongyang’s unique economic weakness. This will require a campaign of economic pressure worthy of the name—and the pieces for such a campaign are already falling into place.
In broad strokes, what would this “maximum economic pressure” campaign look like? It must be Washington-led, since it will not coalesce spontaneously. To carry it out most effectively, diplomacy will be crucial: Alliance coordination and the building and maintenance of motivated coalitions are obvious force multipliers for this exercise. But the U.S. has unique international strengths that allow us to act unilaterally and with great consequence when necessary.
For starters, now that we ourselves have relisted North Korea as a state sponsor of terrorism, we have a stronger case for pressing governments around the world to shut down the regime’s embassies, trade missions, and other facilities located on their soil. Not necessarily to sever diplomatic ties, much less end all communication, with Pyongyang: just to deprive North Korea of safe havens for their illegal rackets on foreign shores. Given North Korea’s standard operating procedure overseas, affording Pyongyang an embassy in one’s country is like offering diplomatic immunity to the Mafia. The Trump administration has begun some of this advocacy already and has some initial results to show for its troubles. In conjunction with a consortium of like-minded states (including Japan), a full-court press could gain true international momentum. At the very least, this would disrupt some of North Korea’s illegal rackets and reduce the take from them.
Washington can also take the lead in lobbying governments to shut down the North Korean work crews operating within their own countries—these are too close to slave labor for comfort. This need not be quiet diplomacy. The complicit governments in question, including Beijing and Putin’s Kremlin, deserve to be called out publicly if they are intransigent. (The wording of the latest round of Security Council sanctions calls for shutting down such arrangements within 24 months, an amendment Moscow negotiated for—but there is no reason that the U.S. or independent human-rights groups should not try to speed up that timetable.) The U.S. also has options for penalizing trading partners who violate internationally recognized labor standards, which is to say we can affect the cost-benefit calculus for governments that tolerate North Korea’s odious practices in their own backyards.
This brings us to a rather larger diplomatic task: confronting China and Russia about their continuing financial malfeasance on North Korea. The scope and scale of China’s furtive support for North Korea dwarf Russia’s, of course—but that is no reason to give the Kremlin a pass. These two states have long been playing a double game—one that must come to an end starting now.
Seven: The Russians and the Chinese
Contrary to some hand-wringing in Washington and elsewhere, the U.S. is by no means devoid of options in facing down China and Russia for their economic enablement of the Kim family regime. As already noted, Washington possesses an extraordinarily powerful tool for inducing their compliance: the U.S. dollar—the most important reserve currency in the world economic order. America gets to decide who can, and who cannot, engage in dollar-denominated financial transactions with the myriad correspondent banks serving the globe, for which the Federal Reserve Bank is the clearing house. Existing legislation and executive orders already provide the U.S. government with a panoply of instruments for inflicting nuanced and escalating economic penalties and losses on financial institutions, corporations, and private individuals who rely upon U.S. correspondent banks but engage in illegal or forbidden commerce with North Korea.
So far, the United States government has deployed only minor, pinprick secondary sanctions against Chinese and Russian parties that violate restrictions on dealings with North Korea. Should we choose to impose broader penalties, both nations face potentially major economic costs if they do not address and control such violations.
It is no secret, for example, that the Chinese banking system is highly leveraged and that some of China’s largest banks are in what we might call a financially delicate situation. Does Beijing really want to find out whether one of these major concerns can survive a Treasury Department-Justice Department inquiry for North Korea infringements, much less the weight of actual secondary sanctions—or to find out what happens at home and in international financial markets if it looks as if a major Chinese bank might fail on that account?
If the Kremlin and Beijing believe we mean business, they will have reason to suppress illicit deals with North Korea—but convincing them we mean business is our responsibility. Washington has been curiously hesitant here, possibly for fear that Beijing or the Kremlin, or both, would respond by sabotaging any further UN sanctions. But we now have pretty much what we need from UN resolutions for a campaign of “maximum economic pressure” on North Korea—so the time for horse-trading and slow-walking is over. And while we are at it, these governments’ official economic support for North Korea shouldn’t be off the table. Isn’t it time to spotlight and track those flows, too?
As we work to rein in China and Russia, we should not lose sight of the money that North Korea receives through arrangements with other governments—including states in Africa and the Middle East that receive U.S. foreign aid. Yet much of what Washington needs to know to wage this economic campaign, alas, is currently unknown. This is a failure of our intelligence community that must be immediately addressed if “maximum economic pressure” is to stand a chance of ending up as more than just a slogan.
By the very nature of intelligence activity, spy agencies cannot take credit for many of their successes. But the U.S. intelligence community doesn’t deserve a slap on the back for its performance in this particular area. It should be something of an embarrassment, for example, that some of the best work mapping out the connections between Chinese front companies and the North Korean military these days should apparently come from a small think tank, C4ADS, that relies entirely on open sources. And that is just one small example of intelligence insufficiency. Our government also appears to know much less than it should about the financial relations between Pyongyang and its backers in Tehran, North Korea’s money ties with terrorist groups, and its adventures in crypto-currencies and other harder-to-trace instruments of finance.
Much of what is currently unknown—by our government—about North Korea’s covert international financial networks and overseas holdings is in fact knowable, given better legwork and intelligence. The story of the U.S. government’s interagency Illicit Activities Initiative (2001–6), which methodically mapped out North Korea’s money trails before being derailed by bureaucratic infighting under the George W. Bush administration, provides an “existence proof” that such research can be done. North Korea’s overseas financial networks have had more than a decade since the demise of IAI to evolve and hide their tracks—so a new IAI-style effort would have to play catch-up.
With the information we could gather from a well-funded and coordinated intelligence initiative, we can help shut down North Korea’s worldwide criminal enterprises, arrest their international accomplices, freeze and seize violators’ overseas assets (not just Kim Jong Un’s assets: think Iran, Syria, Hezbollah, and the rest), and levy potentially devastating fines against commercial and financial concerns that willfully aid North Korea in violating the law. We can also improve the efficacy of existing proliferation-security efforts.
With better intelligence, better international coordination, and the will to get the job done, an enhanced “maximum economic pressure” policy could swiftly and severely cut both North Korea’s international revenues and the vital flows of foreign supplies that sustain its economy. An enhanced Proliferation Security Initiative (PSI), indeed, could use interdiction not only to monitor the goods entering North Korea but also to regulate and, as necessary, suppress that flow. (UN sanctions, by the way, make provision for humanitarian imports into North Korea, a matter the U.S. and others must attend to faithfully.) Yes, this is economic warfare, and it can be conducted with much more sophisticated tools than were available in the 1940s. In fact, it should be possible through such a campaign to send the North Korean economy—and the North Korean military economy—into shock, possibly even in fairly short order.
Eight: Success and Its Failures
If comprehensive sanctions and counter-proliferation against North Korea fail, we enter into a new world with darker and much less pleasant options. But what if, by some measure, they turn out to succeed? What then?
In addition to their intended consequences, successful policies always have unintended ones. Three potential consequences of an effective economic-pressure campaign against the North Korean regime deserve special consideration in advance.
The first concerns the role of North Korea’s donju elite in a future where North Korea is increasingly squeezed economically. These “money masters,” who until now have enjoyed waxing wealth and have lived with rising expectations under Kim Jong Un, would stand to suffer very sharp financial loss. What would a serious reversal in the fortunes of this privileged element in North Korean society mean for elite cohesion and for regime dynamics? Even North Korea has domestic politics. Poorly as we may be able to appraise North Korean politics, it would behoove us to try to understand in advance how such a change would alter the realm of the possible within the country—and what new opportunities such internal developments might present.
Second is the all-too-likely possibility that North Korea would careen back into famine under an effective sanctions campaign—and not because Pyongyang would be incapable of purchasing or procuring sufficient food to feed its populace. The reason North Koreans starved last time was the government’s dreadful songbun system, still very much in force today. Songbun is a unique North Korean instrument of social control that carefully subdivides the North Korean populace into “core,” “wavering,” and “hostile” classes, lavishing benefits and meting out penalties according to one’s station. Life chances in North Korea—and no less important, death chances—turn on one’s assigned class. Just as it is a safe bet that virtually no one outside the “core classes” has amassed great donju riches, so too death from starvation is almost entirely consigned to the state’s designated enemies from the “hostile classes.” Only “intrusive aid” (provided on site by impartial outsiders) and public diplomacy, including calling out Dear Respected on this vile practice, stand to mitigate the toll of the impending humanitarian-cum-hostage crisis should “maximum economic pressure” work.
Finally, there are the countermeasures Pyongyang will surely adopt if the economic-pressure campaign is attaining a measure of success. These will be intended to terrify and to break the will of the sanctioners. North Korean leaders are practiced masters of white-knuckle, bared-fang diplomacy—and they would naturally regard the stakes in this contest as particularly high. No national directorate is so expert in brinkmanship or so consummate at carefully gaming through seeming “outbursts” well in advance.
North Korea will test the stomach and the will of the pressure alliance, threatening what it sees as the campaign’s weakest and most exposed elements and ranks. These probes and tests may be military in nature, with a range of options that could well include threats of nuclear war. Pyongyang will try to make Washington and the international community fear that they are facing a “Japan 1941 moment,” with a cornered Kim family regime: a déjà vu of the drumroll that led to World War II in the Pacific, only this time against a nuclear-armed adversary.
This would be a point of incalculable danger. There are good reasons to think North Korea would not resort to first use of nuclear weapons, most compelling among them, its own state-enshrined doctrine known as “Ten Principles for the Establishment of a Monolithic Ideology.” (The essence of this doctrine: The Hive must keep the Queen safe, and at all cost.) But there is no sugarcoating the terrible risks, including risks of miscalculation, inherent in North Korea’s most likely countertactics.
Any way you look at it, North Korea’s adversaries are in for a long and bumpy ride. The alternative to thwarting North Korea’s war drive now is permitting Pyongyang to prepare to fight and win a limited nuclear war in the future, at a time and place of its own choosing, when the situation for America and her allies may be even more perilous.
Like it or not, Pyongyang plays for keeps, and we are in this with them for the long game. The next move is ours.
1 Full disclosure: I am one of those who seriously underestimated North Korea’s resilience in the 1990s. Twenty years ago, I would have thought it almost unimaginable for the North Korean state to survive to this day. Needless to say, subsequent events have proved otherwise, and studying my own mistakes has led to the analysis under way here.
2 Joan Robinson, “Korean Miracle,” Monthly Review, Vol. 16, No. 8 (January 1965), pp. 541–549.
3 Central Intelligence Agency, Korea: The Economic Race Between the North and the South: A Research Paper, ER 78-10008, January 1978.
4 Kim Il Sung, Works, Vol. 31 (Pyongyang: Foreign Languages Publishing House, 1987), p. 76.
5 Nicholas Eberstadt and Judith Banister, The Population of North Korea (Berkeley, CA: University of California, 1992).
6 Kim Il Sung, Selected Works, Vol. 5 (Pyongyang: Foreign Languages Publishing House, 1972), p. 431.
7 On this man-made, and completely unnecessary, tragedy, see Stephan Haggard and Marcus Noland, Famine in North Korea: Markets, Aid and Reform (New York: Columbia University Press, 2007).
8 Justin V. Hastings, A Most Enterprising Country: North Korea in the Global Economy (Ithaca, NY: Cornell University Press, 2016).
9 Perhaps the best analysis of this transformation is Kim Byung-Yeon, The North Korean Economy: Collapse and Transition (New York: Cambridge University Press, 2017).
As I write, Michael Wolff’s Fire and Fury has become a mere husk of a book, emptied of everything consumable and tasty. And it’s only been out a week! In the hinterlands, the book is selling briskly, but here in Washington, we already find ourselves in the final phase of a mass hysteria, a hangover that we would call the Woodward Detumescence.
Woodward is Bob Woodward, of course. Every few years, for more than 30 years, Woodward has sent Washington reeling with a book-length, insider account of one administration after another, presenting government as high drama, with a glittering cast of villains and heroes.
The sequence of the symptoms seldom varies. First comes the Buildup. We hear premonitory rumblings: Freshly minted Woodward revelations are on the way! His publisher declares an embargo on the book, mostly as a tease. Another reporter writes an unauthorized report guessing at what the revelations might be. Washington can scarcely breathe. At last the first excerpts appear in a three-part serial in Woodward’s home paper, the Washington Post.
We enter the Swoon.
The excerpts tell of betrayals and estrangements, shouting matches and tearful reconciliations, tough decisions and disappointing failures of nerve, all at the highest levels of government. Woodward goes on TV shows to explain his findings. Sources attack him; he stands by his book. The frenzy intensifies, the breathing is labored, until, at last, comes the Spasm, as all the characters from the book refuse to comment on a “work of tabloid fiction.”
Then the newspaper excerpts end, there is a collapsing sigh, a dying fall, and the physical book, the thing itself, appears. The text seems an afterthought, limp as a wind-sock and, by now, even less interesting. If there were more revelations to be found in its pages, after all, we would have read them already. We skulk back to the routines of what passes for normal life in Washington, slightly abashed at our momentary loss of self-control. This is the Woodward Detumescence. Shakespeare foresaw it in a sonnet: “the expense of spirit in a waste of shame.”
The Fire and Fury frenzy omitted some of these steps, prolonged others. It was touched off by an excerpt in New York, appearing a week before the book’s original publication date. Running to roughly 7,000 words, the excerpt was densely packed and so juicy it should have come with napkins. The article’s revelations about White House backbiting and self-loathing are by now universally known, and have been from the moment the excerpt hit the Web. One thing they make plain is that Michael Wolff bears little resemblance to Bob Woodward. Over a long career, our Bob has shown himself to be a tireless and meticulous reporter. He is a creature of Washington, besotted by government; Woodward never found a briefing paper he wouldn’t happily read, as long as it was none of his business.
Wolff, on the other hand, is an incarnation of Manhattan media. He’s a 21st-century J.J. Hunsecker, the gossip columnist in the great New York movie Sweet Smell of Success, although, unlike J.J., he has a pleasing prose style and a sense of irony. His curiosity about the workings of government and the shadings of public policy is nonexistent. “Trump,” Wolff writes with typical condescension, “had little or no interest in the central Republican goal of repealing Obamacare.” Neither does Wolff. Woodward would have given us blow-by-blow accounts of committee markups. Wolff mentions Obamacare only glancingly, even though it was by far the most consequential failure of Trump’s first year.
If you want to learn how Trump constructs that Dreamsicle swirl that rests on the top of his head, or the skinny on Steve Bannon’s sartorial habits, then Wolff is your man. He tries to tell his story chronologically, but he occasionally runs out of things to say and has to vamp until the timeline lets him pop in a new bit of shocking gossip. Early in the book, for example, after he has established that Trump is reviled and mocked by nearly everyone who works for him, Wolff leads us into a tutorial on The Best and the Brightest, David Halberstam’s doorstop on the 1960s White House wise men and whiz kids who thought it would be a great idea to get in a land war in Southeast Asia. He calls Halberstam’s book a “cautionary tale about the 1960s establishment.” Wolff’s chin-pulling goes on for several hundred words. Apparently, Steve Bannon had had the book on his desk.
This is interesting, I guess, and so are the excessive digressions about New York real estate, Manhattan’s media culture, the evolution of grande dames into postfeminist socialites, and many other subjects that are orthogonal to the book’s purpose. If you’ve bought Fire and Fury, chances are you wanted to learn things you didn’t know about the first year of the Trump administration. The New York excerpt was chockablock with such stuff, told in sharply drawn scenes and vivid, verbatim quotes. But the book dwells much more on general impressions, flecked here and there with scandalous asides. In these longueurs—most of the book—Wolff writes at an odd remove, from the middle distance. The prose loses its immediacy and becomes diffuse.
He’s not so much padding his book as filibustering his readers, perhaps hoping to deflect a reader’s attention from another revelation: He really hasn’t delivered the goods. All of Wolff’s most scandalous material was filleted and packed into the New York excerpt. Listening to discussions among friends and colleagues, I keep hearing the same items, all from the magazine: Staffers think Trump might be (literally) illiterate, Steve Bannon thinks the Mueller investigation puts Trump’s family in legal jeopardy, the president uses vulgar language when talking about women. He is a child, Wolff wants us to know, and the disorder of his government is directly traceable to that alarming fact.
And it is indeed alarming, but nobody who has followed Trump’s Twitter feed or watched his news conferences will think it’s news. Wolff wrote a scintillating 7,000-word magazine article; the problem is that he spread it over a 328-page book. The rumor has gone around (hey, if he can do it, so can I) that before submitting his manuscript, Wolff warned his publisher that it didn’t contain much that was new.
This explains a lot. Wolff clearly was unprepared for the explosion set off by the magazine article. You could see it in his halting explanations of his journalism techniques. When his quotes were questioned, he let it be known that he had “dozens of hours” of tapes. (Other news reports inflated the number to hundreds.) When quotes continued to be questioned, he was asked, by colleagues and interviewers, to release the tapes. He refused. Wolff said his book threatens to bring down the president—on evidence that he alone has and won’t produce.
Spoken like a true journalist! Much has been made of this modern Hunsecker’s techniques. One explanation for the candor of his sources is that Wolff gained their confidence by misleading them about his intentions; they had concluded he was writing a book that would show the administration in a kinder light. “I said what I had to to get the story,” he proudly told one interviewer. Many of his colleagues in the press have shrugged at his willful misdirection—his deception, in fact—as a standard trick of the trade.
They’re probably right. But in shrugging, they demonstrated again the utter detachment of journalists from normal life. Whole professions are generally and rightly maligned—trial lawyers, car salesmen, lobbyists—because ordinary people see that prevarication is built into their work. When it comes to the people who write the books they read, readers have a right to ask how far the deception goes. If a writer will mislead his sources, how can we be sure he won’t do the same to his readers?
“My evidence is the book,” Wolff responds. I’m not sure what he means. In any case, as the Detumescence recedes, it becomes clearer that his evidence is thin. The book isn’t particularly good journalism, but it’s a triumph of marketing. Our Trump hatred has been targeted with such precision that we’ll lower any standard to embrace Fire and Fury, even if the tale as told signifies nothing, or nothing much.
An uncontroversial museum still manages to offend the ignorant
At one point during his 2000 campaign, George W. Bush gave his listeners a folksy admonition: “Don’t be takin’ a speck out of your neighbor’s eye when you got a log in your own.” This amused Frank Bruni of the New York Times, who called it “an interesting variation on the saying about the pot and the kettle.” Bruni’s words in turn amused the substantial portion of Americans who knew that Bush was actually quoting Matthew 7:3. To them it was simply unimaginable that someone could graduate Phi Beta Kappa with a degree in English and subsequently study at the Columbia School of Journalism, as Bruni did, without having once encountered the Sermon on the Mount. The anecdote revealed the extent to which, in the space of a few generations, America went from habitual Bible reading to biblical illiteracy, and of the most abject and utter kind. This is the justification for the Museum of the Bible.
The Museum of the Bible, which opened in Washington, D.C., in November, is an enterprise of appropriately pharaonic ambition. At a capacious 430,000 square feet, it cost half a billion dollars to build, all of it contributed privately. It is the brainchild of Steve Green, the president of Hobby Lobby, the chain of arts-and-crafts supply stores that successfully challenged the contraception mandate of Obamacare. Indeed, to those who felt the Burwell v. Hobby Lobby decision was a catastrophic setback to the separation of church and state, the coming of the Museum of the Bible seemed nothing less than the physical manifestation of that threat—an unwelcome expression of evangelical political power standing in plain sight of the Capitol. Burwell v. Hobby Lobby has loomed large in the coverage of the museum, as has the Green family—as well as the $3 million fine levied on Hobby Lobby for illegally importing cuneiform tablets from Iraq.
But those who looked forward to exposing the museum as a bigoted and ignorant enterprise, with a laughably literal view of biblical truth, have been bitterly disappointed. Its exhibitions are conspicuously even-handed and scholarly, and not at all sectarian. The Museum of the Bible is no vehicle of theological indoctrination. If anything, it errs in the other direction. When it was first incorporated as a nonprofit organization in 2010, it pledged itself “to inspire confidence in the absolute authority and reliability of the Bible.” It has quietly lowered its sights since, and now seeks only “to invite all people to engage with the history, narrative, and impact of the Bible.” This makes the museum less objectionable (who can object to an invitation?), but a less incendiary Bible is also a less interesting one. The danger of the Museum of the Bible is that by sidestepping the question of biblical truth it might downgrade the Good Book, as it were, into one of the Great Books.
With all their resources, the Green family might easily have commissioned a celebrity architect to build a prodigy of a museum. But they did not want a building that would compete with its contents. Instead, they bought a 90-year-old cold-storage warehouse two blocks south of the Mall, and into its windowless brick shell they inserted six stories of exhibition and administrative space. The interior is intelligently planned but hardly remarkable, and nothing about its materials, finishes, or details speaks of the Bible or antiquity. If anything, it has the glossy impersonal cheeriness of contemporary hotel architecture.1
The heart of the museum is in the exhibitions of the third floor (The Stories of the Bible) and the fourth (The History of the Bible). These are utterly different in texture and tone, but they work in tandem—one delivering sensation and the other information. This is hardly a new distinction; it is the difference between the stained-glass window and the sermon.
The Stories of the Bible are told through crowd-pleasing “immersive” galleries—the fashionable term for displays in which a coordinated battery of sound effects, musical cues, dramatic lighting, and moving forms are combined to induce an overwhelming sensory experience in the viewer. These were devised by BRC Imagination Arts, a design firm that specializes in corporate branding—as they put it, in “creating emotionally engaging experiences that generate lasting brand love.” When it comes to emotionally engaging material, the exhibits Genesis and Exodus offer at least as much as the Heineken Experience (another recent BRC creation), and here the designers have outdone themselves. Noah’s Ark presents, they tell us, “a unique, stylized representation of the great flood”:
“Stacks of boxes tower over them. Inside each box are artistic representations of animals—two by two—lit by flickering candlelight. Guests hear the raging of the storm outside and the creaking of the wooden ship.”
Somewhat later, although not until they have seen “a hyssop bush bursting into flames from the story of Moses,” visitors themselves can part the Red Sea, or an abstraction thereof, created by a web of taut metal cables shimmering under blue light. (It is curious how the highly cinematic events of the Hebrew Bible lend themselves to abstract expression.)
By contrast, the World of Jesus is rendered in literal terms, by means of a realistic re-creation of a first-century village complete with actors in period costume. In the Galilee Theater, visitors can watch a short film and see John the Baptist confronting King Herod (as played by John Rhys-Davies). Even those of us who are allergic to historic reenactments will see that it is carried through with extreme competence and attention to detail. What is there is done well; it is what is not there that has caused a good deal of quiet grumbling. To the bafflement of many, the central events of the Christian Bible—the Crucifixion and Resurrection—are not represented. Were there fears that a scene of unspeakable horror would disturb the museum’s upbeat, family-friendly ambiance? Or is it that its academic advisors come from the mainstream of contemporary Biblical studies, for whom the Resurrection is not a truth but a trope? Perhaps both factors are at play.
Another curious aspect of the display, though unhappy, is understandable: The Hebrew and Christian Bibles are rendered as two segregated and self-contained experiences, and like oil and vinegar, the exhibition paths are not allowed to mix. Unfortunately, the visitor who has waited for the one is unlikely to stand in line again for the other. One can appreciate that the organizers wanted to avoid a linear sequence in which the Hebrew Bible serves as mere prelude to the New, but in the process, the relationship between the two is lost. Surely a compromise might have been found, perhaps with the occasional physical passage between the two, so that the viewer might move back and forth and make his own connections—alas, a proposition that is heretical in today’s world of manipulative museology.
If the third floor gives us the stories in the Bible, the fourth gives us the book itself—not only the text but its translations, copies, orthography, printing, binding, illustrations, and all else that is associated with a literary artifact. The oldest objects here (although of disputed authenticity) are tiny fragments of the Dead Sea Scrolls, and from them to the most recent translations, one is struck by the fastidious probity with which the text was transmitted. Here we learn the high stakes of tampering with the Bible in the story of how the 14th-century theologian John Wycliffe was posthumously excommunicated for daring to make the first English translation. We also learn how the Bible acted to codify and order regional dialects into a national language; Martin Luther’s translation did this for the German language just as the King James translation did a century later for English. A remarkable display shows the innumerable phrases from the Bible that have entered vernacular speech in the world’s languages, some of which I did not know (e.g., “den of thieves,” “suffer fools gladly,” “at their wit’s end”).
Here one senses a certain reservation—a curatorial suspicion, perhaps, that vellum manuscripts and printed books are intrinsically boring. There is nothing an exhibition designer fears more than a bored visitor. This would account for the rather plaintive effort to provide visual relief in the form of arresting objects: a facsimile of the Liberty Bell with its inscription from Leviticus, a tableau of books burned by the Nazis, and statues of Galileo and Isaac Newton. These diversions suggest that the designers did not trust the words themselves and their hotly disputed variants and interpretations to generate interest on their own.
This is a lost opportunity. For instance, the history of the English translations would have been far more effective with a comparison of representative examples. One might illustrate various renderings of the 23rd Psalm, juxtaposing the lapidary King James version (“The Lord is my shepherd; I shall not want”) with the explanatory translation of the International Standard Version (“The Lord is the one who is shepherding me; I lack nothing”) or the willful flatness of the Good News Bible (“The Lord is my shepherd; I have everything I need”). A few examples from the recent push to purge the Bible of any and all sexist language would also have been eye-opening. To refer to this trend blithely in passing, as the wall labels do, without confronting the viewers with the sobering reality of a gender-neutral Bible is a sign of either haste or indifference.
And those who are not fascinated by the fact that the neuter possessive its appears just once in the entire King James translation still have the chance to take a peek at Elvis Presley’s personal copy of the Bible.
The truth is, the Museum of the Bible is as innocuous, gregarious, multifaceted, and congenial an institution as one might have hoped. It certainly does not preach biblical inerrancy; the attentive reader will see that Noah’s flood is anticipated by the much older flood story in the Epic of Gilgamesh, complete with divine instructions on building the ark.
Nonetheless, the museum has been greeted with extraordinary hostility, although of a strangely unfocused sort. It has hardly been “dogged by scandal,” as Business Insider charged, apart from the importation of antique materials with a false provenance (something of which the Metropolitan Museum of Art and the Getty Museum have both been guilty). The real objection is not its business practices or its theology (which it wears so lightly as to be invisible), but rather that it comes from the wrong side of the cultural tracks. One has the sense that the museum is a social faux pas, that the wrong guests have crashed the party, blundering uninvited into Washington and violating rules of which they are ignorant. CityLab, the digital magazine of the Atlantic, expressed this attitude most pithily when it called the museum “pure, 100 percent, uncut megaplex evangelical white Protestantism…megachurch concentrate.”
The charge that the museum presents a narrow and exclusively white version of Protestantism is undercut by a single visit; the audience is comprehensively ecumenical and international. But it has been repeated endlessly nonetheless, in part because of the recent publication of Bible Nation: The United States of Hobby Lobby, by Candida R. Moss and Joel S. Baden—a furiously ambitious attempt to discredit the museum, its theology, its founders, and Hobby Lobby itself. (This may be the first time a book has been published condemning a museum before it was built.) Moss first came to public attention in 2013 with The Myth of Persecution: How Early Christians Invented a Story of Martyrdom, which charges early Christians with forging accounts of their suppression. Bible Nation is written in a similarly debunking spirit. For her, the “thousands of fragments of contradictory material” in the Bible make it pointless to try to make of it a coherent or meaningful document. The insights of contemporary biblical scholarship, she says with conspicuous exasperation, ought to be “a faith killer.”
Clearly they have been for her. But if anything, the museum’s fourth floor testifies to the opposite: This is a building built by believers for whom the analysis of the materials contained within is a noble task. The curators have taken painstaking efforts to get it right, as did those scribes who through the millennia worked to reconcile the discrepancies, to choose among the contradictory variants the ones that are most rigorously supported. And where the conflicting documents are irreconcilable—as between the two opening chapters of Genesis, or between the four Gospels—the procedure has always been to preserve multiple sources rather than impose an arbitrary uniformity. In the end, the Museum of the Bible pitches it about right.
Its greatest surprise is that it makes no truth claim. The central propositions of the Hebrew Bible (God’s covenant with his chosen people) and the Christian Bible (Christ’s Resurrection) are subordinated to the existence of the Books that carry those propositions. One might imagine that a museum devoted to other monumental culture-shaping books, say The Iliad and The Odyssey, would look similar in approach.
And of course the museum’s creators are right to refrain. The place to make claims to truth in these cases is a church or synagogue, not a museum. But even the lesser claim that the Museum of the Bible makes, that the Bible is a foundational document of our civilization, is to many an unwelcome one. And as biblical ignorance grows, the claim grows progressively more unwelcome. The Bible seems to be one of those books that the less people know about it, the less they like it. And for those who know it only as a “Bronze Age document” (one of Christopher Hitchens’s favorite epithets) and from some of the livelier passages in Leviticus, it is an offensive absurdity.
Writing in the Washington Post, the novelist and art historian Noah Charney asserted that “in Washington, separation of church and state isn’t just a principle of governance, it’s an architectural and geographic rule as well.” It’s unclear who established such a rule, and in any case, the “principle” of the “separation of church and state” does not originate in the Constitution. Rather, its source is to be found in Matthew 22:21: “Render therefore unto Caesar the things which are Caesar’s; and unto God the things that are God’s.” We all carry a stock of mental habits and moral values, and a language with which to express them, that ultimately derives from the Bible, whether we have read it or not. The Museum of the Bible merely proposes that we read it. And for all its shortcomings and missed opportunities, and all its fits of cuteness (there’s a Manna Café), it does so with refreshing sincerity and surprising effectiveness.
1 The building has one passage of real brilliance. The entrance portal on Fourth Street is flanked by a pair of immense bronze panels, nearly 40 feet high, that call to mind Boaz and Jachin, the mighty bronze pillars that guarded Solomon’s Temple. In fact, they are panels of text inscribed with the opening lines of Genesis, as printed in the Gutenberg Bible of 1454, the first mass-produced book to use moveable metal type. The letters are reversed, confusingly, until one realizes that this aids in making souvenir rubbings that themselves embody the printing process. The genesis evoked here is that of universal literacy and the cultural transformation wrought by the printed book.
Review of ‘(((Semitism)))’ by Jonathan Weisman
Now, two years later, Weisman has published a book about anti-Semitism—and, more specifically, about the supposedly grave threat to Jews springing from the alt-right and the Trump administration. (((Semitism))), for such is the book’s title, suffers from two grave ills. First, Weisman believes that political leftism and Judaism are identical. Second, he knows little or nothing about the political right, in whose camp he places the alt-right movement. Combine these two shortcomings with a heavy dose of self-regard, and you get (((Semitism))): a toxic brew of anti-Israel sentiment, bagels-and-lox cultural Jewishness, and unbridled hostility toward mainstream conservatism, which he lumps together with despicable alt-right anti-Semitism.
According to Weisman, Judaism derives its present-day importance from the way it provides a religious echo to secular leftism. This is his actual opening sentence: “The Jew flourishes when borders come down, when boundaries blur, when walls are destroyed, not erected.” Thus does he describe a people whose binding glue over the millennia is a faith tradition literally designed to separate its adherents from those who are not their co-religionists.
This ethnic-Jew-centric perspective leads Weisman to reject not merely Jewish observance, which he finds parochial and divisive, but the tie between Judaism and Israel, the subject of a chapter subtly titled “The Israel Deception.” He laments: “The American Jewish obsession with Israel has taken our eyes off not only the politics of our own country, the growing gulf between rich and poor, and the rising tide of nationalism but also our own grounding in faith.” He sneers at Jews who promote the “tried and true theme of the little Israeli David squaring off against the giant Arab Goliath.” Weisman believes, like John Mearsheimer and Stephen Walt, that members of both parties are guilty of “kissing the ring” at AIPAC, of “turn[ing] to mush when the subject was Israel.” In fact, Weisman says, the anti-Semitic BDS movement on college campuses “is worrisome as much for what it says about the American Jew’s inextricable links to Israel as for what it says about anti-Semitism.” In his view, “Barack Obama was the apotheosis of liberal internationalism.…The Jew thrived.”
Thus Weisman has this to say about his infamous Iran-deal chart: “I had my own brush with fratricidal Jew-on-Jew violence during that heated debate.” Was Weisman attacked? Assaulted? No, he received some nasty notes in response to running a chart. Weisman says he found the uproar “absurd” and laments that he is “still hearing about it.” Poor lamb.

Weisman gets it right when he writes about the mainstreaming of the alt-right—the winking and nodding from Breitbart News and Donald Trump himself, the willingness of many in the mainstream to reward alt-right popularizers like Milo Yiannopoulos. (I left Breitbart in March 2016 due to differences regarding our coverage of the presidential campaign.) Weisman is at his best when describing the origins of the alt-right and its infiltration of more widely read outlets.
But he can’t stop there. Instead, he seeks to tie the alt-right to the entire conservative movement and builds, Hillary Clinton–style, a fictitious basket of deplorables amounting to half the conservative movement. He cites “Christian fundamentalist” Israel supporters, to whom he wrongly attributes a universally apocalyptic End Times motivation. He condemns anti-immigration advocates, whose opposition to the importation of un-vetted Muslim refugees he likens to anti-Semitic anti-immigrant movements of years past. He reviles “anti-feminists,” those who oppose political correctness in video games, Republican Jewish Coalition members who laughed at Trump making a Jewish joke, and free-speech advocates supposedly engaged in “forcible seizure of the free-speech movement” (a weird charge to level, considering that it cost Berkeley $600,000 to prevent Antifa from burning down the campus when I visited). In other words, pretty much anyone who didn’t vote for Hillary Clinton gets smeared with the alt-right brush, outside of those specifically targeted by the alt-right.
The problem of alt-right anti-Semitism, Weisman thinks, is just a problem of anti-leftism. If we could all just give money to the notoriously left-wing propaganda-pushing Southern Poverty Law Center, watch Trump-referential productions of Eugene Ionesco’s Rhinoceros at the Edinburgh National Festival (yes, this is in the book, and no, it is not parody), ignore anti-Semitic attacks at the Chicago Dyke March (I am not making this up), slap some vinyl signs on synagogues (no, I am still not making this up), and “not get too self-congratulatory” (seriously, guys, this is all real), all will be well. In the end, Weisman’s goal is to build a coalition of ethnic and political groups, cobbled together in common cause against conservatives—conservatives, he says, who represent the alt-right support base.
As the alt-right’s chief journalistic target in 2016, I’m always happy to see them clubbed like a baby seal. And there is a good book to be written about the alt-right. At times, Weisman borders on it, particularly when he seeks to investigate the bizarre relationship between Trump and the trolls who worship him.
But Weisman’s ardent allegiance to leftism leads him to misdiagnose the problem, to ignore the rising anti-Semitism of his own side (the DNC nearly elected anti-Semite Keith Ellison its leader last year), to prescribe the wrong solutions, and, most of all, to react in knee-jerk fashion to the alt-right by flattering himself as the epitome of everything the alt-right hates. Thin as the paper it was printed on, (((Semitism))) is a failure of imagination.
Review of 'The People vs. Democracy' By Yascha Mounk
The save-democracy writers have generally taken two tacks in answering it. Some see a simple replay of the previous century: The West’s authoritarian spirit has resurfaced, they say, and seduced the multitudes once more. It is up to heroic liberals to fight back, as their forebears in the 1940s did. But others have tried to trace today’s crack-up to liberal missteps or even to flaws in the liberal-democratic idea. This is a more useful avenue for those of us concerned with the preservation of self-government.
Yascha Mounk’s The People vs. Democracy wants to be the latter kind of (subtle, thoughtful) book but too often ends up making the cruder arguments of the former. The author, a lecturer on government at Harvard, argues that while liberals took liberalism’s permanence for granted, voters became “fed up with liberal democracy itself.” Elections across the developed world, in which fringe characters and populists routed mainstream establishments, provide the main evidence. Mounk has also collected mountains of public-opinion data, mainly from the World Values Survey, which shows a deeper transformation: People in the U.S. and Europe increasingly reject democratic principles and even hanker for strongman authority.
Fewer than a third of U.S. millennials “consider it essential to live in a democracy.” One in four believes that democracy is a bad form of government. One-third of Americans of all ages now favor some sort of strongman rule, without checks and balances, and one in six would prefer the strongman to don a military uniform. Similarly, a third of German respondents and an astonishing half of those from Britain and France support strongman rule. Parties of the far right and far left are rapidly expanding their appeal, particularly among young people. There are many more depressing statistics of the kind, presented in numerous charts and graphs throughout.
Mounk thinks there are two factors at play in these attitudes. The first is the emergence of illiberal democracy, or “democracy without rights,” as a serious rival to the current order. Vladimir Putin in Russia, Recep Tayyip Erdogan in Turkey, Narendra Modi in India, and Viktor Orbán in Hungary, among others, exemplify this model. Once elected, these leaders chip away at individual rights and independent institutions until democracy is all but hollowed out and it becomes nigh impossible to remove the ruling party from office. Mounk strongly suspects that the Trump administration plans to pull something like this on the American public, though thus far the president’s illiberal bluster has proved to be just that.
The second factor is undemocratic liberalism, or “rights without democracy.” Here Mounk has in mind technocratic liberalism’s drive to remove an ever-growing share of policy decisions from the purview of voters and their elected representatives. This has been necessitated by the complexity of contemporary problems such as climate change and international trade, Mounk contends. Yet rights without democracy has generated mistrust and cynicism. Liberals, he says, should aim to “strike a better balance between expertise and responsiveness to the popular will.”
Mounk’s sections on the damage wrought by undemocratic liberalism should be instructive to his fellow liberals. But conservatives have for years stamped their feet and pulled their hair over the same phenomenon, only to be ignored by elite liberals on both sides of the Atlantic. Right-of-center readers might be forgiven for sarcastically muttering “no kidding” as Mounk takes them on a guided tour of liberal folly.
Conservatives have been warning about administrative bloat, for example, since at least the first half of the 20th century. It turns out that they had a point. Writes Mounk: “The job of legislating has been supplanted by so-called ‘independent agencies’ that can formulate policy on their own and are remarkably free from oversight.” Ditto activist judges: “The best studies of the Supreme Court do suggest that its role is far larger than it was when the Constitution was written.” And ditto the European Union’s democratic deficit: “To create a truly ‘single market,’ the EU has introduced far-reaching limitations” on state sovereignty.
He also strikes upon the idea that nations really are different from one another, and in politically significant ways. “After a few months living in England,” the German-born author confesses, “I began to recognize that the differences between British and German culture were much deeper than I imagined.” No kidding. What about the anti-Western monoculture that lords over most college campuses? Here, too, the right was on to something. “Far from seeking to preserve the most valuable aspects of our political system,” Mounk writes, liberal academe’s “overriding objective is, all too often, to help students recognize its manifold injustices and hypocrisies.”
Mounk’s discovery of these core conservative insights, however, doesn’t spur a rethink of his reflexive disdain for conservatives. This is most apparent in his coverage of American politics. The book is supposed to be a battle cry for democracy to rally left and right alike. Yet, with few exceptions, conservatives and Republicans are cast as cynical operators who rely on underhanded tactics and coded racism to undermine democracy and ultimately abet the populists. (Hillary Clinton and Barack Obama receive adulatory treatment.)
He describes Senate Majority Leader Mitch McConnell’s refusal to hold hearings for Merrick Garland, Obama’s final Supreme Court nominee, and GOP filibustering of Democratic legislation as “abuse[s] of constitutional norms” (they weren’t). But he pooh-poohs popular outrage at Clinton’s unlawful use of a private email server and elides the Obama Internal Revenue Service’s selective targeting of conservative nonprofits ahead of the 2012 election.
He also underestimates a third development of recent years—liberal illiberalism (my term, not his)—a liberalism that not only lacks democratic legitimacy but seeks to destroy, in the name of tolerance, the fundamental rights of those who stand in the way of full-spectrum progressivism. This is the kind of liberalism that compels nuns to pay for contraceptives and evangelical bakers to bake gay-wedding cakes, silences conservative speakers on campus, and denounces sushi restaurants as “cultural appropriation.”
Mounk isn’t ignorant of these tendencies, and he wants liberals to ease up (a bit). Yet, because he maintains that the censorious left’s heart is in the right place, he can’t seem to reach the necessary conclusion: that much illiberalism today comes, not from the right, but from ostensibly liberal quarters, and that this says something about the nature of contemporary liberal ideology. The true illiberal villains, for Mounk, are only ever the Modis, Trumps, and Orbáns—plus the troglodytes down South. Well-intentioned liberals who back censorship, he writes at one point, “ignore what would happen if the dean of Southern Baptist University…were to gain the right to censor utterances” he dislikes.
In fact, there is no such institution as “Southern Baptist University.” According to the most recent rankings from the Foundation for Individual Rights in Education, however, four of the 10 worst U.S. colleges for free speech last year were public schools located in blue states, while five were blue-state private or religious schools with longstanding reputations for progressivism (Mounk’s own Harvard among them).
His quickness to frame Southern Baptists as illiberal bogeys is telling and suggests that, for all its exhortations against liberal highhandedness, Mounk’s book comes from the same high-handed place. It colors the author’s approach to questions of nationalism and immigration that are at the heart of the current ferment. He concedes that liberal democracy is compatible with voter demand for limits on mass migration. But he can’t help but attribute those demands to irrational “resentment,” eschewing completely the—perfectly rational—fear of Islamist terrorism.
He sees the nation-state as an “imagined community” to which too many of our fellow citizens remain attached. Ideally for Mounk, the empire of rights and procedural norms would thrive independently of nationhood, civilizational barriers, and sacred communities. For now, he allows, liberals unfortunately have to contend with these anachronisms. His view is an improvement over the liberal transnationalism that is still committed to doing away with borders altogether, even after the popular counterpunch of 2016. Still, why should Poles or Hungarians or Britons remain politically attached to Polish, Hungarian, or British democracy? What is it about Polishness as such that matters to Poland’s democratic character? Mounk has no answers.
No wonder, finally, that the author never satisfactorily links liberalism’s turn against democracy and the rise of illiberal democrats. He can never bring himself to say outright that the one (rights without democracy) is begetting the other (democracy without rights). Liberals, of the classical and the contemporary varieties, badly need a book that offers such uncomfortable reckonings. Yascha Mounk’s The People vs. Democracy is not it.