The Demotion of Man
In the 20th century, the ordered universe broke apart into atoms, and man found himself alone. William Butler Yeats expressed the new sense of breakdown in a famous phrase, “the center cannot hold” (from his poem “The Second Coming”); it is at the heart as well of T.S. Eliot’s “The Waste Land,” surely the best known modern poem, with its no less famous ending, “These fragments I have shored against my ruins.” The American novelist Walker Percy, alluding to Yeats, recently summed up this conception, which has dominated the modern imagination right up to the present, in the following words:
I think it fair to begin with the assumption, which seems fairly obvious, that, as the poet said, the center is not holding; that the consensus, while it might not have disappeared, is at least seriously called into question. Indeed, to judge from a good many contemporary novels, films, and plays, it often appears that the only consensus possible is a documentation of the fragmentation.
The lost consensus to which Walker Percy refers is one which stretched back, changes and permutations notwithstanding, for centuries. The Renaissance scholar E.M.W. Tillyard, in The Elizabethan World Picture (1943), describes the form it took in Shakespeare’s time. As epitomized in Ulysses’s speech on degree in Troilus and Cressida, a great chain of being was imagined, extending vertically in logical, hierarchical steps from the highest to the lowest spheres of the universe. On the earth, rulers held sway over their subjects, fathers over their families, and man dominated the lower animals. The essential order of things was expressed in the harmonies of music and the rhythm and rhyme of poetry.
It was precisely the loss of this sense of order and authority that Eliot and Yeats mourned, a loss that found formal expression in the early 20th century in musical disharmony and free verse. Yet if the 20th century as a whole has been characterized by the conviction that we are living in a fragmented universe, the truth is that a coherent and ordered picture of the world has nevertheless gradually emerged and in the last decades has taken on definable form. The new picture is, in fact, modeled exactly on the old idea of an ordered hierarchical universe, except that the terms have been systematically reversed. Where once man was at the top of this order—only a little lower than the angels—now he is somewhere near the bottom; and where once his authority was taken to be divinely inspired and ordained, now it has come to be seen as a kind of usurpation.
The traditional justification for man’s dominion over the earth, namely his superiority over other creatures, has now been challenged. After all, it is argued, animals, too, communicate and reason, they have emotions, and what is more they are on the whole less destructive than man. And as with animals, so with primitive man and with societies less developed than our own: both are closer to the sources of natural wisdom, and both wreak less damage upon the ecosystem and biosphere than does Western man.
By analogous reasoning, the earliest human societies, with their pre-capitalist forms of organization, are seen as the best. Here, unlike in earlier attempts to celebrate the past over the present, which tended to favor the high achievements of classical civilization, the contemporary world view elevates the simplicities of past times. As for the future, that ultimate expression of the present, it is seen in the new perspective not as a hoped-for culmination but as a looming epoch of decline, a nightmare of overpopulation, cancer, famine, ecocide, nuclear winter.
The work of hierarchy reversal proceeds on many different fronts, and is undeterred even by such seeming obstacles as the Darwinian theory of biological adaptation, with its suggestion of progress upward to an apex that is man. “From a strict evolutionary point of view,” writes one contemporary biologist, Bernard Rollin,
there is no top of the ladder, there is only a branching tree. If there is a “top,” it has to do with adaptability and species durability and reproductive success. In that case, the rat shares billing with us and we both lose to the cockroach.
The unfavorable comparison of man with cockroach has become a familiar staple of biological discussion—although Stephen Jay Gould is ready to take a stunning step even further down the scale:
Evolution is a copiously branching network, not a ladder, and I do not see how we, the titular spokesmen for a few thousand mammalian species, can claim superiority over three quarters of a million species of insects who will surely outlive us all, not to mention the bacteria, who have shown remarkable staying power for more than three billion years.
Here we have very nearly the ultimate demotion of man, the inferior not only of primitive peoples, other mammals, and the cockroach, but even of bacteria.
But there is still more—or, rather, less. It has remained for contemporary poetry to accomplish a fully imagined reduction of man beneath the status of cockroaches and bacteria. In his 1978 survey, “How to Read the Contemporary Poem,” the critic Paul Breslin points out that “the most popular key word in the new poetry” is “stone,” standing for the “furthest things from the human—the least conscious, the simplest, the most innocent.” What stones represent in the contemporary world picture may be gathered from an interview, cited by Breslin, with the Pulitzer prize-winning poet Galway Kinnell. “If you could go even deeper” on the contemporary scale, says Kinnell,
you’d not be a person, you’d be an animal; and if you went deeper still, you’d be a blade of grass, eventually a stone. If a stone could speak, your poem would be its words.
Beyond this, surely, it is impossible to go, though if one considers the matter carefully enough, it is evident that there must be a reverse hierarchy among stones themselves, from the least valuable (diamonds, surely) to the most (perhaps the friable, vulnerable sandstone).
The strength of a world picture may be said to lie in its capacity to sustain itself in the face of contradictions. Thus, some lines written by John Donne a few years after the death of Queen Elizabeth indicate how it must have felt when the world picture of the time came under threat. Alluding to the philosophical repercussions of the idea that the universe might not be earth-centered, Donne observed that the
. . . new Philosophy calls all in doubt
. . . . . . .
‘Tis all in peeces, all cohaerence gone;
All just supply, and all Relation:
Prince, Subject, Father, Sonne, are things forgot.
Something like the confusion registered by Donne manifests itself (though not always with Donne’s frankness) whenever our own contemporary world picture is threatened.
Of course, where post-Elizabethans were disturbed by evidence that the universe and the earth had not evolved in a harmonious fashion, those who hold the new view feel threatened by evidence of the opposite kind, and specifically evidence militating against the theory that the universe originated in a giant explosion—a big bang. This theory, suggesting chance and randomness as the underlying principles of the universe, supplies nothing that can be taken to ennoble man, and therefore offers satisfactions that are fully the equivalents of those the earth-centered universe of Ptolemy offered to the medieval and early-Renaissance minds. When evidence appears that would seem to contradict randomness—and in some fields, such evidence has been accumulating—it is routinely dismissed. In this respect, at least, the contemporary world picture is as efficacious as any that preceded it.
When it comes to geological history, chance and discontinuity have been wrung from the record with missionary zeal. Starting in the early 1970’s, the mysterious disappearance of the dinosaurs came to be explained by a theory with negative implications for man in two ways. Purportedly, the dust from the crash of a meteorite, a sort of earthly big bang, blocked out the sun for a number of years, thereby depriving the dinosaurs of food. It could thus be concluded that man’s survival as a species was due not so much to his adaptability as to the chance escape from catastrophe of his mammalian predecessors. At the same time, the prehistoric catastrophe served as a warning: through a similar blockage of the sun—this time in a nuclear winter brought on by the use of atomic weapons—man could bring about his self-destruction.
In 1985, however, paleontologists marshaled geological evidence that the extinction of the dinosaurs occurred over too long a period of time to be reconciled with a scenario of catastrophe. Then it emerged that most paleontologists—some 96 percent—had never credited the catastrophe theory in the first place, but had felt constrained to remain silent in the face of their opponents’ moral fervor concerning the threat of nuclear war. To such an extent can the need to adhere to a world picture compromise scientific objectivity.
That the demeaning of man in the new world picture rests on a rather tenuous basis in observation has been repeatedly demonstrated by anthropological and ethnographic contretemps similar in their outlines to the fate of dinosaur catastrophism. The deciphering of the ancient Mayan hieroglyphs, for example, has made it possible to describe the life of that people, and incidentally to interpret the symbolic drawings accompanying their hieroglyphic script. The drawings turn out to depict a range of cruel tortures and self-lacerations. “Before going to war, for example,” says a report in the New York Times, “the king would puncture his penis with a stingray spine, while his wife drew a thorn-barbed rope through her tongue.” The Mayans were frequently at war,
in large part to capture aristocrats for torture and sacrifice. If the Maya sacrificed humans in lesser numbers than the Aztecs, against whom they have often been held up as superior, they tortured their victims more viciously. Ancient ball games, like Roman gladiatorial contests, pitted captives against one another for their lives; the heads of losers were sometimes used for balls.
These discoveries are of interest both in themselves and for the light they throw on the scholars who have been surprised by them. As the Times reports, “Evidence of these darker practices has been available for decades in the Maya’s stone reliefs and paintings,” but scholars had explained it all away, interpreting the reliefs and paintings in such a manner as to place the Maya “on a mist-shrouded pedestal as austere, peaceful . . . students of the stars and the calendar.” The pictures, in other words, functioned as virtual Rorschach tests, onto which their interpreters projected a collective wish to find in Meso-American prehistory an indigenous people at once gentle and mystical and accomplished in science and the arts.
Now that a rather different portrait of the Maya has emerged, a scramble is occurring to keep the bigger world picture in place. If, as the Times reports, the “new image” of the Mayans “is less romantic, it is also more human, scholars are quick to assert.” Thus Professor David Friedel, an anthropologist at Southern Methodist University, believes that the self-laceration practiced by kings in advance of going to war in search of torture victims “indicates a cooperative, sacred relationship between the elites and the commoners.” In other words, if the evidence shows a society’s aristocrats obsessed with self-mutilation and torture, a bit of interpretation will help us see beneath the surface to the class solidarity so characteristic of pre-Columbian America and so lamentably missing from the modern world.
To the north of the Mayans, the American Indians inhabiting what is now the United States fit into the new world picture as well—not so much exemplars of a superior civilization as object lessons demonstrating the defects of our own. The displacing of the Indians has long been a source of shame to enlightened consciences; modern historical accounts commonly include a broad indictment of the early American colonists and their descendants on this score. In the latest perspective, the actions of the colonists are often cited as examples of “exploitation” and “commodity capitalism,” while Indians are portrayed as having lived in benign relationship with the land until introduced to anti-ecological practices by the whites. Such is the thesis of William Cronon’s Changes in the Land, a recent, much-praised ecological history of colonial New England that has won the Francis Parkman prize of the Society of American Historians.
Yet here, too, the evidence has proved to be shaky. At least one scholarly reviewer of Cronon’s book felt bound to point to “a body of evidence” showing that colonial agricultural practices did not rape and deplete the land, as Cronon has it, but rather were “carried on without plows, on fields that were fallowed far more frequently than he concedes.” As for the Indians, “they burned the woods, sometimes usefully but at other times more harmfully, and before moving on registered a heavy impact on the land on which their fields and villages were situated.”
More interesting than this refutation, however, is the spirit in which it is presented. For the reviewer judges the book as a whole to be one “of impressive originality” and “penetrating scholarship,” and he is at pains to assert that the factual refutations in his review in no way damage the book’s thesis. After all, the “ultimate effect” of American settlement “was very much as Cronon has so brilliantly and provocatively described it.” Thus, it does not matter whether an author is right or wrong concerning the period he has chosen for study, so long as he holds the right view of modern civilization.
The effects of a scholarly orthodoxy on those who dissent from it can be seen in the case of Alden Vaughan’s 1965 book, New England Frontier: Puritans and Indians 1620-1675. Even before Vaughan’s book appeared, a consensus had grown according to which, in Vaughan’s words, the Puritans “appropriated the natives’ land, bargained unscrupulously for their furs, and abused individual Indians with impunity.” But Vaughan, starting out in perfect accord with this view, and wanting only to offer a more detailed account than any that existed for the earliest period of New England settlement, was surprised to find that “the evidence suggested a more humane and equitable treatment of the natives.” The Puritan colonists had “sought peaceful and equitable methods of acquiring land and furs, administering justice, and recruiting converts”: they had tried “to deal justly and peacefully with their Indian neighbors.”
Although Vaughan’s book was favorably received on its initial appearance, it soon enough came under attack as its conclusions were increasingly seen to violate the orthodoxy of the time. In one disputed instance, that of the war between the Puritans and the Pequots, Vaughan’s critics found not just that he was wrong in assigning blame to the Pequots but that the war was an act of Puritan “genocide.” No matter that the Pequot tribe was not annihilated. No matter that here and elsewhere the historical record failed to support the charge of genocide. The Puritans, the new scholars declared, must have destroyed the relevant documents.
In the face of this assault, Vaughan was forced into retreat. In 1979 he brought out a revised version of his book in which he muted both his defense of the Puritan settlers and his exposures of Indian atrocities. Vaughan writes in his 1979 preface that on reflection he had overstated the case for the Puritans. And as for the Pequot war, “I am less sure than I was fifteen years ago that the Pequots deserve the burden of blame.” Still, a careful reader will note that Vaughan never really yields on the substantive issues, and in his footnotes he makes it clear that the new historians have built their case on a misuse of scholarship. In the end, Puritan “genocide” turns out to be another catastrophe that never occurred—but also another scholarly article of faith to doubt which can be damaging to one’s reputation.
Doubt of any kind has hardly affected the new world picture of primitive peoples and their virtues. In the early 1970’s, for example, two primitive societies were discovered and presented to the world: the gentle Tasaday of the Philippines and the benighted Ik of Uganda. Both were much exploited in the service of demonstrating the degeneracy of civilization.
In 1971 a dozen natives and their children were discovered living in the rain forest on the island of Mindanao in the Philippines. In The Gentle Tasaday, John Nance, the reporter who first publicized the discovery, wrote in personal terms of the lesson these near-naked people teach to civilized man. As against “the popular image of Stone Age man, hulking and grunting, crude and dull,” the Tasaday knew no cruelty or evil. They were ignorant of “killing, murder, war”; they did not so much as punish their children. When asked to describe evil, they replied: “We can’t think of anything that is not good.”
Charles Lindbergh, appropriately enough a member with Nance of the large, media-event expedition that quickly made its way to the Tasaday, expounded further on their significance for modern man. “The rise of intellect has coincided with the decline of natural life,” Lindbergh explained. Technology, produced by intellect, has “taken us to a stage where life itself is endangered, man included. We are destroying our environment through pollution.” For both him and Nance, the answer to this danger lay in the countercultural ideal of “touch and tenderness,” epitomized by the Tasaday with their “simple beauty, their mysterious purity.”
As it happens, the enthusiastic reception accorded to Nance’s book revealed far more about the self-image of contemporary civilized man than was ever revealed about the two dozen Tasaday, who were only briefly visited before being cut off from the outside world. Nance, in fact, had himself originally set out to expose the early reports of the Tasaday as a hoax, perpetrated (as he then thought) by a flamboyant official of the Marcos government named Manuel Elizalde. Nance had been taken to the Tasaday under suspicious circumstances. Elizalde let him see the group only after he himself had gone ahead to meet with them, by his own admission arranging for them to remove their clothing and don orchid leaves, which he claimed to have been their authentic dress up until the time they were discovered. He then seated them on a log and finally presented them to Nance, who found their pose “too perfect.” But at precisely this moment, Nance, flooded with awe, was converted from skeptic to apologist.
In retrospect, it becomes evident how much common sense had to be sacrificed in the service of this conversion. One notices now, for instance, that Nance’s report of a “lack of serious health problems” among the Tasaday is belied in his book by accounts of disease. And recently it has been said once again that the whole story of the discovery of the Tasaday may be fraudulent. Yet the wish to believe remains as strong as ever. Thus, when the New York Times, breaking a month-long silence, finally carried a report of the new charges of fraud, it restricted itself to the information that recent visitors, the first in ten years, had found the Tasaday wearing clothing and in possession of manufactured tools and other objects. “In interviews,” the Times went on,
two anthropologists who recently revisited the tribe said these new possessions, which had aroused the skepticism of Swiss and German reporters who saw them recently, were an expectable product of the tribe’s first contacts with outsiders in the early 1970’s.
But why, if the Tasaday had come into extended contact with civilization in the early 1970’s, should the Germans and Swiss have been surprised at the clothing? The answer, though one would not have learned it from the Times, is that the Germans and Swiss were not surprised at all. Instead, they reported being told by the Tasaday that Elizalde had “forced us to live in the caves so we could be called cave men. Before then, we had worn clothing, even though it was rather torn.” The Times not only omitted this information, which was available to it as a subscriber to the Reuters news service, but also failed to mention that the two anthropologists interviewed for its own story were members of the original Nance expedition and deeply committed to the Tasaday story.
The “discovery” of the Tasaday, whether or not a hoax, seemed tailor-made to buttress the new world picture; by contrast, the mere existence of the Ik seemed calculated to undermine it, and to redress the balance in favor of civilization. The Ik were studied by Colin Turnbull, a professional anthropologist whose book on them, The Mountain People, appeared in 1972. As against the Tasaday, the life in nature of this isolated and primitive African group was one of unexampled Hobbesian viciousness. The individual Ik, whether adult or child, foraged strictly for himself. The group had failed even to establish a sufficient degree of community to arrange for the disposal of human waste, and as a result it lived in the unrelieved stench of human feces. Children were barely tolerated by their parents, and then only until they were three, at which point they were expelled from their homes. “Men would watch a child with eager anticipation as it crawled toward the fire, then burst into gay and happy laughter as it plunged a skinny hand into the coals.” Virtually universal among the Ik were adultery, deception, cruelty to others and to one another. Far from protecting one another, “anyone falling down was good for a laugh . . . particularly if he was old or weak, or blind.”
The implications of such a people for the romanticized conception of the primitive would seem all too clear. Yet Turnbull himself defended the Ik—on theoretical grounds. He concluded that they had probably once held to higher values—and “very likely” with greater fidelity than Western civilized men—but had had to abandon them “for the very good reason that in their context these militated against survival.” The “context” Turnbull refers to is the severely depleted region the Ik inhabited, which itself had degenerated “with the advent of civilization to Africa.” In other words, the conditions that produced the Ik were “a part of that phenomenon we so blandly and unthinkingly refer to as progress.”
Turnbull goes even farther, arguing that the Ik, “if we are honest,” are not “greatly different from ourselves in terms of behavior.” After all, starting with our kindergartens, and “reaching on through school and summer camps,” we “effectively divorce” ourselves from our children. In fact, “there is not all that much difference” between us and the animals. “Technologically,” Turnbull grants, “we are superior in some respects,” but of course our technology is evil. And if it is true that we can speak—or as Turnbull grudgingly phrases it, if “we do seem to have developed the art of verbal communication to a point that gives us an enormous potential advantage over other animals”—here too “it can readily be shown that both speech and writing, misused, have led to many of the disasters peculiar to humanity.” Much, indeed, can be readily shown, especially when the contemporary world picture is threatened as radically as it is by the horrors of the Ik.
Not surprisingly, Turnbull later was one of those who sprang to the defense of Margaret Mead when the methods and conclusions of her famous 1928 book, Coming of Age in Samoa, were challenged in 1983 by the Australian anthropologist Derek Freeman. Mead, it will be recalled, had depicted a Tasaday-like society which, thanks to unrepressed sexuality during adolescence, was virtually immune from such ills of modernity as rape and suicide. On the basis of his own investigations Freeman found to the contrary that the Samoans were not sexually unrepressed during adolescence but placed a high value on female virginity. Using court records and other evidence, he also showed that Samoan society displayed greater amounts of hostility than Western society, and had higher rather than lower rates of both suicide and rape. Turnbull, in his defense of Mead, avoided the scientific issue and instead praised Mead for her bravery and personal sacrifice—and her admirable criticisms of Western civilization.
An analogous defense has been mounted on behalf of those who, by attempting to teach sign language to primates, have challenged man on the grounds of his verbal uniqueness. In Nim, an account of one such attempt, Herbert Terrace writes that “until recently, humans could take comfort in the assurance that our language made us unique.” Setting out to challenge that assurance, Terrace trained a chimpanzee named Nim for two years, teaching it to recognize numerous signs as well as to signal its wants.
As Terrace sat down to review his written and videotaped records, he had reason to believe that he had achieved a breakthrough. To his dismay, however, he discovered that neither Nim nor any other trained primate had used combinations of signs in genuinely sentence-like ways. Terrace came to this conclusion “reluctantly,” and no wonder, given the hopes with which he had begun. Nevertheless, his was a rare victory for scientific objectivity.
The responses to that victory are another matter. The well-known science writer Martin Gardner, who accepts Terrace’s conclusions, has pointed out that the desires of researchers have colored experiments to such a degree that an animal’s failure to talk is often explained away as a joke or a lie. He cites Francine Patterson on Koko (made famous in the film documentary, Koko, A Talking Gorilla):
She asks Koko to sign drink. Koko touches her ear. Koko is joking. She asks Koko to put a toy under a bag. Koko raises it to the ceiling. Koko is teasing. She asks Koko what rhymes with sweet. Koko makes the sign for red, a gesture similar to the one for sweet. Koko is making a gestural pun. She asks Koko to smile. Koko frowns. Koko is displaying a “grasp of opposites.” Penny [Francine Patterson] points to a photograph of Koko and asks, “Who gorilla?” Koko signs “Bird.” Koko is being “bratty.”
Though Gardner could not be more unequivocal in dismissing talking-animal claims, he himself, it is worth noting, declares a certain uneasiness at being put thereby in the company of the likes of Mortimer J. Adler (The Difference of Man and the Difference It Makes). Still, Gardner’s uneasiness is mild compared with that of others, including Terrace’s publisher. As Gardner observes, “nowhere on the jacket of Nim or in the book’s advertising does the publisher so much as hint that the book severely criticizes practically all earlier work with talking apes.” The publisher’s silence was well calculated; in the event, one lone researcher has accepted Terrace’s conclusions.
Today, Ursula LeGuin remains positively outraged at the “bad faith” of those who deny animal speech. She attributes their position to “speciesism,” to “a need to believe in the unquestionability of human uniqueness, human supremacy.” (Actually, the proclivities of Terrace and other critics lie exactly in the opposite direction.) Miss LeGuin holds in favor of animal speech despite several bizarre revelations that she herself recounts of eccentric, erratic behavior on the part of ape researchers. For example, Francine Patterson now keeps Koko “jealously and zealously guarded,” and her results “have been so selectively released that even the most sympathetic scientists have trouble defending Miss Patterson’s work.” Another animal and its trainer “are keeping a low profile in the Pacific Northwest.” Finally, almost unbelievably,
Janis Carter, who worked in language experiments with the chimpanzee Lucy (raised as a “baby” by a couple who gave her to the training center), took her to an island in an African river, where the woman must live in a cage while two groups of chimpanzees roam free. Trying to free Lucy of her human dependence, Miss Carter will not use sign language with her.
Similar contortions in the service of the theory that animals are at least the equals if not the superiors of man have been undergone by Jane Goodall, the lady who lived “Among the Wild Chimps,” as a National Geographic TV special had it. Goodall spent most of the years between 1960 and 1984 in close proximity to chimpanzees in the wild. She has concluded that they share essential traits with humans, including altruistic behavior, and in addition they possess a number of admirable and endearing qualities of their own. After fourteen years of study, however, Goodall observed something never before suspected of chimpanzees: they had begun to murder and devour their own kind. A mother and daughter became adept at seizing and eating the infants of other chimpanzees. There was no question of a food shortage, or of any resentment at work.
Goodall could offer no explanation for this unanticipated behavior—though a conclusion never mentioned by her or those reporting on her work is one that would have instantly sprung to the minds of scientists and amateurs alike in any period preceding our own, namely, that animals in their way, just as primitive peoples in theirs, exhibit behavior that is considered brutal by civilized norms. But to concede this would be tantamount to conceding the superiority of civilized man, an obvious impossibility. To anyone still foolish enough to hold such an idea, the usual retort is to point triumphantly to acts of torture, infanticide, cannibalism, and genocide committed by civilized societies themselves (while ignoring the fact that such violations of civilized norms have always been condemned as aberrations by the civilized world at large).
To remind us of our place in the scheme of things, the curators of the Bronx Zoological Garden in New York have thoughtfully positioned a mirror for visitors to the Great Ape House. Above the mirror is the legend, “The Most Dangerous Animal in the World.” Under it is written:
This animal, increasing at the rate of 190,000 every 24 hours, is the only creature that has killed off entire species of other animals. Now it has achieved the power to wipe out all life on earth.
The curators, one understands, do not have in mind the American Indians—who joined civilized men in hunting the ivory-billed woodpecker until the bird was extinct on the North American continent. Nor are they thinking of the Maori, a South Pacific people who hunted the six-foot-tall land bird, the Moa, to extinction in New Zealand. No, the “animal” being referred to is the one from whose ranks came those volunteers in California who recently took turns supporting beached whales in their arms to enable them to breathe, not to mention the marchers in the 1984 All-Species parade in San Francisco who costumed themselves as crustaceans, birds, and trees to express their solidarity with forms of life ordinarily looked down on by man.
Most writers on these subjects are so committed to the relativist view that they routinely put quotation marks around the key terms of the discussion, “civilization” and “primitivism.” These indicate that the writer rejects any tendency to elevate the one or to fail in appreciation of the other. “Cannibalism,” too, regularly appears in quotation marks, both to indicate skepticism that it is as widespread as claimed and to allude to the view that “civilization” brings harms as great as or greater than primitivism. A note of irony often hovers about such quotation marks, as when Colin Turnbull distinguishes between “‘advanced’” societies, which show signs of violent collapse, and “‘backward’” societies, to which “this new violence has not yet come.” Similarly, when he writes that the Ik have “‘progressed’” to their present condition, he really implies that they have been the casualties of modern progress.
The vocal equivalent of such quotation marks can be heard in the tones of narration employed in television documentaries about animals and primitive peoples. The title of one of these, “Testament to the Bushmen,” may stand for all of them, with their invariable presumption that the audience harbors a deplorable sense of civilized human superiority. His voice dripping with sanctimony, the narrator at once gently castigates the audience and renders homage to the rare, underappreciated values to be found among the peoples and species who have been photographed.
Interestingly enough, however, what we see on the screen is often a picture of destitution, disease, or blank despair. As a narrator intones the virtues of Masai tribal life, for example, the camera may be revealing naked children covered with sores and beset by cattle flies so persistent that the children have ceased to pay attention as the flies crawl over their open eyes. Or the Maori will be extolled for their devotion to art and respect for nature while the camera roams over ancient carvings depicting a people living in fortified villages and constantly at war.
In a film about Nigeria, a solemn respect is accorded to “traditional healers,” long spurned by “orthodox [Western] medicine”; it is wrong, we are told, to think that “one is better than another,” when what is needed between the two is “mutual respect.” Without irony, the camera shows a traditional bone setter, who helps heal his patient by breaking a cock’s leg; when the cock heals, the patient’s leg will heal as well. With the mentally disturbed, we are informed, the native way in Nigeria is to keep them in village surroundings rather than exiling them to the indifferent confines of institutions. In the village, they are cared for by a traditional healer, who uses incantations and herbal concoctions. The camera now takes the viewer to the dirt area behind the healer’s hut. Here sit approximately a dozen mental sufferers in a line (the camera shoots only in close-up, making it impossible to view conditions in the dirt enclosure or to see how many are held in it). Some of them are shackled, some drugged; all straddle a ditch; none moves.
At least one public-television documentary has taken an anthropological look at modern man using images compiled from a year of photographing at and near a single, busy corner in Manhattan. People are shown crossing the street, often hurrying. Some are abstracted, few are animated. In one sequence, the corner is viewed from a rooftop high above, so that the people resemble insects in a hive. The camera lingers over those passers-by who seem upset, disoriented, even frantic. The narration? Modern man is lonely, unhappy. He lacks a sense of community, suffers from anomie. Not revealed is that the disturbed faces are those of released mental patients who mingle with the crowds in this particular neighborhood. When it comes to civilization, the accepting, embracing eye—the eye that looks benignly on the disposition of the mentally ill in Nigeria—turns suddenly censorious and judgmental.
This is not to say that there are no discriminations to be made within Western civilization; the new world picture has as many gradations as any previous world picture. But once the principle of reversal is understood, it is easy to arrive at the contemporary view on any number of matters. Simply elevate students with low grades and poor test scores, for example, over high achievers (the former are talented, the latter unimaginative). Find virtue in the criminal, stodginess in the law-abiding citizen. For the highest accolades, move quickly to what used to be thought of as the bottom of society: children and the insane.
Better yet, find a Caliban. In Shakespeare’s The Tempest, the wise magician Prospero is accorded sway over the lower creatures and the lower mortal Caliban—or so the play has always been understood. As Richard Levin has shown, however, contemporary literary criticism has systematically turned the meaning of this play upside down. The first step was to argue that Shakespeare actually intended Prospero to be “equated” with Caliban in evil. Next, Caliban himself began to be viewed favorably. By the 1980’s he “represented any group that felt itself oppressed.” In one production of the play in New York, “he appeared as a punk rocker, complete with cropped hair, sunglasses, and Cockney accent.” A feminist critic has imagined a soliloquy for Prospero’s daughter, Miranda, whom Caliban once tried to rape, in which she concludes, “I need to join forces with Caliban.”
In the Elizabethan world picture, man was constantly admonished to remember that his place was both high and low in the scale of things: above the animals but below the angels, possessed of reason but subject to the passions. “What a piece of work is a man!” Hamlet exclaims, but in the same speech calls him “this quintessence of dust.” And Prospero himself, as Tillyard points out, calls Caliban “this thing of darkness . . . mine,” as a reminder that his limitation as a human being is to be linked forever with the bestial. That this link should become a point of pride, that Caliban should be apotheosized rather than seen as a cautionary example—this is a development that could hardly have been predicted.
In the 1980’s signs typical of a disintegrating world picture have appeared. Those clinging to the still-dominant, low view of man increasingly have resort to the unverifiable to sustain their case, positing eons-old explosions in space, fanciful interpretations of prehistoric hieroglyphs, putatively destroyed documents, conjectures about the mind processes of inarticulate primates. When their slenderly supported metaphysical constructs are refuted, they respond with recriminations against the bearers of bad tidings. Still, some, like Herbert Terrace, have found that their first allegiance is to the scientific method, and in this there is hope. It now remains, just as it did, mutatis mutandis, at the time of the breakup of the Elizabethan world picture, for the implications of evidence favorable to man to be assimilated. The contemporary world awaits its John Donne: a poet capable of abandoning the comfort of stones in favor of the rigors of self-respect.