Some say the world will end in fire,
Some say in ice.
Predictions of the end of the world, as old as human history itself and lately a subject of scholarly inquiry, have by no means abated in our own time. Nor are those who believe in such predictions confined to isolated religious sects, as was the case as recently as the 19th century. While such sects do continue regularly to spring up and disappear, predictions of catastrophe have become the virtual orthodoxy of society as a whole. Journalists, educators, churchmen, and philosophers daily endorse one or another script foretelling the end of individual life, of human civilization, or of the entire earth. Some say the world will end in fire—through the conflagration of a nuclear holocaust; some say in ice—through the same event, this time precipitating a “nuclear winter.”
Fire and ice. We need only add earth and air to include within the apocalyptic genre all four of the elements understood as basic by the Greeks, and water to include the biblical account of the flood. Contemporary prophecies of flood are stated in apparently scientific terms: as a result of global warming, the polar ice caps will melt and inundate the world’s major cities. As for earth and air, we anticipate the disappearance of the one thanks to the erosion of farmland and shorelines, while the other is to be depleted of its ozone, if not first saturated with carbon dioxide or poisoned by man-made pollutants.
Pagan man projected his fears outward; contemporary man internalizes. Like biblical man (at least to that very limited extent) he holds himself, or more accurately his own society, responsible for the coming end of the world. Not only the disaster allegedly threatened by pollutants but every prospective modern apocalypse stipulates man rather than the gods or nature as the primal cause. And the charge is always the same: whether it is to be by fire or by ice, mankind faces extinction as a punishment for its impiety.
The continuities and discontinuities between ancient and modern imaginations of disaster would amount to no more than curiosities if it were the case that superstitious fears had been replaced by rational ones. But it is not the case. On the contrary, given the best scientific understanding of reality available to early man, it made sense for him to ascribe natural disasters, present and future, to the gods. Later, it made sense for the Greeks to ascribe such disasters to some wayward or even malign characteristic of matter itself. Nor was Empedocles a simpleton for regarding the personification of the elements by gods as a persuasive account of reality. Would that our own conceptions of apocalypse were similarly founded on the best available scientific understanding. Instead, most if not all of the disasters currently being predicted have gained widespread credence despite a lack of scientific basis, or even in the face of definitive counterevidence.
Without question the most spectacular example of such a wholly suppositious theory has to do with the so-called greenhouse effect. The greenhouse effect itself, as every schoolchild knows, is simply the process by which the earth’s atmosphere traps enough heat from the sun to create a habitable planet. As for the disaster scenario that bears the same name, it posits, in the words of a New York Times editorial, an increased “warming of the atmosphere by waste gases from a century of industrial activity.” The Times goes on:
The greenhouse theory holds that certain waste gases let in sunlight but trap heat, which otherwise would escape into space. Carbon dioxide has been steadily building up through the burning of coal and oil—and because forests, which absorb the gas, are fast being destroyed.
Now, aside from the mistaken assumption that forests worldwide are decreasing in size (they are not), the theory of a runaway greenhouse effect, otherwise known as global warming, presents even its advocates with a variety of internal contradictions. In the first place, the earth has a number of mechanisms for ameliorating fluctuations in global temperature: a significant rise in temperature, for example, leads to increased evaporation from the oceans; this is followed by the formation of clouds that shield the sun and then by a compensating drop in temperature. Too, if the greenhouse theory were valid, a global warming trend should be observable in records of temperatures soon after the jump in man-made carbon dioxide that is the result of modern industrial activity. Yet if there has been such a rise over the past one hundred years, it does not follow but precedes the onset of modern industrialism, and anyway it amounts to a barely detectable change of no more than one degree Fahrenheit over the entire period.
A lack of evidence is a particularly significant problem for any hypothesis. Purveyors of the global-warming theory counter this one by pointing to computer projections which show a catastrophic upward trend in the next century. Once again, however, a knotty problem presents itself: computer models, writes Andrew R. Solow, a statistician at Woods Hole Oceanographic Institution, “have a hard time reproducing current climate from current data. They cannot be expected to predict future climate with any precision.”
Does any of this detract from the persuasive power of the global-warming theory? Apparently not. As in certain forms of religion, the less evidence, the more faith. And in the resultant climate of belief (as it deserves to be called), not only the lack of evidence but even outright counterevidence can work to a theory’s benefit. According to the late Leon Festinger, Henry W. Riecken, and Stanley Schachter, the authors of the classic study, When Prophecy Fails (1956), “Although there is a limit beyond which belief will not stand disconfirmation, it is clear that the introduction of contrary evidence can serve to increase the conviction and enthusiasm of a believer.” So it has been during the most recent phase of prediction, which itself represents a revival of the great irruption of ecological warnings that dominated the early 1970’s.
The central document in that earlier wave was The Limits to Growth, a report issued by the Club of Rome in 1972 foretelling a worldwide doom brought on by the combined forces of “resource depletion,” overpopulation, pollution, and starvation. The future conjured up by computer simulation in The Limits to Growth bore a certain resemblance to the still more spectacularly stated predictions of Paul Ehrlich in his 1968 book, The Population Bomb. Ehrlich had offered specific dates for specific catastrophes: 1983, for example, would see a precipitous decline in American harvests and the institution of food rationing, by which time a billion people worldwide would have already starved to death. The Club of Rome, more cautiously, assigned likely years for the exhaustion of specific resources: petroleum (1992), silver (1985), natural gas (1994), mercury (1985), tin (1987).
In 1982 one of the authors of the Club of Rome report had to admit that his predictions were not coming true. Yet he was not repentant. There may have been a postponement, a temporary reprieve, but man and the earth still remained poised on the brink of cataclysm. Presumably Paul Ehrlich, who never recanted, felt the same way. Just so have members of religious sects always responded when their confidently predicted apocalypses pass without incident.
True, the general public and even some members of the sect begin to fall away after such disappointments; in our time, both Paul Ehrlich and the Club of Rome did fade out of the spotlight. But instructively, and in contrast to the sects studied in When Prophecy Fails, they did so without having been exposed to the full glare of adverse publicity and ridicule that used to attend the collapse of prophecy. Perhaps that is why so little time elapsed before the public could be brought to credit similar predictions.
For even as the “population bomb” failed to explode on schedule, or ecological disaster to strike, new predictions of not only global but galactic proportions were being prepared. By the late 1980’s these were receiving the same respectful, credulous hearing as their forerunners, and were being promoted just as avidly by the press. In the case of nuclear winter, the most publicized apocalypse, the cycle from prediction to publicity to disconfirmation took only a few years, from approximately 1985 to 1988; yet once again the end came without bringing ridicule or discredit to the theoreticians.
Nuclear winter was at once a prediction of what would happen after a nuclear war and the claim that an identical disaster, never detected in the geological record, had already taken place once before, in the age of the dinosaurs. A giant explosion, the theory went, had been caused on earth by a “nemesis” or “death star” wheeling in from far out in the universe and returning so quickly whence it had come as to be invisible to the most far-seeing of modern telescopes. The clouds of dust kicked up by that explosion had shielded the sun and thus caused the earth’s vegetation to wither, bringing about the extinction of the dinosaurs by cold and starvation. The lesson for the mid-1980’s was clear: intermediate-range nuclear missiles should not be emplaced in Western Europe and disarmament should commence forthwith.
As chance would have it, not long after the nuclear-winter theory gained currency there was a giant volcanic eruption at Mount St. Helens in the state of Washington. It was followed by the spreading of just such dark clouds as had been described—but without any hint of the predicted effect on vegetation or climate. At about the same time, too, paleontologists demonstrated that the dinosaurs could not possibly have been the casualties of a single, catastrophic event, since they had disappeared over a period of some thousands of years. Finally, a check of some of the nuclear-winter projections exposed gaping errors of math and physics.
As a result of these and other refutations of the theory, nuclear winter died its own death-by-theoretical-starvation. But so quickly was its place taken by similar predictions, similarly linked to geopolitical issues, that the event seems to have almost entirely escaped notice. Nuclear winter remains today in the public mind as a proven hypothesis, vying for popularity with its mirror opposite, the greenhouse effect.
Actually, not so long ago (as the journalist John Chamberlain has pointed out) we were being assured that we were living not in a warming but in a “cooling world.” In the 1970’s, as Science magazine reported in 1975, meteorologists were “almost unanimous” that such a trend was taking place, and that its consequences, especially for agriculture, were potentially disastrous. Climatologists, according to Fortune magazine, warned that the cooling trend “could bring massive tragedies for mankind.” A decade later, all of this quite forgotten, the opposite theory of global warming has drifted past the rocks of evidentiary lack, tumbled safely through the falls of skepticism, and sailed triumphantly onto the smooth lake of public respect.
The status of global warming as an unassailable, self-evident truth was recently confirmed by the reaction to a scientific report that challenges its assumptions. This report, compiled at the National Oceanic and Atmospheric Administration and duly described on the front page of the New York Times and other newspapers, traces U.S. temperatures since 1895. It shows that the putative one-degree rise in temperature worldwide over the past hundred years, a figure widely accepted even by many of those skeptical of the global-warming scenario, is wrong for the United States. As the Times headline put it: “U.S. Data Since 1895 Fail to Show Warming Trend.”
The reaction was immediate. All of the experts consulted by the Times were in agreement that the report does not set back the global-warming theory by so much as an iota. Prominent among these experts was Dr. James E. Hansen, director of the National Aeronautics and Space Administration’s Goddard Institute for Space Studies in Manhattan, a leading proponent of global warming and the man who produced the data showing a one-degree rise in global temperature. “We have to be careful about interpreting things like this,” he warned, and went on to explain that the United States covers only a small portion of the earth’s surface. Besides, the steadiness of the temperature readings could be a “statistical fluke.” Note the implicit distinction here: we must be “careful” in interpreting data that appear reassuring, but it is virtually our duty to indulge any strongly felt premonitions of disaster even if they are based on the flimsiest evidence, or none.
The concept of the “statistical fluke” could easily be applied to many current predictions, but is not. Thus, the acidification of a number of fresh-water lakes in the eastern United States is considered not a fluke but a definite trend, even though it might be taken to fall well within the range of natural fluctuations. Similarly, the disturbing deaths of numerous dolphins during the summer of 1988 were traced to the same pattern of human depredation of the environment supposed to be causing acid rain and other ecological catastrophes. Later, it developed that the dolphins were killed by a so-called “red tide” of algae—itself first seen as a man-created scourge but then conceded to be a natural phenomenon. Here, in other words, was a genuine statistical fluke; but it was never labeled as such since it exonerated industrial man.
It does not give pause either to the catastrophists or to their credulous promoters in the media that some predictions cancel out others. Dr. Hansen, for example, suggests that the absence of a warming trend, as shown by the new study, might be “the result of atmospheric pollutants reflecting heat away from earth.” Yet these pollutants issue from the very industrial activity supposedly responsible for global warming in the first place; now it develops that the particles it sends into the atmosphere counter the greenhouse effect. In fact, Dr. Hansen is worried that “anti-pollution efforts are reducing the amount of these particles and thus reducing the reflection of heat” away from the earth. It is surely a measure of the power of catastrophic thinking that what may have been the first public revelation of an actual decrease in man-made atmospheric pollutants should prompt the fear that such a decrease itself portends the direst consequences.
What all this suggests is that we have come to depend at any given moment on a constant degree of threat. When times are bad—because of war, depression, or real natural disasters—proximate fears tend to dominate the imagination. When times are good—through the conquest of disease and famine, the achievement of high employment, prosperity, and an upward curve of longevity—apprehension has to be supplied from without. And during extended good times, a supply of fresh disasters is required as each one comes progressively to lose its appeal. Air pollution, rising to disaster proportions in the Club of Rome report, declines in importance but is soon succeeded by loss of the ozone layer, which will supposedly leave mankind vulnerable to the unfiltered rays of the sun and a consequent plague of, among other things, skin cancers and blindness. Continent-wide poisoning of fresh water through the eutrophication of lakes and streams from fertilizer runoff is forgotten only to be replaced by the threat of acid rain. Direct incineration of all mankind by atomic war cedes to a secondary stage of destruction by nuclear winter, and nuclear winter in turn to global warming.
This persistent and insistent imagining of disaster might be no more than a sideshow were it not for its political dimension. But in the 1970’s and 80’s, successive waves of catastrophism followed and reflected episodes of defeat for radical political movements. The 70’s wave succeeded the collapse of the New Left and engaged many of that movement’s disillusioned supporters (as well, of course, as many people opposed or indifferent to the New Left). That of the 80’s followed the worldwide discrediting of the economic, political, and moral record of Communism. It was as if sanguine hopes of an end to the cold war required a compensatory new fear, one that natural catastrophe alone could supply. And thus, soon after James Baker’s nomination as Secretary of State, a bipartisan memorandum from members of the Senate Foreign Relations Committee called his urgent attention to the leading foreign-policy issue he would have to face, a “global problem of unprecedented magnitude.” The issue was global warming.
It goes without saying that clean air and water, the retention of farmland and forests, a satisfactory ozone layer, and the avoidance of nuclear war are all desirable things. But the pursuit of these goals through the rhetoric of hellfire renders more immediate political concerns mundane and secondary. Many are the societies that have been distracted from the actual dangers they faced by the allure of disasters wholly imaginary. That consideration aside, though, our obsession with distant and unprovable catastrophe is so stultifying, from both the moral and the intellectual point of view, as to constitute a cultural disaster in its own right.