Peace by the threat of scientific destruction is a fairly new idea, but older than H. G. Wells who got the credit for it in the recent editorials. It was conceived before the British novelist was born, by a Swedish multi-millionaire who once told a young lady, “I wish I could produce a substance or a machine of such frightful efficacy for wholesale devastation that it would make wars altogether impossible.”
Cruising on a Swiss lake twenty years later, he told the same lady, meanwhile matured into a prominent pacifist: “On the day when two army corps will be able to annihilate each other in one second, all civilized nations will recoil from war in horror and disband their forces.” On that day and no sooner. With a half-wistful, half-cynical smile at the Baroness von Suttner, whose speech to the Fourth World Peace Congress had just been applauded by delegates from twenty nations, Alfred Nobel added, “Perhaps my factories will end war sooner than your Congresses.”
He finally pursued the great aim with both factories and congresses; and in an even more indirect third way—through his science prizes—which may yet prove to be the most effective.
On V-J Day, General Douglas MacArthur dismissed the use of force as a method of settling international disputes, stating, “The utter destructiveness of war now blots out this alternative. If we do not now devise some greater and more equitable system—” For the first time, the end of a war has found the generals seriously inclined to give a chance to lasting peace. Should there be peace now, an atomic peace, it would be the crowning success of a deliberate even though devious pacifist experiment launched by one man just half a century ago.
Alfred Nobel was the inventor of high explosives and, while he lived, their virtually exclusive manufacturer. As a youth, he became interested in an oily liquid discovered by an Italian chemistry professor. When a vial exploded and destroyed a wing of Turin University, the horrified scientist abandoned all work on his “nitroglycerin.” Young Nobel was not horrified. He set about bringing the oil under control. His first laboratory blew up, killing his brother; hundreds of lives were lost in other blasts before he finally found his “safety powder,” dynamite. He advertised it for use in mining and railroad construction; but almost as fast as his plants were built, his principal customers came to be the ministries of war. He kept developing new, more powerful explosives and selling them to anyone who would buy, without distinction. Within a decade his gigantic trusts linked ammunition firms all over the globe. Nobel was the first “merchant of death” to operate on a world scale—and an enemy of war.
“The trouble is,” he said, “that the action of explosives is too limited. War must be made as deadly for the civilians back home as for the troops in the front lines.” He retired from the direction of his business and withdrew to the marvelously equipped laboratories he maintained at Bofors in his native Sweden and at San Remo in Italy—there he designed and tested new types of guns and shells, tried out an “aerial torpedo” that became the ancestor of the rocket-bomb, and pondered the feasibility of bacteriological warfare. In 1895 he made his will. The interest from his estate was to be divided annually between those having “rendered the greatest services to mankind,” in the form of prizes for the greatest physical, chemical and medical discoveries, the “outstanding work of idealistic literature,” and “the most and best work for the brotherhood of nations, the abolition or reduction of standing armies, and the formation or popularization of peace congresses.”
It was, Alfred Nobel added, his “express wish that the prizes be distributed without regard to nationality, so that in every case the prize may be awarded to the worthiest.” He had nothing but scorn for the competitive nationalisms to which he owed his wealth. Born in Sweden, he had grown up in Russia, spent most of his adult life in France and England, and died in Italy. “My home is where I work,” he said, “and I work everywhere.” Nations struck him as still unavoidable evils; he would have understood, even less than nationalism, the racialism which arose after his day. Bertha von Suttner found him as sympathetic to her Austrian “League to Combat Anti-Semitism” as to her pacifist efforts. He wrote enthusiastically to her about “the prejudice-hunters and darkness-hunters, among whom you hold an exalted rank.”
His constant aim was to break down frontiers—frontiers dividing men and frontiers confining their knowledge. The world was one, and science was one, as it had been to Aristotle; knowledge and peace were facets of human perfection—the one, in the end, could not but lead to the other. Nor was there such a thing as evil knowledge: “Every new discovery modifies the human brain, and makes the new generation capable of grasping new ideas.”
Nothing diverted him from his own discoveries and ideas. At the end of 1896, old and ill, he rejoiced over a “splendid” new explosive, and on the same day wrote to Bertha von Suttner: “I am delighted to see the peace movement gaining ground.” Two weeks later Alfred Nobel’s only servant found him dead at his desk.
His factories continued to produce ammunition and did not end war. His prizes were first distributed in 1900; in 1940, when they were suspended for the duration, they had not ended war, either.
The Nobel Prizes in the sciences were to serve the evolution of a new view of the world. They were landmarks of pure science. In physics especially, nothing could seem so far removed from the realities of life as the achievements recognized by the Nobel awards at the beginning of the twentieth century. The goal the physicists pursued was the most abstract of all: the theoretical breakdown of the basic structure of the universe. They measured, weighed and dissected the infinitesimal particle—on paper, because even they had to admit that no conceivable microscope would be able to show it.
They worked without thinking of “uses,” in a secret language of incomprehensible formulae, piling theories upon theories. If one tried to explain their theories, they sounded absurd. The public heard only of one, and quickly turned the name Einstein into a by-word for unintelligibility.
Today the infinitesimal particle has been broken down in practice, and its use shown in the newsreels.
A golden thread runs through the past four and a half decades of scientific progress: the thread of the Nobel Prizes. From the start, as if to bear out the donor’s belief in the essential unity of knowledge, the thread veered occasionally from physics into chemistry; from the start it shuttled back and forth across the Atlantic, with theories usually worked out in the Old World and experimentally confirmed in the New.
In 1900, at the threshold of the century, Max Planck published a theory which became the basis of modern physical thinking; another, by the Dutch physics-laureate of 1902, Lorentz, led to Einstein’s “special theory of relativity” of 1905, whose bold assumption of the equivalence and interconvertibility of energy and mass revised age-old “natural laws.” Einstein had little evidence besides a famous, otherwise inexplicable experiment by Michelson, the American physics-laureate of 1907; but he already suggested that further proof might be forthcoming from closer study of the radioactive phenomena whose discovery had won a 1903 Nobel Prize for Marie Curie, her husband, and another French scientist. This investigation brought the Britons, Rutherford and Soddy, the chemistry prizes of 1908 and 1921, and produced the fundamental conception of atomic physics, the “Rutherford atom model,” which was more or less simultaneously worked out by the German physics-laureate of 1905, Lenard, later a prominent Nazi. Meanwhile, Marie Curie carried on in another direction. For her discovery of new radio-active elements she received the 1911 Nobel Prize in chemistry, the only person ever to be honored twice.
The First World War broke out, and for the first time the Nobel catalogue showed the notation, “not awarded.” Marie Curie took her miracle element, radium, to the front, forgetting the scientific disdain for “uses” in her desire to mitigate human suffering. A war-time physics prize went to Max Planck for his work of a decade and a half earlier; the two war-time chemistry prizes went to two German Jews, Willstaetter and Haber. The first had lifted the veil from a profound mystery of plant life; he was said to have refused to work for war. Haber, on the contrary, enabled the Germans literally to get munitions from the air. Parenthetically, had Jews been barred from German science in those days, the German empire might have collapsed sooner. But in this respect the Kaiser was not like the Fuehrer: he took a collector’s pride in all Nobel awards to German subjects. When Paul Ehrlich received the 1908 prize in medicine for his work on syphilis, William II patted his shoulder and, addressing him as he would a drill sergeant whose recruit has won a badge for marksmanship, said: “Na, Ehrlich, nun mal ‘ran an den Krebs!” (The inflection is untranslatably Prussian; the words mean, “Well, Ehrlich, now let’s get going on cancer.” Today cancer treatment looms as the probable first peaceful use of atomic fission—another circle closing in the unity of knowledge.)
Months before World War I, a 28-year-old Dane, whose mother was Jewish, had carried Planck’s old theory a step farther: his “atomic quantum theory” and his improvement of Rutherford’s atom model brought Niels Bohr the Nobel physics prize for 1922 (the year after Einstein) and made him the father of our “nuclear physics.”
Ever since the end of the last war scientists have been talking the language now made popular by the spectacular ending of this one. In 1919 Rutherford succeeded in transmuting elements by “alpha particle bombardment.” In 1922 Aston received the chemistry prize for work on “isotopes.” The 1925 physics prize went to Franck and Hertz for the discovery of laws governing “the collision of an electron with an atom,” and the 1927 physics prize rewarded an American, Compton, for his discovery of an effect traced to “the collision of electrons and light quanta” and experimentally confirming Einstein’s conception of the equivalence of energy and mass.
The late 1920’s mark the high point of the estrangement between science and the public. The world at large lived comfortably in the 19th-century knowledge it had acquired in school; the physicists had moved bag-and-baggage into a new world that the laymen could not comprehend, because it had no “uses.”
In the following years a succession of Nobel Prizes rewarded a new generation of young physicists—mostly born in this century and mostly taught by Bohr—for a theoretical revolution that has not yet borne its full fruits. Another succession of prizes, however, recognized achievements that will sound familiar to the reader of yesterday’s newspapers. Nineteen thirty-two was an important year. The American Urey discovered the hydrogen isotope which makes “heavy water.” A Briton, Chadwick, found that a mysterious radiation first observed at the Curie Institute was due to an unknown particle, the neutron. Another American, Anderson, experimentally discovered the positron—a particle which one of the young European revolutionaries had theoretically predicted, years before. The three 1932 discoverers got Nobel awards in 1934, 1935 and 1936; the 1935 chemistry prize went to Irène and Frédéric Joliot, Marie Curie’s daughter and son-in-law, the first to induce artificial radioactivity. The Joliots had still used Rutherford’s old “alpha bombardment” method, but in 1934 a young physicist at the University of Rome, Enrico Fermi, reasoned that Chadwick’s new particle might work even better. His three-year series of experiments resulted in scores of new, artificially radioactive elements and made a sensation. It earned Professor Fermi the physics prize of 1938, and set laboratories all over the world to work on “neutron bombardment.” But among a thousand laymen, hardly one knew or cared what it was all about.
For four decades the scientific Nobel Prizes were the domain of an exclusive, international and inter-racial fraternity. The scientists’ fight to break down frontiers of knowledge was waged apart from a lay world unable to grasp it; but among themselves there were no frontiers. The very fields of modern, specialized science showed a tendency to overlap and amalgamate—perhaps, in a new sense, to resurrect that unity of knowledge which the Greeks had known and of which Nobel had dreamed. First there were chemistry awards for physicists, and vice versa. Medicine soon followed; between 1929 and 1938 four medical prizes went to chemists: Hopkins of England, Warburg of Germany, von Szent-Györgyi of Hungary and Loewi of Austria. The circle was widened even beyond the scientific prizes by Wagner-Jauregg, the Viennese psychiatrist, who told the Freudians among his students, “Gentlemen, some day you all will get the Nobel Prize—for literature. . . .” It was a jest—but in literature, too, the Nobel laureates fought to break down frontiers. They assailed the frontiers of hatred; and another great line ran from Björnstjerne Björnson, the “Dreyfusard” who swore that “art must never part from the realization of good and evil,” to contemporaries like Thomas Mann and Sigrid Undset, who only later in life found the way to the principle stated by their fellow-laureate Romain Rolland: “He who can see injustice without trying to combat it, is neither a complete artist nor a complete man.” With few exceptions they, too, worked for a new world view—“to make the new generation capable of grasping new ideas.”
Quite apart—and notably unsuccessful—were the awards designed to stimulate a new world-order: the peace prizes. The very first collided with Nobel’s views: Henri Dunant, who founded the International Red Cross and brought about the Geneva Convention, surely rendered mankind a service but not one likely to bring peace in the eyes of Nobel, who had sought to end war by making it more horrible. Subsequent prizes went to pacifist wheel-horses and pact-negotiating statesmen. Bertha von Suttner, the donor’s “dear Baroness and friend” whom he had in mind for the first, got only the fifth. One went to her loyal Viennese aide, the Jewish journalist Fried. The Treaty of Portsmouth earned one for Theodore Roosevelt, who in his day had been known to find war not wholly unpleasant. Before 1914 it was an open secret that both the Kaiser and the Czar aspired to the honor; and in 1936, when a committee of high moral courage chose a German pacifist, Carl von Ossietzky, then languishing in a German concentration camp, Adolf Hitler was said to be all the more incensed because he had thought the prize should go to him.
The German reaction to Ossietzky’s award, if not prompt, was at least characteristic. The laureate was removed from the camp to a hospital, and when the committee deposited the prize money in a Norwegian bank, a German agent asked for it, “so poor Ossietzky might pay his hospital bill.” Eventually the money reached Berlin (though not Ossietzky, of course) and three days later Hermann Goering announced that henceforth no German would accept a Nobel prize, adding, “When we Germans do a thing, we do it thoroughly.”
In Stockholm a year later, chuckling officials of the Nobel Foundation told Clinton Davisson, American physics-laureate, that Hitler had offered to rescind his ban on condition that no more awards be made to “non-Aryans.” The Fuehrer, it seems, had thought there was only a peace prize. The others were so far removed from his world that he had never even heard of them. And so, surprised to learn of their relation to the sciences and letters, the leader of the “people of thinkers and poets” magnanimously had declared himself willing to let the good Swedish kronor into the Reich again—provided they could be had without compromising Nazi principles.
“Needless to say,” Dr. Davisson wrote, “the Foundation wasn’t having any.”
The executors of Alfred Nobel’s will could not have accepted the generous German offer even if they had wanted to. It was a plain case of incompatibility—with the donor’s will, with the tradition of the prizes, and with the spirit of humanity for which they always stood. No group could be barred from them, not even a group that excluded itself: thrice after Hitler’s decree were Nobel prizes awarded to and dutifully refused by scientists of the Third Reich. And least of all was it possible to bar a group which always, in the literal sense of Nobel’s will, had “rendered services to mankind” at large, “without regard to nationality.”
The relatively large number of Jewish Nobel laureates has often been noted, and has usually caused more astonishment than it should. It was inevitable, just as the Jewish part in the development of the atomic bomb was more than the “historic justice” quoted in so many recent editorials. For centuries intellectually gifted Jews have been all but forced into the ranks of science by their exclusion from other fields. Prejudice, one of Nobel’s pet hates, also forced them by constant discrimination to extraordinary achievements. That the Nobel Prizes, awarded “in every case to the worthiest,” on the exclusive suggestion of scientists from all over the world, should often go to Jews was thus not surprising. In many countries of Europe, Jews were to find a Nobel Prize easier to get than a university chair.
It is evident, too, that in the long run Hitler’s Reich could not put up with the Nobel Prizes, even had there been no Jewish issue. Knowledge, the goal of the prizes, and prejudice, the essence of Nazism, are truly antithetical. Between them no lasting compromise was possible. The Third Reich not only killed or expelled the Jews and barred the Nobel Prizes, it also forced terms on research, and tried to impose conditions on knowledge itself. But there is no such thing as conditional knowledge. In their attempt to blast the unity of science, the Nazis only succeeded in blasting themselves out of it. They could snub the Nobel Prizes; they could stop their distribution by invading Norway and encircling Sweden; they could boast of a strictly Aryan “German science” and sneer at any other—but the circle closed again, outside, and inexorably turned on them.
This, if one will, is historic justice.
In the years 1937 and 1938 Fermi’s neutron bombardment tests were repeated, as in many other research centers, at the Kaiser-Wilhelm-Institut für Chemie in Berlin-Dahlem. A long series of experiments was conducted by the director of the institute, Professor Otto Hahn, with his long-time collaborator, a brilliant Austrian physicist whose striking resemblance to Albert Einstein was not merely facial. Lise Meitner was born in Vienna in 1878; she had studied mathematics and theoretical physics, come to Berlin, and become Professor Hahn’s associate even before the First World War; their first important discovery together was made in 1918. The German Republic made her an assistant professor of physics at Berlin University; in the early years of Nazism she lost this position, of course, being Jewish—but being an Austrian, too, she was permitted to keep working undisturbed at the Kaiser-Wilhelm-Institut while Austria was still an independent foreign country. This ceased to be the case in March 1938, but Dr. Meitner still was not bothered directly—and besides, the work was just approaching its most interesting stage.
September 1938 brought “peace in our time” at Munich; October brought the first violent pogrom wave; the outside world intruded even upon an elderly woman scientist. Once a protection, her Austrian passport had now become a trap; it had to be exchanged for a German one, which in Lise Meitner’s case would bear the telltale “J”—if it was granted at all. It was said on good authority that no scientists except reliable Nazis were any longer allowed to leave the Reich.
The director of the physics department of the Institut was a Dutch Nobel Prize-winner. Dutch colleagues offered to get Dr. Meitner across the border. A special visa was confidentially granted by the Dutch government. Professor Hahn agreed to the departure; a third scientist, Dr. Strassmann, had lately joined their team and the experiments could go on. In fact, Hahn thought it might be just as well if Dr. Meitner were to work on in some other country.
On a winter night the elderly woman scientist was packed into a car with Dutch license plates and spirited out of Hitler’s Reich, into Holland.
She would go to Sweden eventually; she was a member of the Swedish Academy of Sciences and one of her sisters was living there—her brother-in-law, once the owner of a printing business in Vienna, was now working in the German refugee publishing firm of Bermann-Fischer in Stockholm. But first the new refugee decided to go to Denmark. Her nephew, Otto Robert Frisch, an experimental physicist who before Hitler had taught at the University of Hamburg, was at the moment working at Niels Bohr’s Institute in Copenhagen—and Lise Meitner wanted to continue the work which in Berlin had been so rudely interrupted.
Just before her flight the experiments with Hahn and Strassmann had yielded stunning results—results which possibly might shake the world at large as well as the abstract one of the physicists. The neutron bombardment, which for Fermi and scores of others had only produced new substances, seemed in Berlin to have led to a phenomenon for which Dr. Meitner as a theoretical physicist could find but one explanation. And this explanation was that the bombarded uranium atoms had split approximately in half—a process which according to all laws of modern physics, as developed from Einstein to the latest work of Bohr, involved the release of six billion times the energy used for the bombardment.
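The “six billion times” figure can be checked against round numbers that are standard in modern textbooks (the article itself quotes no figures): the fission of a single uranium nucleus releases on the order of 200 MeV, while the slow neutron that triggers it carries only a few hundredths of an electron-volt of energy:

```latex
\frac{E_{\text{released}}}{E_{\text{neutron}}}
  \approx \frac{200\ \text{MeV}}{0.03\ \text{eV}}
  = \frac{2 \times 10^{8}\ \text{eV}}{3 \times 10^{-2}\ \text{eV}}
  \approx 7 \times 10^{9}
```

a ratio of several billion, consistent with Meitner’s estimate. The released energy itself is accounted for by the small mass defect of the fission fragments, through Einstein’s relation \(E = \Delta m \, c^{2}\).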
A source of practically inconceivable power had been discovered, at the Kaiser-Wilhelm-Institut in Dahlem—not ten miles from Adolf Hitler’s Chancellery.
It was the winter of 1938-39. Dr. Meitner knew what such a power source might mean if developed in time for Nazi use—and, strange to say, so did Professor Hahn. The report on his experiments, which he prepared in those days for a scientific journal, was strictly confined to the chemistry of the matter without mentioning any possible energy release.
Outside, however, in the exclusive, international and inter-racial fraternity, things began to happen. In Copenhagen, Lise Meitner found Niels Bohr about to sail for the United States, for talks with Albert Einstein and others at the Institute for Advanced Study in Princeton. Dr. Meitner discussed her hypothesis of “atomic fission” with Dr. Frisch; together, aunt and nephew took it to Bohr who immediately arranged for them to repeat the Berlin experiment in his laboratory while he was overseas. He landed on January 16, 1939, and the news he carried spread swiftly among physicists. At Columbia University it came to the ears of Enrico Fermi, the neutron-bombardment pioneer and Nobel laureate of some weeks back, who had left his homeland with his Jewish wife when Mussolini embraced racism. Fermi and Professor Dunning of Columbia, an old friend of Hahn’s, prepared at once for another experimental check of Hahn’s discovery. But before they were ready, Fermi had to go to Washington for a conference on theoretical physics. Bohr was there, too, and during the conference received a letter from Frisch: the experiment had been repeated, the hypothesis of fission confirmed. Bohr announced the discovery to the physicists at the meeting; they were “as excited as children.”
Bohr and Fermi talked about the new development. It was a private, abstract discussion, between the dark-haired youngest Nobel laureate, with the classic Roman features, and the big, rubicund, slow-spoken Dane, the mentor of a scientific generation who had worn the highest title in the realm of knowledge for nearly two decades. The Italian speculated on the manner in which fission might occur. Could it be that as the atom split under the impact of the bombarding neutron, other neutrons were emitted? If so, they might in turn strike and split other atoms, which would emit further neutrons—was there a possibility of a “chain reaction”?
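Fermi’s speculation reduces to simple arithmetic: if, on average, k of the neutrons emitted by each fission go on to cause further fissions, the neutron population multiplies by k every generation. A toy sketch of that multiplication (the numbers are arbitrary illustrations, not period measurements):

```python
# Toy model of the multiplication behind a "chain reaction": each
# fission generation multiplies the neutron population by k, the
# average number of emitted neutrons that cause further fissions.
# k < 1: the reaction dies out; k > 1: geometric runaway.

def neutron_population(k: float, generations: int, start: float = 1.0) -> float:
    """Neutron population after the given number of fission generations."""
    population = start
    for _ in range(generations):
        population *= k  # every fissioning neutron is replaced by k
    return population

print(neutron_population(0.9, 50))  # subcritical: dwindles toward zero
print(neutron_population(2.0, 50))  # supercritical: past a quadrillion
```

The whole distinction between a controlled pile and a bomb lies in how far above 1 the factor k is held, and how fast the generations succeed one another.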
A conversation of two Nobel prizewinners, at the physicists’ conference in Washington on January 26, 1939, planted the germ that was to be the atomic bomb.
Fermi returned to New York. The Columbia test yielded unqualifiedly positive results; so did others, undertaken after the Washington meeting. A steady flow of papers on the subject began to appear in scientific journals, and some sensational accounts made their way into the daily press. Fermi was interviewed on the radio and asked how soon the world might be expected to blow up. A Saturday Evening Post writer told the story of Lise Meitner with sympathy and some mistakes, and wondered how soon we might “start the fires of atomic energy burning” in industry.
“At that time,” states the official report on the atomic-bomb project by H. D. Smyth, head of the Princeton physics department, “American-born nuclear physicists were so unaccustomed to the idea of using their science for military purposes that they hardly realized what needed to be done. Consequently the early efforts . . . were stimulated largely by a small group of foreign-born physicists. . . .”
They were about half a dozen Hungarian, German and Czechoslovak refugees, besides Fermi, the Italian. Even though it was still only 1939, they knew what was impending in Europe. Their first step, undertaken with Bohr’s aid, was an attempt to stop publication of further atomic data by voluntary agreement. The second step followed in March; Dean Pegram of Columbia put Fermi in touch with the U. S. Navy Department. The Navy “expressed interest and asked to be kept informed.” The headlines featured the visit of King George and Queen Elizabeth of England, and the Polish crisis.
The little group of foreign-born physicists was driven by a sense of desperate urgency. They took the matter to Albert Einstein, and Einstein gave them a letter to President Roosevelt. The man who saw the President, and whose tireless prodding kept the work going through the next difficult months, was Alexander Sachs of New York.
In September 1939 war broke out in Europe, and President Roosevelt appointed an “Advisory Committee on Uranium.” Its report, submitted in November, brought the first funds for experiments: $6,000.
In Europe Hitler was liquidating his Polish conquest and waging a “sitzkrieg” against Britain and France. December 10 saw the last Nobel Prize distribution. E. O. Lawrence of the University of California received the physics prize for inventing the cyclotron, the giant “atom-smashing” machine. Two other scientific awards were made to Germans who declined.
No peace prize was awarded.
In the spring Hitler invaded Norway and Denmark. Worldwide concern was felt by scientists over the fate of Bohr, who had returned to Copenhagen. Soon Swedish sources reported that he was working quietly at home. Lise Meitner was safe in Stockholm. From Berlin came secret reports that half the Kaiser-Wilhelm-Institut had been put to work on uranium.
Hitler invaded the West. France collapsed in June. Roosevelt set up the “National Defense Research Committee” with a sub-committee on uranium. Contracts eventually totaling $300,000 were let to sixteen different research institutions. Informal exchanges of information were started with the British, who had so far spent some $100,000.
In 1941 Hitler invaded the Balkans and Russia. America had already adopted the draft. Fermi and his assistants set up the first “uranium pile” at Columbia. All over the country scientists were at work on different aspects of the problem. A group under Compton at Chicago experimented with neutron production and one in California, under Lawrence, with the new radioactive element, plutonium; at Columbia Dunning’s team studied “gaseous diffusion,” and Urey’s, the “heavy water” which had won him a Nobel Prize. The Naval Research Laboratory worked on “thermal diffusion” and Princeton on “resonance absorption.”
In the late summer, NDRC Director Vannevar Bush laid everything before the President, who had just returned from his Atlantic Charter meeting with Churchill. Bush was cautious; he stressed that success could not be guaranteed, that staggering sums might be wasted. Roosevelt decided to go ahead. He ordered funds provided from a special source, and in the early fall wrote to Churchill, suggesting that the whole effort be conducted jointly by the scientists of both nations.
In the late fall Urey and Pegram were sent to England. They found the British, notably Chadwick, the neutron discoverer, convinced that an atomic bomb was feasible, and fearful that the Germans might get it first. Secret service reports showed the Germans hard at work on projects that could serve no other purpose. Filled for the first time with the same sense of urgency as their foreign-born colleagues, Pegram and Urey returned early in December—in the week that ended on Pearl Harbor Sunday.
On Wednesday Urey was honor guest at a dinner. It was December 10—the forty-fifth anniversary of Nobel’s death and the fortieth of the brilliant annual ceremony which could no longer take place in Stockholm. The New York American-Scandinavian Center had gathered some hundred persons at the Hotel Roosevelt to mark the date. Eight Nobel laureates spoke. One, the chemist Otto Meyerhof, told of a new meaning that the prizes had acquired in our time: a refugee with papers identifying him as a Nobel Prize-winner was always treated much better. Halfway through the dinner the guests heard that Germany had declared war on the United States.
The theme of Professor Urey’s speech had been “The Spirit of Nobel.” He had not mentioned the work to which he would return that night; but this work was more in the spirit of the great searcher for the prohibitively frightful weapon than any verbal tributes to his memory.
December 6, the day before Pearl Harbor, had brought the announcement that there was to be an “all out” effort. Urey was one of three “program chiefs.” The organization, now with unlimited funds, was supervised by Harvard President Conant. The chiefs met in “an atmosphere of enthusiasm and urgency.” In January 1942 it was decided to concentrate the “metallurgical project” in Chicago. Fermi’s Columbia group, the Princeton group and others moved there. The group from California included a young professor charged with studies of the “fast neutron” reaction needed for an actual bomb—Professor J. Robert Oppenheimer of the University of California.
In Chicago the first self-sustaining “chain reaction” was started in a uranium and graphite pile in the late fall. The pile was fitted with “control strips” of absorbing material, which were placed in “retard” position from the start. “This,” Dean Smyth dryly reported, “was fortunate, since the approach to the critical condition was found to occur at an earlier stage of assembly than had been anticipated.” The “critical condition” is the point at which the chain reaction sustains itself and, unchecked, runs away. Chicago seems to have had a tight squeeze in 1942.
By then the project had come under the War Department. A whole new district in the Engineer Corps, the Manhattan District, had been set up to handle it, and the word “Manhattan Project” had become a magic formula at WPB and other priority-granting agencies. DuPont had started to build the tremendous Clinton plant in Tennessee, and Los Alamos in the New Mexican desert, a site on a mesa that could be reached only by one winding mountain road, had been chosen for the “bomb laboratory.” Its director was to be Professor Oppenheimer.
“Oppy,” as everyone called him, was a graduate of New York’s Ethical Culture School and Harvard; his Manhattan cocktails were famous long before he became scientific director of the Manhattan Project. He was then still in his thirties, sensitive, somewhat dreamy, a man who collected Bach records and French Impressionists in a beautiful house overlooking the Golden Gate. His students worshipped him. His colleagues, not without some irony, talked of “Oppy’s disciples.”
But Secretary Stimson in 1945 would call success “largely due to his genius and the inspiration and leadership he has given to his associates.” The job required a man who would turn others into disciples.
Oppenheimer arrived in New Mexico in March 1943, with a dozen aides and three carloads of apparatus from the Princeton project to fill the most urgent needs. Equipment soon poured into the desert; in a matter of months Los Alamos became the best-equipped physics research laboratory in the world. The staff, too, grew by leaps and bounds, in numbers and in scientific luster. Enrico Fermi was one of its seven division-heads; Urey, Compton, Lawrence, Anderson came for longer or shorter periods, to work out special problems; an expeditionary force of British physicists was headed by Chadwick. There was a flurry of excitement when word arrived that Bohr was on his way: the Danish master had escaped from his occupied country in a fishing boat, was flown across the Atlantic in a bomber and came straight to New Mexico, his very presence in America a top military secret. The firmament of science had moved to Los Alamos, and its stars of first magnitude moved in courses set by young Professor Oppenheimer. The Bach devotee conducted the biggest scientific orchestra ever assembled for a full performance, smoothly and with authority. There was a universal feeling on the mesa: “Oppy knows best.”
In time, many who lived with the nascent agent of horror came to hope that they would fail. Hoping that the bomb would prove impossible, they still worked furiously—first, because they were scientists, and secondly, because only their own failure could prove that the enemy might not succeed. They did not fail. On July 16, 1945, the bomb was tested. At the main control post, six miles from the place where the bomb hung from a steel tower, a general kept his eyes on the director:
Dr. Oppenheimer grew tenser as the last seconds ticked off. He scarcely breathed. He held on to a post to steady himself. For the last few seconds he stared dreamily ahead and then when the announcer shouted, ‘Now!’ and there came this tremendous burst of light followed shortly thereafter by the deep growling roar of the explosion, his face relaxed into an expression of tremendous relief. . . .
But a few days later, Dr. Oppenheimer addressed a number of the young refugee physicists who had worked under him, and one of them wrote home: “He specifically stated that he would not say one word to alleviate the fears of those of us who might feel that we had actually done a terrible thing and indicated that this should remain a problem to be solved by our own consciences. He felt, though, that we owed a great deal to the people of this country and that, at least in the short-term view of things, we had to some extent paid our debt.”
Last August two atomic bombs wiped out a city apiece, and Japan begged to quit. World War II ended. But the bomb stayed on the minds of men, and now, as time goes on, seems apt to weigh more and more heavily on the minds of more and more men.
Albert Einstein, vacationing in the Adirondacks in August, bluntly said what Oppenheimer had merely suggested. Einstein had refused to work on the bomb; he denied that it was a triumph of science. It was an evil, a terrible thing which disgraced science.
Niels Bohr, in Copenhagen, calls for international control. For some time, similar calls have come from the American bombmakers. They made a thing which was to end war. It ended a war. What now?
Nobel’s theory is on trial. The “substance or machine” he wished he could produce is here. There can be no doubt, either, that the world is frightened. Is it frightened enough?
The scientists seem to have asked themselves this question. For in recent weeks they have begun to do things rarely, if ever, done by scientists before. They have banded together and launched educational campaigns, defied a rigid military censorship and insisted on testifying at Congressional hearings. They are now addressing lay meetings, writing for the daily press and preparing to press their case even more urgently upon the public: since the statesmen first seemed to back away from the intentions all of them professed directly after Hiroshima, the scientists have taken direct action to prevent the next atomic bomb from being dropped. They began by shattering the comfortable reliance on a “secret”; they continued by urging the establishment of a world government. It may yet happen that Nobel prize winners in science, whose work led to Nobel’s all-devastating weapon, will deserve and perhaps even earn a peace prize.
Nobel’s Prizes and the Atom Bomb