The Democratic Candidate wants to talk directly with the nation's enemies; his approach has a long and disastrous history.
In the never-ending torrent of attacks on the Bush administration’s conduct of American foreign policy, the President’s supposed neglect of the peaceful instruments of diplomacy has come in for especially pointed criticism. Bush has been faulted, for example, for his unwillingness to talk directly with Iran and to negotiate one-on-one with North Korea, for his crude attempts to “isolate” Syria, and for his failure to immerse himself in the Israeli-Palestinian peace process. The general line was encapsulated by the television news star Anderson Cooper last year in a leading question to Barack Obama:
In the spirit of . . . bold leadership would you be willing to meet . . . during the first year of your administration . . . with the leaders of Iran, Syria, Venezuela, Cuba, and North Korea, in order to bridge the gap that divides our countries?
Obama did not hesitate. “I would,” he replied. “And the reason is . . . that the notion that somehow not talking to countries is punishment to them—which has been the guiding diplomatic principle of this administration—is ridiculous.”
As is his wont, Obama later offered up a number of qualifications and clarifications of this statement. Electoral maneuvering aside, however, he stuck to his main position, which he seemed to treat as so obviously true as to be irrefutable. But is it? How valuable is diplomacy? Can it in fact “bridge the gap” with enemies by disclosing unsuspected common ground and thereby changing the equation between them and us? Has it ever done so?
The most frequently cited example of the power of transformative diplomacy is Richard Nixon’s trip to China in 1972. Yet, for all the stirring drama of the event, Nixon’s visit was itself made possible only by a preceding momentous development—namely, a radical turn in China’s perception of its own situation vis-à-vis the Soviet Union. As Henry Kissinger would put it, “Only extraordinary concern about Soviet purposes”—and about the large buildup of Soviet forces on the Chinese border—“could explain the Chinese wish to sit down with the nation heretofore vilified as the archenemy.” In addition, according to the Sinologist Jonathan Spence, Chinese leaders were desperate to expand domestic oil production, a task for which they had “neither the resources nor the technology.” There thus ensued, over a period of years, an exchange of hints, signals, and messages through third parties, leading first to secret talks between Kissinger and the Chinese leaders and finally to Nixon’s visit.
Of that visit, Kissinger would write:
I know of no presidential trip that was as carefully planned. . . . The voluminous briefing books . . . contained essays on the trip’s primary objectives and on all subjects of the agenda previously established. . . . They suggested what the Chinese position would be on each topic, and the talking points the President might follow. . . . Nixon . . . committed the talking points to memory and followed them meticulously in his meetings with Chou En-Lai.
The responses from Chou and Mao were no less scripted.
The visit to China, in other words, consecrated but did not cause a change in course by the two nations; it was a culmination of new thinking, not a catalyst of or occasion for it. And something analogous may be said of the other famous peacemaking summit of recent decades, namely, Egyptian President Anwar Sadat’s groundbreaking trip to Jerusalem in 1977.
Notwithstanding the surprise attack he launched on Israel in October 1973, Sadat had led Egypt away from the bankrupt “revolution” of his predecessor Gamal Abdel Nasser, including by expelling Soviet military forces and turning Egypt toward the West. In July 1977, in an interview on ABC TV, the Egyptian president signaled his intention to end the conflict with Israel. He reiterated the point in a speech to the Egyptian People’s Assembly in November: “I state in all seriousness that I am prepared to go to the end of the world—and Israel will be surprised to hear me tell you that I am ready to go to their home, to the Knesset itself.” Two days later, Prime Minister Menachem Begin responded in a televised speech addressed to the Egyptian public, saying that “it will be a pleasure to welcome and receive your president with the traditional hospitality you and we have inherited from our common father, Abraham.” Eight days later, Sadat arrived.
Long and tortuous negotiations, mediated by Jimmy Carter, lay ahead before this breakthrough would result in a peace treaty. If the visit itself constituted a watershed, this was due not to any of the words that passed between Egyptian and Israeli leaders but to the fact that Sadat, as a result of much thought and calculation of Egypt’s self-interest, had decided to address at long last the crux of the conflict: the refusal of the Arabs to recognize the existence of Israel. For him as for Chou and Mao, meeting his opposite numbers face to face was the result, not the cause, of the change in direction, with summitry serving as a confirming ritual.
Numerous other reversals and re-positionings have taken place without or in spite of extraordinary bouts of diplomacy. One of them was the ending of the cold war—the most monumental shift in modern times. Mikhail Gorbachev became the Soviet General Secretary in 1985 and met President Ronald Reagan later that year. They met again in 1986 and 1987 and twice in 1988, and Gorbachev met President George H.W. Bush in 1989. But neither in Gorbachev’s own accounts nor in those of outside observers can one find any evidence that these tête-à-têtes exercised a significant influence on his intellectual evolution—which was the real motive force behind the transformation and eventual abolition of the Soviet Union.
Gorbachev has described that inner journey as being driven by the wish to modernize and revitalize the country he ruled. By his third year in office he was ready to confess that the system gave him a “troubled conscience,” to speak positively of “pluralism,” and to exhort the Central Committee that “we need democracy like air.” The liquidation of the cold war was a by-product of his encounters with Soviet reality, not with American presidents.
It is actually difficult to think of any case where diplomacy has served to “bridge the gap” with a hostile or enemy nation. Which is hardly to say that such diplomacy has not been tried. Time and time again, American and other Western statesmen have undertaken strenuous diplomatic efforts at the highest levels in order to reach and change the minds of enemy leaders. As the history of cold-war summitry attests, the results have been at best trivial, at worst deleterious.
After World War II, the first conclave of Soviet with American (and British and French) heads of state was held in Geneva in July 1955, soon after the emergence of Nikita Khrushchev as General Secretary. The press enthused over “the spirit of Geneva,” but Charles Bohlen, then America’s ambassador to Moscow, called the outcome “disappointing and discouraging.” Within a year, Soviet tanks suppressed the Hungarian revolution and Khrushchev was threatening London and Paris with nuclear attack over the issue of Suez.
The first bilateral U.S.-Soviet summit took place in 1959 in the course of a visit by Khrushchev to the United States. His private conversations with President Dwight D. Eisenhower at Camp David were marked by cordiality and gave rise to talk of a “spirit of Camp David.” But in substance, as the Kremlinologist Adam Ulam noted, “the visit produced nothing beyond the predictable agreement to hold [another] summit meeting” of the Big Four. At the opening of that session, in May 1960, Khrushchev demanded an apology for the overflight of a U-2 spy plane that the Soviets had just downed. Ike refused, and the meeting ended before it began.
Summitry resumed in Vienna soon after the inauguration of John F. Kennedy. This time, it was the turn of the Soviet press to hail “the spirit of Vienna.” But to Bohlen, who was present as an adviser to the new President,
The results of the discussions were meager. . . . The Soviet leader assured Kennedy that the Soviet Union would resume nuclear testing only if another country did first. He must have lied, because in August the Soviet Union began a two-month series of blasts, ending with the detonation of the most powerful explosion ever set off.
The unhappy aftermath did not stop there. In his memoirs, Khrushchev wrote that he came away “generally pleased” at finding JFK to be “interested in . . . avoiding conflict with the Soviet Union.” So pleased, evidently, that two months later the Communists laid down barbed wire and began to erect the Berlin Wall.
Kennedy’s successor, Lyndon B. Johnson, met Khrushchev’s successor (or co-successor) Alexei Kosygin in 1967 in Glassboro, New Jersey; like clockwork, the American media invoked “the spirit of Glassboro.” So empty was this particular diplomatic exercise that no communiqué was issued at its conclusion. The following year, Warsaw Pact forces crushed Czechoslovakia’s “Prague Spring” and the Communists mounted their Tet offensive in Vietnam.
Far more ambitious talks were undertaken by President Nixon, aided by Henry Kissinger as his National Security Adviser. Nixon’s interlocutor was Leonid Brezhnev, who had elbowed aside Kosygin as the Kremlin’s top man, and the talks culminated in a 1972 Moscow summit that saw the signing of a Declaration of Basic Principles and the SALT I and other agreements. American critics, led by Senator Henry M. Jackson, protested that SALT I allowed the USSR an advantage in intercontinental missiles. Defenders argued that the importance of the agreement lay not in its impact on the two nations’ respective arsenals but rather in its symbolic function as a cornerstone of détente—one element, in Kissinger’s words, “of a larger decision to place relations on a new foundation of restraint, cooperation, and steadily evolving confidence.”
These high hopes were dashed a year later by the Soviet role in arming Egypt and Syria for their surprise attack on Israel in the Yom Kippur war and subsequent threat to intervene directly in the fighting. Over the next years, so great a mockery did the Kremlin make of the declaration that Nixon’s successor, Gerald Ford, forbade any further use of the term “détente.”
Jimmy Carter, for his part, renewed the pursuit of détente on terms even more conciliatory and self-effacing than those struck by Kissinger. In 1979, having stood by as a handful of countries in Africa and the Americas fell to Communism, Carter at last reached his holy grail, a new SALT agreement signed at a summit meeting at which the President eagerly hugged and kissed a visibly nonplussed Brezhnev. A few months later, Soviet forces invaded Afghanistan, and the U.S. Senate shelved the pact.
If 35 years of U.S.-Soviet summitry were barren of positive results, the outcomes of other high-level talks with Communist regimes were outright damaging. In 1973, Kissinger and North Vietnam’s negotiator Le Duc Tho shared the Nobel Peace Prize for an agreement committing both countries to withdraw their troops from South Vietnam. The Americans kept their side of the bargain; the North Vietnamese ignored theirs. In further violation, Hanoi launched an all-out offensive that completed its conquest of the south. Kissinger complained bitterly when Congress cut off support to the Saigon government—which, he said, would otherwise “not have collapsed in 1975.” He may well have been right, but this hardly redeemed the wisdom of the deal negotiated with the Communists.
Similarly with the “agreed framework” between the Clinton administration and North Korea in 1994. This was engineered by ex-President Carter, still sublimely confident of his diplomatic talents despite the dismal record of his own administration. Carter undertook a pilgrimage to North Korean dictator Kim Il Sung that eventuated in a fraudulent structure of disarmament behind which Pyongyang sped its development of nuclear weapons.
To appreciate the persistence of this drearily repetitive pattern, one is inevitably pushed back to earlier but still-emblematic efforts of Western leaders to talk with their enemies. In World War II, the Soviet dictator Joseph Stalin was not an enemy but an ally. Some in Washington and London vaguely foresaw the possibility of future conflict with the USSR—a conflict that Stalin on his side took as axiomatic. In hopes of securing the continued cooperation of Moscow following the war, both Franklin D. Roosevelt and Winston Churchill placed great store in their personal contacts with Stalin. After visiting the Soviet leader in Moscow in 1942, Churchill informed FDR: “I feel I have established a personal relationship that will be helpful.” In April 1943, the American ambassador in Moscow conveyed FDR’s wish to “sit down with Stalin and talk over problems” so as to avoid any “lack of understanding.” As the President wrote Churchill, “I think I can handle Stalin better than either your foreign office or my State Department. . . . He thinks he likes me better, and I intend to keep it that way.”
Stalin played on these illusions like a drum, maneuvering for Western assent to postwar Soviet rule over Eastern Europe while answering every appeal for moderation by referring to supposed hardliners in the Kremlin who were making it “impossible for me to fulfill your wish.” Stalin likewise ran circles around FDR when it came to the latter’s pet project of creating a structure through which the wartime allies would work together to uphold the postwar peace. He declined to agree to the idea of the UN until Roosevelt assured him in a private meeting that it would be headed by “four policemen”—the U.S., the UK, the USSR, and China—and that the burden of “policing” Europe would fall to Britain and the Soviet Union. Even then he continued to toy with Roosevelt, demanding that the USSR be given fifteen memberships in the UN (one for each Soviet republic) before finally settling for three.
At the final wartime summit, held in Potsdam in July 1945, America’s principal goal was to secure Soviet entry into the war against Japan. While the conference was under way, Harry Truman, who by then had succeeded FDR—and who today is justly remembered as the President who faced up to the Soviet threat and devised our counter-strategy of containment—received a coded message that an atomic weapon had been successfully tested. He shared the news with Stalin, who, according to Truman’s memoirs, said “that he was glad to hear it and hoped we would make good use of it against the Japanese.” On August 9, the day the second bomb was dropped, the Soviet Union descended like a vulture to devour what it could of Japan’s now-moribund empire. In the five days until Tokyo’s formal surrender, the Russians seized and occupied Manchuria and northern Korea, with consequences that plague us to this day.
Going back still further in time, we arrive at the disaster wrought by the democracies’ summitry with Hitler. The era of appeasement did not amount to a solitary moment of misjudgment by British Prime Minister Neville Chamberlain at Munich in 1938. Rather, it encompassed several years of diplomatic exertions by means of which the democracies attempted to cope with the looming threat from both the fascist and the Communist world.
As early as 1935, when Hitler defied the disarmament provisions of the Versailles Treaty, British Foreign Secretary John Simon hastened to Germany for talks with the Fuehrer. Ignoring Germany’s violations of the treaty, he agreed with Hitler on a new pact aimed at limiting naval forces. Hitler, who had not the slightest intention of abiding by the pact, would refer to the signing ceremony as “the happiest day of my life.” The French, meanwhile, sent Foreign Minister Pierre Laval to Moscow to sign a defense pact with Stalin, and to Rome to offer Benito Mussolini a “free hand” in Ethiopia. When Mussolini subsequently declared his aim of conquering Ethiopia, London dispatched Anthony Eden to try to buy him off by offering up a chunk of British Somaliland instead. The democracies’ avidity for talks with any and all potential enemies convinced Mussolini of their weakness; in late 1936 he announced the formation of a “Rome-Berlin Axis.”
The story of Munich itself has been too often told to bear repeating, but it is worth recalling that it was not a single meeting but a series of three, with each serving to embolden Hitler further. Not only would the democracies get the war they had hoped to avert, they would get it on the most disadvantageous terms. For in addition to delivering to Hitler the considerable industrial strength of Czechoslovakia, the pusillanimity of the West at Munich had an effect on Stalin similar to what earlier displays of accommodation had on Mussolini. Within months, the Soviet ambassador would visit the German foreign office in Berlin to initiate the conversations that would blossom into the Stalin-Hitler pact.
Whether confronting fascism in the 1930’s, or meeting with Stalin during World War II, Western leaders were played for fools, and their mistakes cost dearly. In their defense, it may be said that they were dealing with something new: messianic revolutionary ideologies and ambitions far more sweeping than those entertained by any conventional power, and men whose behavior was unconstrained by even the least trace of a common morality. Little in diplomatic history had prepared these democratic statesmen to deal with totalitarian dictators.
How then are we to judge our most practiced statesmen today who continue to indulge in similar follies with no such excuse? Two years ago, with America’s fortunes in Iraq at a low ebb, Congress formed an Iraq Study Group led by James Baker and Lee Hamilton and comprising former Senators, Secretaries of State and Defense, White House chiefs of staff, and a Supreme Court Justice. Concluding that the U.S. was losing the war, these eminences rejected “staying the course,” “precipitous withdrawal,” and sending more troops alike. To square the circle of escaping from Iraq without incurring an American defeat or surrender, they proposed instead a “robust diplomatic effort” involving Iran and Syria—two countries that allegedly “share[d] an interest in avoiding the horrific consequences that would flow from a chaotic Iraq, particularly a humanitarian catastrophe and regional destabilization.”
Let us review briefly what we know of these two suggested interlocutors. In the 1980’s, the Syrian Baathist regime annihilated 20,000 of its own citizens in order to suppress an Islamist uprising, while the Islamic Republic of Iran in its war with Iraq used waves of human beings as mine sweepers. Both regimes practice systematic torture, and both have reduced their prison populations by slaughtering hundreds of defenseless inmates. The two collaborate in arming Hamas and Hizballah and in turning Lebanon and Gaza into platforms for war. They are the principal conduits for moving, respectively, terrorists and advanced explosives into Iraq.
To imagine that a devotion to “humanitarian” concerns and regional “stability” would lead either of these regimes to set aside its oft-voiced resentments toward the “Great Satan” so as to pull our chestnuts from the fire is to indulge in illusions on the order of those that misled FDR at Yalta, Laval in Rome, and Chamberlain at Munich—without their excuse of historical inexperience. Fortunately, President Bush adopted instead a course specifically rejected by the Baker-Hamilton commission. He ordered the “surge,” and as a result, by all accounts, we began to turn the tide in Iraq. If we win the war, it will be in no small measure because the counsel of Baker-Hamilton was ignored.
Alas, however, the diplomatic temptation persists, and in the corridors of the Bush White House itself. Having spent, so it seems, its last reservoir of political capital on the surge, the administration has given ground to its critics in other arenas. It has resumed trying to reason with North Korea, sponsored a new round of Israeli-Palestinian negotiations, and most recently joined talks with Tehran even while that regime continued enriching uranium. All in all, the verdict on these exercises is likely to be the same as the verdict on all those preceding them: much ventured, little if anything gained, much potentially lost.
To say that talking with our enemies has more often done harm than good does not mean that we should always avoid it. But when we do speak, it is essential that we eschew the conceit that, whoever they may be and whatever their own purposes, it lies within our power to manipulate or seduce them into becoming friends or serving our interests. If they have truly undergone a metamorphosis and wish to revise their relations with us, they will find ways to let us know about it—as happened, despite early missed signals, both in the case of Mao and Chou and in that of Sadat. That is why it is preposterous to assert, as some continue to do, that at a certain moment in 2003 Tehran sent a message through a Swiss diplomat that it was ready to settle all of its differences with Washington—but, because the Bush administration failed to seize the opportunity, the moment was lost forever.
What is essential is to understand whom we are talking to. In particular, messianic revolutionary regimes operate in a moral universe whose values are antithetical to ours. Their goal in talking is virtually never to have better relations for their own sake, but to have the advantage of us. The fatal allure of transformative diplomacy is that, by means of summitry, the lions can be charmed not just into lying down with lambs but into becoming lambs. That goal is a chimera; it has never happened.
Finally, we must always remember that such regimes are in a state of permanent war with their own subjects, and that every measure of legitimation they receive from the outside serves to discourage those subjects’ hopes for freedom and impair their will to resist. Even the government of the Soviet Union, our superpower counterpart, was eager for the symbolism of being treated by America as an equal. How much more so would the power of the petty tyrants or would-be tyrants who rule Iran, Syria, Venezuela, Cuba, and North Korea—the countries with whose leaders Barack Obama expressed his readiness to meet and talk in the first year of his administration—be inflated by an audience with an American President. In this, especially, there is much to be lost—by ourselves and by those under their rule.
Should we talk with our enemies? Yes—to tell them what we think of them, and what we ourselves stand for. We should talk to them, that is, on our own terms and not theirs, and with their captive peoples in mind. But to the question that Anderson Cooper put to Senator Obama, the simple and correct answer was “No.” If Obama ever gains the presidency, the world will be safer if he has figured that out before he enters office.
Obama’s “Talking” Cure
John Cheney-Lippold, an associate professor of American Culture at the University of Michigan, has been the subject of withering criticism of late, but I’m grateful to him. Yes, he shouldn’t have refused to write a recommendation for a student merely because the semester abroad program she was applying to was in Israel. But at least he exposed what the boycott movement is about, aspects of which I suspect some of its blither endorsers are unaware.
We are routinely told, as we were by the American Studies Association, that boycott actions against Israel are “limited to institutions and their official representatives.” But Cheney-Lippold reminds us that the boycott, even if read in this narrow way, obligates professors to refuse to assist their own students when those students seek to participate in study abroad programs in Israel. Dan Avnon, an Israeli academic, learned years ago that the same goes for Israeli faculty members seeking to participate in exchange programs sponsored by Israeli universities. They, too, must be turned away regardless of their position on the Israeli-Palestinian conflict.
When the American Studies Association boycott of Israel was announced, over two hundred college presidents or provosts properly and publicly rejected it. But even they might not have imagined that the boycott was more than a symbolic gesture. Thanks to Professor Cheney-Lippold, they now know that it involves actions that disserve their students. Yes, Cheney-Lippold now says he was mistaken when he wrote that “many university departments have pledged an academic boycott against Israel.” But he is hardly a lone wolf in hyper-politicized disciplines like American Studies, Asian-American Studies, and Women’s Studies, whose professional associations have taken stands in favor of boycotting Israel. Administrators looking at bids to expand such programs should take note of their admirably open opposition to the exchange of ideas.
Cheney-Lippold, like other boycott defenders, points to the supposed 2005 “call of Palestinian civil society” to justify his singling out of Israel. “I support,” he says in comments to the student newspaper, “communities who organize themselves and ask for international support to achieve equal rights, freedom and to prevent violations of international law.” Set aside the absurdity of this reasoning (“Why am I not boycotting China on behalf of Tibet? Because China has been much more effective in stifling civil society!”). Focus instead on what Cheney-Lippold could have found out by Googling. The first endorser of the call of “civil society” is the Council of National and Islamic Forces (NIF) in Palestine, which includes Hamas, the Popular Front for the Liberation of Palestine, and other groups that trade not only in violent resistance but in violence that directly targets noncombatants.
That’s remained par for the course for the boycott movement. In October 2015, in the midst of the series of stabbings deemed “the knife intifada,” the U.S. Campaign for the Academic and Cultural Boycott of Israel shared a call for an International Day with the “new generation of Palestinians” then “rising up against Israel’s brutal, decades-old system of occupation.” To be sure, they did not directly endorse attacks on civilians, but they did issue their statement of solidarity with “Palestinian popular resistance” one day after four attacks that left three Israelis–all civilians–dead.
The boycott movement, in other words, can sign on to a solidarity movement that includes the targeting of civilians for death, but cannot sign letters of recommendation for their own undergraduates if those undergraduates seek to learn in Israel. That tells us all we need to know about the boycott movement. It was nice of Cheney-Lippold to tell us.
Convenience, wrote Columbia University law professor Tim Wu, is as much a force driving mankind’s political evolution as any of the more tangible and conspicuous factors that dominate daily life. Modern appliances have made the minor chores they replaced seem like unthinkably exacting burdens. Convenience, Wu wrote, trumps preference. Streaming video services, to name just one example, have rendered the rigors of the clock obsolete. But that convenience has also inured us to sacrifice. Viewers have re-learned how to sit passively through a 120-second commercial break, an imposition that was once anathema. It’s a small price to pay to enjoy our newfound freedom from waiting.
That cost may be modest, but it’s a reminder that everything comes with a price tag. The instant gratification associated with on-demand society has made America’s shared cultural moments a thing of the past. The explosion of online shopping has eliminated the time consumers wasted traveling from store to store, but physical retail is dying as a result. The modern public square and the daily human interactions that it encouraged will disappear along with it. Machine learning has the power to introduce a “more compassionate social contract” and reduce the physical risk associated with workplace hazards or lifestyle choices. But risk is just another word for freedom and, in the pursuit of convenience, we risk sacrificing our independence along with our hardships.
“We’re really reinventing the traditional insurance model with our vitality program,” said Marianne Harrison, the CEO of one of North America’s largest life insurers, John Hancock, in a recent appearance on CNBC. The beaming insurance executive boasted of her firm’s effort to marry a “technology-based wellness program” with an “insurance product.” That’s a loaded way of saying that this American insurer is soon going to charge based on the real-time monitoring of your daily activities. Behavior-based insurance will track the health data of policyholders through wearable devices or smartphones and distribute rewards based on individual choices. You don’t have to wear a tracking device to participate in this program—at least, not yet. Harrison assured skeptics that the firm could also dole out rewards to policyholders who take simple steps like reading preapproved literature, the consumption of which it presumably tracks.
This innovation is optional today, but the savings it yields both consumer and insurer guarantee that it will soon become a standard feature of the insurance landscape. Your freedom to eat poorly, use tobacco products, drink alcohol, or perform any number of physical activities that include varying levels of risk is not limited. You’ll just have to pay for it. And if Democratic policymakers succeed in nationalizing the private health insurance industry under the auspices of Medicare-for-all or single-payer or whatever other euphemisms they apply to the public confiscation of private property, these “tools” will only become more pervasive.
A similar rationale—the primacy of collective health—can be applied to any number of activities that invite unnecessary risk which technology can mitigate. Foremost among these is the terribly dangerous American habit of driving a car.
In 2017, there were over 40,000 automobile-related fatalities. This was the second consecutive year in which the roads were that deadly and, if observers who attribute this rate of fatal traffic accidents to an increase in smartphone ownership are correct, there will not be a decline anytime soon. A 2015 study purported to show that replacing manual vehicles with autonomous cars or vehicles with advanced driver-assistance systems could eliminate up to 90 percent of all fatal accidents and save as many as 300,000 American lives each decade. It is perhaps only a matter of time before the option to own a driverless vehicle becomes a mandate with a hefty financial penalty imposed on those who opt out.
“[T]he threat to individual freedom that the driverless car is set to pose is at this stage hard to comprehend,” wrote National Review’s Charles C.W. Cooke. Presently, the car transports its driver to wherever they’d like to go, whether there are roads to facilitate the journey or not. In a driverless world, as Cooke noted, the driver becomes a mere occupant. They must essentially ask the car for permission to transit from point A to point B, and the whole process is monitored and logged by some unseen authority. Furthermore, that transit could ostensibly be subject to the veto of state or federal authorities with the push of a button. That seems a steep price to pay for a little convenience and the promise of safety.
The pursuit of convenience, as Professor Wu explained, has resulted in remarkable social leveling. We enjoy more time today for the kind of “self-cultivation,” once only the province of the wealthy and aristocratic, than at any point in history. And yet, we cannot know true liberty without hardship. “The constellation of inconvenient choices may be all that stands between us and a life of total, efficient conformity,” Wu concluded.
There is more to celebrate in the technological revolutions of the last quarter-century than there is to lament. But in the pursuit of convenience, we’ve begun to make spontaneity irrational. In life, the rewards associated with experience are commensurate with that which is ventured. In a future in which the world’s sharp edges are bubble-wrapped, your life may exceed today’s average statistical length. But can you really call it living?
Podcast: Christine Rosen on Brett Kavanaugh.
The podcast welcomes COMMENTARY contributor and author Christine Rosen to the program to discuss the allegations against Supreme Court nominee Brett Kavanaugh. Have his confirmation hearings transformed into another chapter in the national cultural reckoning that is the #MeToo moment?
Justice both delayed and denied.
According to Senate Judiciary Committee Democrat Chris Coons, Dr. Christine Blasey Ford, the woman who has accused Judge Brett Kavanaugh of sexually assaulting her when she was a minor, did not want to come forward. In an eerie echo of Anita Hill’s public ordeal, her accusations were “leaked to the media.” With her confidentiality violated, Ford had no choice but to go public. Coons could not say where that leak came from, but he did confess that “people on committee staff” had access to the letter in which Ford made her allegations. Draw your own conclusions.
Though many observers insist that what we have witnessed since Ford’s allegations were made public is about justice, it’s hard to see any rectitude in this process. Ford has been transformed into a public figure apparently against her wishes. The details of the attack that Ford alleges are deeply disturbing, but they are not prosecutable. Her recollection of events from 36 years ago is understandably hazy, and her account is too vague to corroborate with much accuracy. She cannot recall the precise date or location of the attack. Contrary to the protestations of Senate Democrats like Kamala Harris, the FBI cannot get involved in a matter that is not within the federal government’s jurisdiction. And even if local authorities were inclined to involve themselves, the statute of limitations elapsed long ago.
With precious few facts available to congressional investigators and without the sobriety that public scrutiny in the age of social media abhors, the spectacle to which the nation is about to be privy is undoubtedly going to make things worse. A public hearing featuring both Ford and Kavanaugh will be a performative and political display, if it happens at all. It will be adorned with the trappings of courtroom proceedings but with none of the associated protections afforded accused and accuser alike. It will further polarize the nation such that, whether Kavanaugh is confirmed or not, public confidence in Congress and the Supreme Court will be severely damaged. And no matter what is said in that hearing, it is unlikely to change many minds.
Given the dearth of hard evidence, it is understandable that observers have begun to look to their own experiences to evaluate the veracity of Ford’s allegations. The Atlantic contributor Caitlin Flanagan is the author of a powerful and compelling example of this kind of work. Her essay, entitled “I Believe Her,” is important for a variety of reasons. Perhaps foremost among them is how she all but invalidates defenses of Kavanaugh that are based on the positive character references he’s assembled from former female acquaintances and ex-girlfriends. Flanagan was assaulted as a young woman, and her abuser—a man she says drove her to a suicidal depression similar to what Ford has described to her therapist—was not interested in a romantic relationship. CNN political commentator Symone Sanders, too, confessed that “there is no debate” in her mind as to Kavanaugh’s guilt, in part, because she was the victim of a sexual assault in college. The similarities between what she endured and what Ford says occurred are too hard for her to ignore.
These are harrowing stories, but they also reveal how little any of this has to do with Brett Kavanaugh anymore. For some, this has become a proxy battle in the broader cultural reckoning that began with the #MeToo moment. Quite unlike the many abusive men who were outed by this movement, though, the evidentiary standard being applied to Kavanaugh’s case is remarkably low. His innocence has not been presumed, and a preponderance of evidence has not been marshaled against him. It is not even clear as of this writing that Kavanaugh will be allowed to confront his accuser. At a certain point, honest observers must concede that getting to the truth has not been a defining feature of this process.
In the face of this adversity, there are some Republicans who are willing to sacrifice Kavanaugh’s nomination. Some appear to think that Kavanaugh’s troubles present them with an opportunity to advance their own political prospects and to promote a replacement nominee with whom they feel a closer ideological affinity. Others simply don’t want to risk standing by a tainted nominee. The stakes associated with a lifetime appointment to the Supreme Court are too high to confirm a justice with an asterisk next to his name—a justice who may tarnish future rulings on sensitive cases by association. Those Republicans are either capitulatory or craven.
Based on what we know now, Kavanaugh does not deserve an asterisk. Maybe he will tomorrow, but he doesn’t today. Those who would allow what is by almost all accounts an exemplary legal career to be destroyed by unconfirmable accusations or outright innuendo will not get a better deal down the line. Some Republicans are agnostic about Kavanaugh’s fate and believe that his being stopped will make room for a more doctrinaire conservative like Amy Coney Barrett. But they will not get their ideologically simpatico justice if they allow the defiling of the process by which she could be confirmed.
The experiences that Dr. Ford described are appalling. Even for those who are inclined to believe her account and think that she is due some restitution, no true justice can be meted out that doesn’t infringe on the rights of the accused. Those in the commentary class who would exact some karmic vengeance by treating Kavanaugh as a stand-in for every abuser who got away, every preppy white boy who benefited from unearned privilege, and every hypocritical conservative moralizer are not interested in justice. They want a political victory, even at the expense of the integrity of the American ideal. If there is a fight worth having, it’s the fight against that.