Three devastating blows in quick succession raise profound questions about the law’s life expectancy
Over the past few months, President Barack Obama’s health-care law has been subjected to three devastating blows. First was the electoral repudiation of the Democrats in November 2010, based in large part on Obama’s overreach when it came to both his political mandate and the boundaries of the Constitution. Second was the political repudiation of the new law by the newly Republican-dominated House of Representatives, which voted 245-189 in January to repeal the bill—a margin 49 votes greater than that of the original vote to pass it (followed by a close, but failed, Senate vote). The third repudiation came from the federal judiciary, in two stages that sandwiched the House vote. The first was the ruling by Judge Henry Hudson of Virginia in December; the second was the ruling by Judge Roger Vinson of Florida in late January. Both held that there is no constitutional basis for the legislative imposition of a mandate that would require every American to purchase a health-care policy.
The electoral repudiation was a shock to the political system; it sent the message that, when it comes to far-reaching overhauls of America, voters prefer consensus and discussion to single-party legislative impositions. This nationwide repudiation was not surprising in the least to anyone outside the White House or the previous congressional leadership. Portents of electoral disaster had begun appearing on the political horizon more than a year before the 2010 elections. Gubernatorial triumphs for Republican candidates in New Jersey and Virginia in 2009 should have given the Obama team at least an inkling of the unrest it would be facing over its political tactics and policies. Nevertheless, those who chose to whistle past the electoral graveyard argued that both New Jersey and Virginia had had Republican governors before and shook off the 2009 election defeats as not particularly noteworthy.
It was the Scott Brown election in January 2010 that first signaled to the White House, and the rest of the political establishment, that something unprecedented was going on. Brown’s victory, in which a Republican in Massachusetts won the seat that had belonged to liberal lion Ted Kennedy, not only flummoxed the administration politically but also put it on a dangerous policy and procedural path. For Brown had explicitly run as an opponent of the health-care bill on a promise that he would deny Democrats the sixtieth vote they would need to overcome a Republican filibuster in the Senate against the bill. Without that all-important sixtieth vote, Democrats were no longer able to jam through their initiative in a strictly partisan way, as they could do in the majority-run House.
What this meant was that Democrats would ultimately have to resort to a strategy that would require the Senate and then the House to pass bills with the exact same language, thus obviating the appointment of a conference committee, which is usually needed to fix technical glitches or policy disagreements that arise when the two bodies consider similar legislation. The Democrats did exactly that, and after Senate passage in March 2010, House passage of the same bill was preordained. This legislative approach, with its attendant inability to correct or amend the bill, would become relevant in the court cases regarding the bill down the road, about which more later.
Once the Brown election took place, it became clear to everyone except perhaps Obama that there was going to be a changing of the guard in the U.S. House of Representatives, and possibly even in the Senate. (Obama once said about the prospects for health reform under his leadership, “Well, the big difference here and in ’94 was you’ve got me.”) Obama was warned repeatedly, over a period of more than 15 months, by members of his own party about the political havoc he was wreaking. He chose to soldier on, saying discordant things to his various messengers, such as “Let’s keep fighting” or “I feel lucky.”
Luck, as it turned out, was not with him politically. The Republicans decisively took the House, gaining more than 60 seats. Republicans took control of 29 governorships and gained 680 state legislative seats. The GOP also gained six seats in the Senate, setting up the Republicans for a likely Senate takeover in 2012, for in that year Democrats will be effectively defending 23 seats to the Republicans’ 10. Anticipating the ugliness of 2012 for the Democrats, Senator Kent Conrad of North Dakota, who has long been more liberal than his very red state, has already announced his retirement. More Democratic senators are likely to follow him out, either via preemptive retirement or electoral defeat.
The political upheaval of the November election has had a number of effects on the prospects for the Obama health-care bill. First, it shattered the air of inevitability regarding the law’s implementation. The Democrats had quite clearly counted on the notion that the bill, once passed, would be accepted by the body politic and that the nation would wordlessly move on to the next issue. There were sound historical reasons for thinking this way, as repeal of existing law is relatively rare, limited to instances of gross error, such as slavery; overreach, such as Prohibition; and the occasional misfire, such as 1988’s Medicare Catastrophic Coverage Act, which was repealed in 1989.
Second, the elections put the Democrats on the defensive on their own signature initiative. They appear to have been unprepared for the intellectual fight they encountered over the bill. As New York Democrat Anthony Weiner told the Washington Post: “Republicans want another debate about health-care reform? Well, so should Democrats. They beat us in round one with lies and scare tactics. We welcome a second shot.” Weiner may comfort himself that the Democrats lost via “lies and scare tactics,” the definition of which reasonable people can disagree about, but there is no gainsaying his admission that the Democrats lost the battle of words in the health-care debate. The November election showed definitively, even to President Obama, how badly the Democrats had miscalculated the level of opposition to their health-care plan.
Since November, the Democrats have been forced to face some legislative repercussions of their electoral defeat. The House repeal vote was the most overt act thus far, and the embarrassing nature of the rebuke is significant even if it remains unlikely that the repeal effort will reach President Obama’s desk. On February 2, despite efforts by Majority Leader Harry Reid to prevent the repeal from coming to the floor, Republican leader Mitch McConnell did manage a Senate vote on the question. The repeal measure failed along party lines, 47-51.
But even though the Republicans failed to pass the repeal in the Senate, the vote on the Senate floor put Democrats running for re-election in 2012 from solidly Republican or leaning-Republican states—I count 12 of them—on record against repeal, exacerbating their looming political defeat. As Republican Senator Orrin Hatch said after the vote, “Yes, we were unsuccessful today, but we do know where everybody stands.”
The House vote and the way in which it forced Democrats to hold a close Senate follow-up vote do not convey the full potency of the legislative powers vested in the new House majority. Securing a House majority gives the Republicans the capacity to hold hearings, run investigations, and call witnesses, powers that are having an impact.
Budget Committee Chairman Paul Ryan recently held a hearing on the fiscal consequences of the bill in which he called former Bush appointees James Capretta and Dennis Smith as witnesses for the majority, two critics who were unlikely to have been called when the Democrats were in charge. In addition, as the “administration witness,” Ryan called Richard Foster, chief actuary of the Centers for Medicare and Medicaid Services (CMS), who turned heads with his statement that the argument that the bill will bring down costs is “false, more so than true.”
This is part of what House leadership aides call continuing to “build the case” for reform, which they are going to continue to do for the rest of this Congress. Such a strategy is all the more dangerous for a Democratic establishment that lost the debate about health care even when it had the speakership and thus the ability to control the House’s agenda.
In addition, the mere presence of the new House majority has already forced the administration to downsize some of its more ambitious endeavors. The administration had begun to build a new and separate Office of Consumer Information and Insurance Oversight and brought in a number of critics of the insurance industry to run the operation. Some Republican congressmen raised questions about the authority of HHS to create this new entity, but HHS saw little need to react to these concerns while the Democrats remained in the majority and for the most part did not engage in oversight of the Obama administration. Once the Republicans gained the majority, however, HHS quickly began what a press release by its chief, Kathleen Sebelius, called the “transition of the Office of Consumer Information and Insurance Oversight (OCIIO) to the Center for Consumer Information and Insurance Oversight (CCIIO) at CMS.” What all this acronyming means is that she was preemptively reining in the new entity.
These political and legislative changes, while significant, are far less important than the potentially devastating results of the separate lawsuits filed by state attorneys general against the bill. The first decision, issued by Judge Henry Hudson in response to a suit filed by the state of Virginia, held that the so-called individual mandate—the requirement that every American purchase health insurance by law—was unconstitutional. But Hudson stopped there: while he struck down the mandate, he ruled that the rest of the law should remain in place.
Liberals were unhappy with the decision, to be sure, and were quick to point out it was only one decision and that Hudson was a Republican appointee. Even so, they were relatively restrained in their objections, perhaps because they understood that Hudson’s decision itself was relatively restrained. Liberal blogger Ezra Klein, for example, said he found it “a far cry from a world in which the Supreme Court strikes down the whole of the health-care law.”
If this were Shakespeare, Klein’s comment would have been followed by the stage direction “Enter Vinson.” On January 31, Judge Roger Vinson issued a 78-page judicial avalanche of a decision in a separate case brought by 26 states acting in concert, citing The Federalist Papers, Madison, Hamilton, and the Constitution as part of his reasoning. In his opinion, Judge Vinson agreed with Judge Hudson and with the states “that Congress exceeded the bounds of its authority in passing the Act with the individual mandate.”
But he went further, overturning not just the mandate but the entire bill. Using the metaphor of a finely crafted watch, he determined that ObamaCare “has approximately 450 separate pieces, but one essential piece (the individual mandate) is defective and must be removed.” He made the point even sharper by noting that it was the Obama administration that had argued—“at least 14 times in its motion”—how crucial the mandate was to the working of the law. He also made clear that “this case is not about whether the Act is wise or unwise legislation. It is about the Constitutional role of the federal government.”
The drama-within-the-drama here was the political decision by Obama and congressional Democrats to ram through a flawed piece of legislation without the American-style debate and consensus-building that identifies problems and irons out kinks. In fact, according to Timothy Jost of Washington and Lee University, “Everyone expected that there would be a conference committee that would straighten out the issues in the bill. And then there was Scott Brown’s election.” Brown’s election meant that the Democrats would pursue an approach that did not include a “severability clause”—a provision that allows a law to survive should a part of it become invalidated. They did not have to take this approach. Jonathan Turley of George Washington University, a supporter of the health law, lamented that “the bill unnecessarily triggered the constitutional fight that led to its rejection in two federal courts.”
Vinson’s ruling means that even before a potential Supreme Court decision at some point in 2012, there is a good case to be made that the implementation process for the bill should come to a halt in the meantime. The Wall Street Journal’s Janet Adamy reported that “David Rivkin, an attorney for the plaintiffs, said the ruling meant the 26 states challenging the law must halt implementation of pieces that apply to states and certain small businesses represented by plaintiffs.”
Florida, for one, has said that it will shut down implementation efforts. A number of other states, including Wisconsin, are considering what applying the Rivkin view would mean. When Adamy brought this inconvenient aspect of the decision to the Obama White House press shop for comment, she was told, anonymously, that “we will continue to operate as we have previously.” Of course, it was this very “damn the torpedoes, full speed ahead” approach that has caused so much political and now legal trouble for the White House.
These two decisions, by Hudson and now Vinson, constituted the third heavy body blow in short order to ObamaCare. For many people, this new line of assault seemed to have come out of nowhere, and perhaps it had. The New York Times’s Kevin Sack observed that Vinson’s “ruling was nonetheless striking given that only nine months ago, prominent law professors were dismissing the constitutional claims as just north of frivolous.” Clearly, a lot had changed for the law in the nine months since it had passed.
Still, the White House tried to minimize the blow by calling it anomalous. As Stephanie Cutter wrote on the White House blog, “This ruling is well out of the mainstream of judicial opinion.” An anonymous White House official told the Washington Times that “there’s something thoroughly odd and unconventional about the analysis.” These adjectival responses prompted the Washington Post’s blogger Jennifer Rubin to observe: “They are truly nonplussed, and so they vamp, not with reasoned analysis but with an outpouring of adjectives.”
Writers and experts on the left now recognize the danger these judicial decisions pose to what they perceive as the relentless march of progress. The New Republic’s Jonathan Cohn worries that “health-care repeal litigation is fundamentally more threatening than attempts to repeal the law through Congress.” Even if the Supreme Court does not issue a ruling as sweeping as Vinson’s, Ezra Klein has noted with concern the distinct possibility “that Vinson’s ruling will make Hudson’s ruling seem more modest and appealing.” By flanking Hudson, Vinson made it possible for the unpredictable Justice Anthony Kennedy to appear to split the difference if he votes against the mandate but does not invalidate the entire bill.
This is good news for ObamaCare’s opponents. If Kennedy were to do this, and the Supreme Court were to rule against the mandate but not invalidate the law, the health-care law would still be dealt a mortal blow. The loss of the mandate would upset the balance the Democrats tried to impose between new impositions on insurers and the promise to those same insurers of healthy consumers who would be forced to purchase their products. If the individual mandate is gone, the law essentially dies as well.
Ironically, the challenge to the mandate might have the effect of pushing the entire debate among Democrats further leftward, since the only solution if the mandate were to go away may be a single-payer system of the sort used in Canada. Such a system would likely not be unconstitutional, which isn’t the same as saying it could secure passage through Congress. It couldn’t; single-payer is not politically feasible in the United States. In short, as the administration argued and, to their consternation, Vinson ruled, the mandate is the key, and the law cannot work without it.
So it is that the new law and the entire liberal vision for American health care have suffered three devastating blows in quick succession: the electoral repudiation in November, the political repudiation represented by the victorious repeal vote in the House, and the decisions by Hudson and now Vinson rejecting the constitutional basis for the mandate. Each has weakened the case for the new law and its very existence, and together they have fostered the growing sense that ObamaCare may, in fact, be unsustainable, unworkable, and unwanted.
While the bill’s political and legislative vulnerabilities have increased, our constitutional structure is such that a law, once passed, is not easy to repeal. At the same time, there are other avenues for redress here, as the Vinson decision made clear. Our political process depends on the Constitution, and the Constitution has a key role to play in the preservation of liberty and equality under the law. Our founding document was treated too cavalierly by Democrats in their rush to pass the legislation. The Vinson decision, by rightfully elevating the Constitution and homing in on the unconstitutional nature of the individual mandate, now appears to have the best chance of bringing about the process that will administer ObamaCare’s deserved and eventual coup de grâce.
A foreign-policy approach based in security and pragmatism is now characterized by retrenchment and radicalism
Realism, in short, is currently in crisis.
Realism was once a sophisticated intellectual tradition that represented the best in American statecraft. Eminent Cold War realists were broadly supportive of America’s postwar internationalism and its stabilizing role in global affairs, even as they stressed the need for prudence and restraint in employing U.S. power. Above all, Cold War–era realism was based on a hard-earned understanding that Americans must deal with the geopolitical realities as they are, rather than retreat to the false comfort provided by the Atlantic and Pacific oceans.
More recently, however, those who call themselves realists have lost touch with this tradition. Within academia, realism has become synonymous with a preference for radical retrenchment and the deliberate destruction of arrangements that have fostered international stability and prosperity for decades. Within government, the Trump administration appears to be embracing an equally misguided version of realism—an approach that masquerades as shrewd realpolitik but is likely to prove profoundly damaging to American power and influence. Neither of these approaches is truly “realist,” as neither promotes core American interests or deals with the world as it really is. The United States surely needs the insights that an authentically realist approach to global affairs can provide. But first, American realism will have to undergo a reformation.
The Realist Tradition
Realism has taken many forms over the years, but it has always been focused on the imperatives of power, order, and survival in an anarchic global arena. The classical realists—Thucydides, Machiavelli, Hobbes—considered how states and leaders should behave in a dangerous world in which there was no overarching morality or governing authority strong enough to regulate state behavior. The great modern realists—thinkers and statesmen such as Reinhold Niebuhr, Hans Morgenthau, George Kennan, and Henry Kissinger—grappled with the same issues during and after the catastrophic upheaval that characterized the first half of the 20th century.
They argued that it was impossible to transcend the tragic nature of international politics through good intentions or moralistic maxims, and that seeking to do so would merely empower the most ruthless members of the international system. They contended, on the basis of bitter experience, that aggression and violence were always a possibility in international affairs, and that states that desired peace would thus have to prepare for war and show themselves ready to wield coercive power. Most important, realist thinkers tended to place a high value on policies and arrangements that restrained potential aggressors and created a basis for stability within an inherently competitive global environment.
For this very reason, leading Cold War–era realists advocated a robust American internationalism as the best way of restraining malevolent actors and preventing another disastrous global crack-up—one that would inevitably reach out and touch the United States, just as the world wars had. Realist thinkers understood that America was uniquely capable of stabilizing the international order and containing Soviet power after World War II, even as they disagreed—sometimes sharply—over the precise nature and extent of American commitments. Moreover, although Cold War realists recognized the paramount role of power in international affairs, most also recognized that U.S. power would be most effective if harnessed to a compelling concept of American moral purpose and exercised primarily through enduring partnerships with nations that shared core American values. “An idealistic policy undisciplined by political realism is bound to be unstable and ineffective,” the political scientist Robert Osgood wrote. “Political realism unguided by moral purpose will be self-defeating and futile.” Most realists were thus sympathetic to the major initiatives of postwar foreign policy, such as the creation of U.S.-led military alliances and the cultivation of a thriving Western community composed primarily of liberal democracies.
At the same time, Cold War realists spoke of the need for American restraint. They worried that America’s liberal idealism, absent a sense of limits, would carry the country into quixotic crusades. They thought that excessive commitments at the periphery of the global system could weaken the international order against its radical challengers. They believed that a policy of outright confrontation toward the Kremlin could be quite dangerous. “Absolute security for one power means absolute insecurity for all others,” Kissinger wrote. Realists therefore advocated policies meant to temper American ambition and the most perilous aspects of superpower competition. They supported—and, in Kissinger’s case, led—arms-control agreements and political negotiations with Moscow. They often objected to America’s costliest interventions in the Third World. Kennan and Morgenthau were among the first mainstream figures to go public with opposition to American involvement in Vietnam (Morgenthau did so in the pages of Commentary in May 1962).
During the Cold War, then, realism was a supple, nuanced doctrine. It emphasized the need for balance in American statecraft—for energetic action blended with moderation, for hard-headed power politics linked to a regard for partnerships and values. It recognized that the United States could best mitigate the tragic nature of international relations by engaging with, rather than withdrawing from, an imperfect world.
This nuance has now been lost. Academics have applied the label of realism to dangerous and unrealistic policy proposals. More disturbing and consequential still, the distortion of realism seems to be finding a sympathetic hearing in the Trump White House.
Realism as Retrenchment
Consider the state of academic realism. Today’s most prominent self-identified realists—Stephen Walt, John Mearsheimer, Barry Posen, and Christopher Layne—advocate a thoroughgoing U.S. retrenchment from global affairs. Whereas Cold War realists were willing to see the world as it was—a world that required unequal burden-sharing and an unprecedented, sustained American commitment to preserve international stability—academic realists now engage in precisely the wishful thinking that earlier realists deplored. They assume that the international order can essentially regulate itself and that America will not be threatened by—and can even profit from—a more unsettled world. They thus favor discarding the policies that have proven so successful over the decades in providing a congenial international climate.
Why has academic realism gone astray? If the Cold War brokered the marriage between realists and American global engagement, the end of the Cold War precipitated a divorce. Following the fall of the Soviet Union, U.S. policymakers continued to pursue an ambitious global agenda based on preserving and deepening both America’s geopolitical advantage and the liberal international order. For many realists, however, the end of the Cold War removed the extraordinary threat—an expansionist USSR—that had led them to support such an agenda in the first place. Academic realists argued that the humanitarian interventions of the 1990s (primarily in the former Yugoslavia) reflected capriciousness rather than a prudent effort to deal with sources of instability. Similarly, they saw key policy initiatives—especially NATO enlargement and the Iraq war of 2003—as evidence that Washington was no longer behaving with moderation and was itself becoming a destabilizing force in global affairs.
These critiques were overstated, but not wholly without merit. The invasion and occupation of Iraq did prove far costlier than expected, as the academic realists had indeed warned. NATO expansion—even as it successfully promoted stability and liberal reform in Eastern Europe—did take a toll on U.S.–Russia relations. Having lost policy arguments that they thought they should have won, academic realists decided to throw the baby out with the bathwater, calling for a radical reformulation of America’s broader grand strategy.
The realists’ preferred strategy has various names—“offshore balancing,” “restraint,” etc.—but the key components and expectations are consistent. Most academic realists argue that the United States should pare back or eliminate its military alliances and overseas troop deployments, going back “onshore” only if a hostile power is poised to dominate a key overseas region. They call on Washington to forgo costly nation-building and counterinsurgency missions overseas and to downgrade if not abandon the promotion of democracy and human rights.
Academic realists argue that this approach will force local actors in Europe, the Middle East, and East Asia to assume greater responsibility for their own security, and that the United States can manipulate—through diplomacy, arms sales, and covert action—the resulting rivalries and conflicts to prevent any single power from dominating a key region and thereby threatening the United States. Should these calculations prove faulty and a hostile power be poised to dominate, Washington can easily swoop in to set things aright, as it did during the world wars. Finally, if even this calculation were to prove faulty, realists argue that America can ride out the danger posed by a regional hegemon because the Atlantic and Pacific Oceans and America’s nuclear deterrent provide geopolitical immunity against existential threats.
Today’s academic realists portray this approach as hard-headed, economical strategy. But in reality, it represents a stark departure from classical American realism. During the Cold War, leading realists placed importance on preserving international stability and heeded the fundamental lesson of World Wars I and II—that the United States, by dint of its power and geography, was the only actor that could anchor international arrangements. Today’s academic realists essentially argue that the United States should dismantle the global architecture that has undergirded the international order—and that Washington can survive and even thrive amid the ensuing disorder. Cold War realists helped erect the pillars of a peaceful and prosperous world. Contemporary academic realists advocate tearing down those pillars and seeing what happens.
The answer is “nothing good.” Contemporary academic realists sit atop a pyramid of faulty assumptions. They assume that one can remove the buttresses of the international system without that system collapsing, and that geopolitical burdens laid down by America will be picked up effectively by others. They assume that the United States does not need the enduring relationships that its alliances have fostered, and that it can obtain any cooperation it needs via purely transactional interactions. They assume that a world in which the United States ceases to promote liberal values will not be a world less congenial to America’s geopolitical interests. They assume that revisionist states will be mollified rather than emboldened by an American withdrawal, and that the transition from U.S. leadership to another global system will not unleash widespread conflict. Finally, they assume that if such upheaval does erupt, the United States can deftly manage and even profit from it, and that America can quickly move to restore stability at a reasonable cost should it become necessary to do so.
The founding generation of American realists had learned not to indulge in the wishful thinking that the international order would create or sustain itself, or that the costs of responding to rampant international disorder would be trivial. Today’s academic realists, by contrast, would stake everything on a leap into the unknown.
For many years, neither Democratic nor Republican policymakers were willing to make such a leap. Now, however, the Trump administration appears inclined to embrace its own version of foreign-policy realism, one that bears many similarities to—and contains many of the same liabilities as—the academic variant. One of the least academic presidents in American history may, ironically, be buying into some of the most misguided doctrines of the ivory tower.
Any assessment of the Trump administration must remain somewhat provisional, given that Donald Trump’s approach to foreign policy is still a work in progress. Yet Trump and his administration have so far taken multiple steps to outline a three-legged-stool vision of foreign policy that they explicitly describe as “realist” in orientation. Like modern-day academic realism, however, this vision diverges drastically from the earlier tradition of American realism and leads to deeply problematic policy.
The first leg is President Trump’s oft-stated view of the international environment as an inherently zero-sum arena in which the gains of other countries are America’s losses. The post–World War II realists, by contrast, believed that the United States could enjoy positive-sum relations with like-minded nations. Indeed, they believed that America could not enjoy economic prosperity and national security unless its major trading partners in Europe and Asia were themselves prosperous and stable. The celebrated Marshall Plan was high-mindedly generous in the sense of addressing urgent humanitarian needs in Europe, yet policymakers very much conceived of it as serving America’s parochial economic and security interests at the same time. President Trump, however, sees a winner and loser in every transaction, and believes—with respect to allies and adversaries alike—that it is the United States that generally gets snookered. The “reality” at the core of Trump’s realism is his stated belief that America is exploited “by every nation in the world virtually.”
This belief aligns closely with the second leg of the Trump worldview: the idea that all foreign policy is explicitly competitive in nature. Whereas the Cold War realists saw a Western community of states, President Trump apparently sees a dog-eat-dog world where America should view every transaction—even with allies—on a one-off basis. “The world is not a ‘global community’ but an arena where nations, nongovernmental actors and businesses engage and compete for advantage,” wrote National Security Adviser H.R. McMaster and National Economic Council Director Gary Cohn in an op-ed. “Rather than deny this elemental nature of international affairs, we embrace it.”
To be sure, Cold War realists were deeply skeptical about “one worldism” and appeals to a global community. But still they saw the United States and its allies as representing the “free world,” a community of common purpose forged in the battle against totalitarian enemies. The Trump administration seems to view U.S. partnerships primarily on an ad hoc basis, and it has articulated something akin to a “what have you done for me lately” approach to allies. The Cold War realists—who understood how hard it was to assemble effective alliances in the first place—would have found this approach odd in the extreme.
Finally, there is the third leg of Trump’s “realism”: an embrace of amorality. President Trump has repeatedly argued that issues such as the promotion of human rights and democracy are merely distractions from “winning” in the international arena and a recipe for squandering scarce resources. On the president’s first overseas trip to the Middle East in May, for instance, he promised not to “lecture” authoritarian countries on their internal behavior, and he made clear his intent to embrace leaders who back short-term U.S. foreign-policy goals no matter how egregious their violations of basic human rights and political freedoms. Weeks later, on a visit to Poland, the president did speak explicitly about the role that shared values played in the West’s struggle against Communism during the Cold War, and he invoked “the hope of every soul to live in freedom.” Yet his speech contained only the most cursory reference to Russia—the authoritarian power now undermining democratic governance and security throughout Europe and beyond. Just as significant, Trump failed to mention that Poland itself—until a few years ago, a stirring exemplar of successful transition from totalitarianism to democracy—is today sliding backwards toward illiberalism (as are other countries within Europe and the broader free world).
At first glance, this approach might seem like a modern-day echo of Cold War debates about whether to back authoritarian dictators in the struggle against global Communism. But, as Jeane Kirkpatrick explained in her famous 1979 Commentary essay “Dictatorships and Double Standards,” and as Kissinger himself frequently argued, Cold War realists saw such tactical alliances of convenience as being in the service of a deeper values-based goal: the preservation of an international environment favoring liberty and democracy against the predations of totalitarianism. Moreover, they understood that Americans would sustain the burdens of global leadership over a prolonged period only if motivated by appeals to their cherished ideals as well as their concrete interests. Trump, for his part, has given only faint and sporadic indications of any appreciation of the traditional role of values in American foreign policy.
Put together, these three elements have profound, sometimes radical, implications for America’s approach to a broad range of global issues. Guided by this form of realism, the Trump administration has persistently chastised and alienated long-standing democratic allies in Europe and the Asia-Pacific and moved closer to authoritarians in Saudi Arabia, China, and the Philippines. The president’s body language alone has been striking: Trump’s summits have repeatedly showcased conviviality with dictators and quasi-authoritarians and painfully awkward interactions with democratic leaders such as Germany’s Angela Merkel. Similarly, Trump has disdained international agreements and institutions that do not deliver immediate, concrete benefits for the United States, even if they are critical to forging international cooperation on key issues or advancing longer-term goods. As Trump has put it, he means to promote the interests of Pittsburgh, not Paris, and he believes that those interests are inherently at odds with each other.
To be fair, President Trump and his proxies do view the war on terror as a matter of defending both American security interests and Western civilization’s values against the jihadist onslaught. This was a key theme of Trump’s major address in Warsaw. Yet the administration has not explained how this civilizational mindset would inform any other aspect of its foreign policy—with the possible exception of immigration policy—and resorts far more often to the parochial lens of nationalism.
The Trump administration seems to be articulating a vision in which America has no lasting friends, little enduring concern with values, and even less interest in cultivating a community of like-minded nations that exists for more than purely deal-making purposes. The administration has often portrayed this as clear-eyed realism, even invoking the founding father of realism, Thucydides, as its intellectual lodestar. This approach does bear some resemblance to classical realism: an unsentimental approach to the world with an emphasis on the competitive aspects of the international environment. And insofar as Trump dresses down American allies, rejects the importance of values, and focuses on transactional partnerships, his version of realism has quite a lot in common with the contemporary academic version.
Daniel Drezner of Tufts University has noted the overlap, declaring in a Washington Post column, “This is [academic] realism’s moment in the foreign policy sun.” Randall Schweller of Ohio State University, an avowed academic realist and Trump supporter, has been even more explicit, noting approvingly that “Trump’s foreign-policy approach essentially falls under the rubric of ‘off-shore balancing’” as promoted by ivory-tower realists in recent decades.
Yet one suspects that the American realists who helped create the post–World War II order would not feel comfortable with either the academic or Trumpian versions of realism as they exist today. For although both of these approaches purport to be about power and concrete results, both neglect the very things that have allowed the United States to use its power so effectively in the past.
Both the academic and Trump versions of realism ignore the fact that U.S. power is most potent when it is wielded in concert with a deeply institutionalized community of like-minded nations. Alliances are less about addition and subtraction—the math of the burden-sharing emphasized by Trump and the academic realists—and more about multiplication, leveraging U.S. power to influence world events at a fraction of the cost of unilateral approaches. The United States would be vastly less powerful and influential in Europe and Central Asia without NATO; it would encounter far greater difficulties in rounding up partners to wage the ongoing war in Afghanistan or defeat the Islamic State; it would find itself fighting alone—rather than with some of the world’s most powerful partners—far more often. Likewise, without its longstanding treaty allies in Asia, the United States would be at an almost insurmountable disadvantage vis-à-vis revisionist powers in that region, namely China.
Both versions of realism also ignore the fact that America has been able to exercise its enormous power with remarkably little global resistance precisely because American leaders, by and large, have paid sufficient regard to the opinions of potential partners. Of course, every administration has sought to “put America first,” but the pursuit of American self-interest has proved most successful when it enjoys the acquiescence of other states. Likewise, the academic and Trump versions of realism too frequently forget that America draws power by supporting values with universal appeal. This is why every American president from Franklin Roosevelt to Barack Obama has recognized that a more democratic world is likely to be one that is both ideologically and geopolitically more congenial to the United States.
Most important, both the academic and Trump versions of realism ignore the fact that the classical post–World War II realists deliberately sought to overcome the dog-eat-dog world that modern variants take as a given. They did so by facilitating cooperation within the free world, suppressing the security competitions that had previously led to cataclysmic wars, creating the basis for a thriving international economy, and thereby making life a little less nasty, brutish, and short for Americans as well as for vast swaths of the world’s population.
If realism is about maximizing power, effectiveness, and security in a competitive global arena, then neither the academic nor the Trump versions of realism merits the name. And if realism is meant to reflect the world as it is, both of these versions are deeply deficient.
This is a tragedy. For if ever there were a moment for an informed realism, it would be now, as the strategic horizon darkens and a more competitive international environment reemerges. There is still time for Trump and his team to adapt, and realism can still make a constructive contribution to American policy. But first it must rediscover its roots—and absorb the lessons of the past 70 years.
The Seven Pillars of Realism
A reformed realism should be built upon seven bedrock insights, which President Trump would do well to embrace.
First, American leadership remains essential to restraining global disorder. Today’s realists channel the longstanding American hope that there would come a time when the United States could slough off the responsibilities it assumed after World War II and again become a country that relies on its advantageous geography to keep the world at arm’s length. Yet realism compels an awareness that America is exceptionally suited to the part it has played for nearly four generations. The combination of its power, geographic location, and values has rendered America uniquely capable of providing a degree of global order in a way that is more reassuring than threatening to most of the key actors in the international system. Moreover, given that today the most ambitious and energetic international actors besides the United States are not liberal democracies but aggressive authoritarian powers, an American withdrawal is unlikely to produce multipolar peace. Instead, it is likely to precipitate the upheaval that U.S. engagement and activism have long been meant to avert. As a corollary, realists must also recognize that the United States is unlikely to thrive amid such upheaval; it will probably find that the disorder spreads and ultimately implicates vital American interests, as was twice the case in the first half of the 20th century.
Second, true realism recognizes the interdependence of hard and soft power. In a competitive world, there is no substitute for American hard power, and particularly for military muscle. Without guns, there will not—over the long term—be butter. But military power, by itself, is an insufficient foundation for American strategy. A crude reliance on coercion will damage American prestige and credibility in the end; hard power works best when deployed in the service of ideas and goals that command widespread international approval. Similarly, military might is most effective when combined with the “softer” tools of development assistance, foreign aid, and knowledge of foreign societies and cultures. The Trump administration has sought to eviscerate these nonmilitary capabilities and bragged about its “hard-power budget”; it would do better to understand that a balance between hard and soft power is essential.
Third, values are an essential part of American realism. Of course, the United States must not undertake indiscriminate interventions in the name of democracy and human rights. But, fortunately, no serious policymaker—not Woodrow Wilson, not Jimmy Carter, not George W. Bush—has ever embraced such a doctrine. What most American leaders have traditionally recognized is that, on balance, U.S. interests will be served and U.S. power will be magnified in a world in which democracy and human rights are respected. Ronald Reagan, now revered for his achievements in improving America’s global position, understood this point and made the selective promotion of democracy—primarily through nonmilitary means—a key part of his foreign policy. While paying due heed to the requirements of prudence and the limits of American power, then, American realists should work to foster a climate in which those values can flourish.
Fourth, a reformed realism requires aligning relations with the major powers appropriately—especially today, as great-power tensions rise. That means appreciating the value of institutions that have bound the United States to some of the most powerful actors in the international system for decades and thereby given Washington leadership of the world’s dominant geopolitical coalition. It means not taking trustworthy allies for granted or picking fights with them gratuitously. It also means not treating actual adversaries, such as Vladimir Putin’s Russia, as if they were trustworthy partners (as Trump has often talked of doing) or as if their aggressive behavior were simply a defensive response to American provocations (as many academic realists have done). A realistic approach to American foreign policy begins by seeing great-power relations through clear eyes.
Fifth, limits are essential. Academic realists are wrong to suggest that values should be excised from U.S. policy; they are wrong to argue that the United States should pull back dramatically from the world. Yet they are right that good statecraft requires an understanding of limits—particularly for a country as powerful as the United States, and particularly at a time when the international environment is becoming more contested. The United States cannot right every wrong, fix every problem, or defend every global interest. America can and should, however, shoulder more of the burden than modern academic and Trumpian realists believe. The United States will be effective only if it chooses its battles carefully; it will need to preserve its power for dealing with the most pressing threat to its national interests and the international order—the resurgence of authoritarian challenges—even if that means taking an economy-of-force approach to other issues.
Sixth, realists must recognize that the United States has not created and sustained a global network of alliances, international institutions, and other embedded relationships out of a sense of charity. It has done so because those relationships provide forums through which the United States can exercise power at a bargain-basement price. Embedded relationships have allowed the United States to rally other nations to support American causes from the Korean War to the counter-ISIS campaign, and have reduced the transaction costs of collective action to meet common threats from international terrorism to piracy. They have provided institutional megaphones through which the United States can amplify its diplomatic voice and project its influence into key issues and regions around the globe. If these arrangements did not exist, the United States would find itself having to create them, or acting unilaterally at far greater cost. If realism is really about maximizing American power, true realists ought to be enthusiastic about relationships and institutions that serve that purpose. Realists should adopt the approach that every post–Cold War president has embraced: that the United States will act unilaterally in defense of its interests when it must, but multilaterally with partners whenever it can.
Finally, realism requires not throwing away what has worked in the past. One of the most astounding aspects of both contemporary academic realism and the Trumpian variant of that tradition is the cavalier attitude they display toward arrangements and partnerships that have helped produce a veritable golden age of international peace, stability, and liberalism since World War II, and that have made the United States the most influential and effective actor on the globe in the process. Of course, there have been serious and costly conflicts over the past decades, and U.S. policy has always been thoroughly imperfect. But the last 70 years have been remarkably good ones for U.S. interests and the global order—whether one compares them with the 70 years before the United States adopted its global leadership role, or with the violent disorder that would have emerged if America had followed the nostrums peddled today under the realist label. A doctrine that stresses the importance of prudence and discretion, and that was originally conservative in its preoccupation with stability and order, ought not to pursue radical changes in American statecraft or embrace a “come what may” approach to the world. Rather, such a doctrine ought to recognize that true achievements are enormously difficult to come by—and that the most realistic approach to American strategy would thus be to focus on keeping a good thing going.
The Greeks and the Founders feared men like the president, and with good reason
The most striking aspect of the rise and reign of Donald Trump has been his unabashed display of vulgarity and the ease (so far) with which he gets away with it. “Vulgar,” a term of condescension, is not often heard in democracies, where it most applies. It certainly applies to The Donald. The brazen insults he strewed along his path to the presidency were more than enough to deserve the plain name of vulgar. His success despite them suggests something even more upsetting than Trump himself: that his vulgar manliness was not a drag but an advantage.
The whole Trump phenomenon, both the man and the people he appeals to, reminds us of the vulgarity in democracy. Or more, of human vulgarity—since disrespect for the high and mighty can have universal appeal.
We now treat democracy as unquestionably the best, sometimes as the only, form of government. That was not the case in the classical political science of the Greeks. They held democracy in far lower esteem. For Plato, Aristotle, Thucydides, and Plutarch, democracy was typified by the figure of the demagogue, the democratic leader. This man was hasty, angry, impulsive, brash, and punitive; he sought the favor of those like himself, the demos, the hoi polloi (the many). He opposed men of quality, nobles, aristocrats, or gentlemen, and accused them of being enemies of the people, the majority for whom he spoke. The “people” was considered in the classical conception to be just a part of the whole, the majority part to be sure, but it was not a term that included everyone: The demos was quantity against quality, the many versus the few, in practice the poor versus the rich.
The American Founders, building on the philosophy of liberalism, expanded the conception of the people so that “popular government” could include everyone. James Madison made a famous distinction (one that used to be taught in high-school civics) between “democracy”—meaning pure democracy dominated by the demos and subject to “majority faction”—and “republic,” which was based on representation and structured with separation of powers and federalism. In our republican system, the demos would be required to govern through electing the few and would be kept diverse and scattered to help keep it moderate. The Founders saw to it that their popular republic would provide for government by people like themselves (no longer aristocrats or nobles, but still the few), and that the American people would have those Founders for heroes, rather than vicious characters like Robespierre or naive agitators like Tom Paine, who spoke and acted for the demos.
They wanted to spare the new nation from rule by the demagogue, a vulgar man who appealed to vulgar people on the level of a vulgar manliness. Vulgar is not always bad, though today we avoid using the term out of concern for the self-esteem of the vulgar. (“Plebeian” can occasionally be heard, but never politically.) Hillary Clinton could speak of “deplorables,” but to condemn them as “vulgar” might have excused them from the easy remedy for being deplorable, which was to vote for the Democrats.
Vulgar people can be honest and good-hearted, but they are susceptible to passion and impatience. Madison wanted a government that would “refine and enlarge” the opinions of the people, that is, of the vulgar. The moderate republic—now called by the name of what it replaced, democracy—would, with the consent of the vulgar, take power from the hands of the vulgar.
The result was a Constitution that makes use of the talents and virtues of the few, especially their ambition. With its complex structure, the Constitution supplies many avenues of ambition in politics, and outside politics, it suffuses the spirit of ambition everywhere in our society. Ambition is the desire to excel, to be outstanding above the normal satisfactions of ordinary people. In our democracy, the popular desire to “get ahead” is normal and imparts a modicum of ambition to all. All of us have learned to live with enlightened innovation rather than custom, and we do not yearn for the settled comfort of aristocracy. But still some want to get ahead by rising to the top or at least by having an “impact.” This sort of ambition is democratic in origin and hostile to the aristocrats. Yet those who possess it still strive to be above the rest of the democrats. Wanting to have an impact on the world puts you in the natural legion of the few.
Donald Trump is one of these few, ambitious if nothing else. In fact, there is little else to him. Though the son of a rich man, he has the outrageous coarseness of a vulgar man. He appeals to such men and to women who like manly men. These are his audience, and they are not put off by his departures from decorum. Far from it: They appreciate his lack of good taste, of good manners, of gentlemanliness, of protocol, and of tact. His boldness in going beyond the boundaries of decency they interpret as “telling it like it is”—as if honesty were found mainly in company with indecency, and plain talk were the same as blurting lies.
Though rich (but just how rich?), Trump is not a philanthropist who wishes to elevate our democracy with magnificent gifts, like Andrew Carnegie’s libraries. He does not support the fine arts or education, apart from founding Trump University, a failed monument to the profit motive. He dresses in a dark suit, wearing an aggressive tie, and does not try to hide his uncommon wealth with the presumptuous informality of the techie billionaires. He does this and gets away with it because he knows that he retains close contact with his supporters: He uses his wealth in vulgar display just as they would. He made his name in reality TV and lost some of his wealth in the operation of casinos. And speaking of his name, he has branded all his enterprises with the name of Trump, apparently believing that his every activity deserves the highest honor he can bestow.
Along with the tremendous value of Trump’s name, however, goes his insistence that everyone recognize it. His thin skin and amazing touchiness show in his ready reactions to slights, let alone criticism. His egoism makes his psychology an easy read—his bluster opposed and counteracted by his sensitivity. Unlike the truly manly male, who hardly notices and cares little for how he is received by others, Trump demands universal love as the reward for his just denunciations and wise observations. In this he is closer to the sensitive male than to the manly male, and differs from the former only by his optimism that women will like him for his candor.
His outrageous comments on the newscaster Megyn Kelly’s menstrual condition or on his 2016 rival Carly Fiorina’s supposed homeliness set a record for rash behavior by a public figure in need of votes, perhaps causing, for all he knew or cared, a permanent breach in the bounds of decorum. But it did not appear that he suffered much for it in the women’s vote. With such rashness one would expect an appropriate insensitivity, a devil-may-care approach to public esteem—but not at all: he wants it just the same. His vulgar manliness wants to mask his obvious yearning for indiscriminate love, and of course it doesn’t succeed. The fawning demagogue in him prevails over the impression he wants to convey of brash independence.
Yet he won the election, as he keeps reminding us. He’s a winner, and the vulgar love a winner. This fact invites us to infer that he might have a planned policy of swagger as opposed to an uncontrollable impulse. Ordinary people, decent though they may be, are impressed by extraordinary daring. They stand amazed at sensational violations of decency. So, if we are to accept the hypothesis of his Machiavellian shrewdness, we could suppose that Trump has deliberately chosen a strategy of speaking beyond normal bounds, one designed to impress ordinary folk and at the same time to dismay the elite who kept expecting that he would pay, as they would, for having gone too far. This fits with the classical demagogue, who roused the demos against the nobles or gentlemen, and Trump has used the same method against the leaders of both parties. Like all trendy folk, Trump has called these leaders the “Establishment,” taking them as a collectivity and using the name given them by the New Left in the late Sixties.
Edmund Burke in the 18th century spoke of the “establishments,” in the plural, of the British constitution, such as the Church, the lawyers, the universities, the nobility—all unelected authorities supplying stability and guidance to a free society. By contrast, the single Establishment of the New Left, picked up by Trump, is an accusation of malignant stagnation in a free society. The term “elite” has a mixed history, good and bad, of describing the democratic replacement for the aristocratic few. In America now, the “elite” and the “Establishment” refer principally to elected officials, present or past, as well as to institutions, like the media, that have power because they have popular favor. It is strange to denounce them to the people who have chosen them, and particularly as if they were a single conspiracy when our parties seem to be so deeply at odds and are said to be “polarized.”
As Trump had it during his election campaign, our parties are together against us, yet so divided against each other as to be unable to act. He seems quite uninterested in the liberal/conservative debate, or indeed in any debate. But he found one point to attack that no other politician had seen: political correctness. Here was a well-known mind-set with practices and policies carried out and defended by Democrats, often criticized, but not by politicians. No Republican had had the cleverness to see and the boldness to exploit the weakness in political correctness. This was the name Trump gave to the general political strategy of Democrats to designate vulnerable groups, “minorities and women,” for special favor in jobs, honors, and benefits. This strategy of inclusiveness was designed to help win elections by the simple addition of vulnerable groups taught to vote by their identity, following the example of black voters.
Trump noticed that the policy of inclusiveness, in cases such as affirmative action, was actually including some by excluding others not officially identified as vulnerable—particularly white voters. Without saying so—for in this Trump was cautious and prudent—he began to mobilize a white community to match the long-existing “black community,” thus turning the strategy of identity against itself. It was now Trump voters who were encouraged to think themselves marginalized. One could call this racism only if the “inclusive” policy of the Democrats were also termed racism. Surely, however, Trump was not calling on the finer feelings of the electorate. In a democratic age without nobles to serve as targets, the demagogue has to operate against some of the people in order to claim to act on behalf of those forgotten. Arlie Hochschild, a Berkeley sociologist, has made a study of forgotten whites in Bayou Louisiana that nicely describes Trump voters before they voted for him. They were resentful, like departing airline passengers, of having to stand in line and watch other preferred groups waved ahead of them.
The Establishment, according to Trump, had made us losers; he would make America great again. Democrats had forgotten America in their preoccupation with its separate identities, and their desire to come to the aid of the vulnerable at home induced them to prefer the vulnerable abroad. America was too successful, too much a winner, the Establishment (or at least its Democratic branch) believed. America’s greatness was due to its exploitation of weaker countries, not to its virtue; its greatness was lacking in goodness. Best to apologize, and so lead the world after all in apologizing for human exploitation of nature. Nature needs protection from us (humans), and we must seek means of “sustainability” to enable it to return to functioning on its own for our good.
All this—the politics and philosophy of Barack Obama and his liberals—was fresh meat for Trump. But the hectoring manner in which it was conveyed—the schoolmarm political correctness that admonishes rather than argues—was still more inviting. Whereas the liberal policy of affirmative action was designed to help blacks, the liberal affectation of political correctness came from feminism. The feminism we know, like the New Left dating from the late Sixties, made its way through “raising consciousness,” by correcting the bias of language favoring men so as to put across a gender neutrality that favored neither sex. Of course in practice, and when combined with affirmative action, achieving gender neutrality required a massive societal feminization that was the reverse of neutral.
Political correctness, originally from feminism, now applies to blacks as well, particularly to the way whites are required to address blacks. Blacks are allowed the privileges of vulgar manliness that are denied to the rest of the population. If only black men would preach manliness, refined or not, to the rest of the population! But they are content with their own freedom and, with manly contempt of others, do not seek to justify it more generally.
Thus it was left to Donald Trump alone to attack political correctness and come to the defense of vulgar manliness. He does this not with argument but with outrageous behavior meant to be offensive. As a demagogue, he seeks direct contact with the people. He wants to bypass the media, the parties, and the Constitution that try to control and limit his contact and claim the right, whether formal or informal, to stand between him and the people. As methods of direct contact, Trump used old-fashioned rallies in his campaign rather than informal meetings; he sends tweets to all indiscriminately rather than addressing people through the media; and he features shocking talk and behavior rather than conventional politeness and respect. His desire is to transgress normal boundaries, especially those of political correctness, and thus to capture attention.
His boastfulness seems stupid, and it is, but it makes people think that because he is bold, he is more honest and more truthful than those who hesitate and formulate. His offhand lies are not meant to be accurate but rather to display the lack of restraint that seems to be more truthful than the uptight rectitude of the fact-checker. His vulgar insults betray the absence of wit and the rejection of humor and irony in his flat soul; he is always serious and yet always exaggerates.
In sum, Donald Trump reflects and connects to the vulgar manliness in the American (or any) people. He is demotic rather than democratic, intuitive himself in finding what is instinctive in us. The American Founders made a Constitution for a popular republic that would resist the ills of all previous republics, which had exposed government to the vagaries and impulses of the vulgar. Instead, our republic would “refine and enlarge” the popular will through representative institutions that contain and employ the ambition of the few, and that supply the whole with the “cool and deliberate sense of the community.”
The Founders made a constitutional democracy with, among other things, an electoral college that was meant to keep people like him from ever winning office, and of which Trump took full advantage. Well, every human institution for good can be abused for ill. And not only Trump supporters but all of us must hope that even a demagogue can bring good. Perhaps what is demotic can refresh, rather than degrade, what is democratic. It is one good thing at least to be reminded of the difference between the vulgar and the refined.
Or have I not just said that this difference too is very much in question?
Review of 'The Strange Death of Europe' by Douglas Murray
As the European Union set about drafting a constitution in the early 2000s, Pope John Paul II urged its framers to acknowledge the Continent’s Christian roots. Since Christianity had shaped the “humanism of which Europe feels legitimately proud,” the ailing pontiff argued, the constitution should make some reference to Europe’s Christian patrimony. His appeal was met with accusations of bigotry. The pope had inflamed the post-9/11 atmosphere of “Islamophobia,” one “anti-racism” outfit said. Another group asked: What about the contributions made by the “tolerant Islam of al-Andalus”? Former French President Valéry Giscard d’Estaing spoke for the political class: “Europeans live in a purely secular political system, where religion does not play an important role.”
Douglas Murray recounts this episode early on in his fiery, lucid, and essential polemic. It epitomized the folly of European elites who would sooner discard the Continent’s civilizational heritage than show partiality for their own culture over others’. To Murray, this tendency is quite literally suicidal—hence the “death” in his title.
The book deals mainly with Western Europe’s disastrous experiment in admitting huge numbers of Muslim immigrants without bothering to assimilate them. These immigrants now inhabit parallel communities on the outskirts of most major cities. They reject mainstream values and not infrequently go boom. Murray’s account ranges from the postwar guest-worker programs to the 2015 crisis that brought more than a million people from the Middle East and Africa.
This is dark-night-of-the-soul stuff. The author, a director at London’s Henry Jackson Society (where I was briefly a nonresident fellow), has for more than a decade been among Europe’s more pessimistic voices on immigration. My classically liberal instincts primed me to oppose him at every turn. Time and again, I found myself conceding that, indeed, he has a point. This is in large part because I have been living in and reporting on Europe for nearly four years. Events of the period have vindicated Murray’s bleak vision and confounded his critics.
Murray is right: Time isn’t mellowing Europe’s Muslims. “The presumption of those who believed in integration is that in time everybody who arrives will become like Europeans,” Murray writes. Yet it is the young who are usually the most fanatical. Second- and third-generation immigrants make up the bulk of the estimated 5,000 Muslims who have gone off to fight with the Islamic State.
The first large wave of Muslim immigrants to Britain arrived soon after World War II. Seven decades later, an opinion survey conducted in 2016 by the polling firm ICM found that half of Muslim Britons would proscribe homosexuality, a third would legalize polygamy, and a fifth would replace civil law with Shariah. A different survey, also conducted in 2016, found that 83 percent of young French Muslims describe their faith as “important or very important” to them, compared with 22 percent of young Catholics. I could go on with such polling data; Murray does for many pages.
He is also correct that all the various “integration” models have failed. Whether it is consensus-based social democracy in the Nordic countries, multiculturalism in Britain, or republican secularism in France, the same patterns of disintegration and social incohesion persist nearly everywhere. Different European governments have treated this or that security measure, economic policy, or urban-planning scheme as the integration panacea, to no avail.
Murray argues that these successive failures stem from a basic lack of political will. To prove the point he cites, among other things, female genital mutilation in the UK. Laws against the practice have been on the books for three decades. Even so, an estimated 130,000 British women have had their genitals cut, and not a single case has been successfully prosecuted.
Pusillanimity and retreat have been the norm among governments and cultural elites on everything from FGM to free speech to counterterrorism. The result has been that the “people who are most criticized both from within Muslim communities in Europe and among the wider population are in fact the people who fell hardest for the integration promises of liberal Europe.” It was Ayaan Hirsi Ali, the fierce Somali-born proponent of Enlightenment values and women’s equality, who had to escape Holland under a death threat, not her persecutors.
And Murray is right when he says that Europeans hadn’t staged a real debate on immigration until very recently. The author might be too quick to dismiss the salutary fiscal and social effects of economic growth and immigration’s role in promoting it. At various points he even suggests that Europeans should forgo economic as well as population growth if that is the price of admitting fewer migrants. He praises hermetically sealed Japan, but he elides the Japanese model’s serious economic, demographic, and even psychological disadvantages.
All this is secondary to Murray’s unanswerable argument that European elites had for years cordoned off immigration from normal political debate. As he writes, “whereas the benefits of mass immigration undoubtedly exist and everybody is made very aware of them, the disadvantages of importing huge numbers of people from another culture take a great deal of time to admit to.” In some cases, most notably the child-sex grooming conspiracy in Rotherham, England, the institutions have tried to actively suppress the truth. Writes Murray: “Instead of carrying out their jobs without fear or favor, police, prosecutors, and journalists behaved as though their job was to mediate between the public and the facts.”
Is it possible to imagine an alternative history, one in which Europe would absorb this many migrants from Islamic lands but suffer fewer and less calamitous harms? Murray’s surprising answer is yes. Had Europe retained its existential confidence over the course of the previous two centuries, things might have turned out differently. As it was, however, mass migration saw a “strong religious culture”—Islam—“placed into a weak and relativistic culture.”
In the book’s best chapters, Murray departs from the policy debate to attend to the sources of Europe’s existential insecurity. Germans bear much of the blame, beginning with 19th-century Bible scholarship that applied the methods of history, philology, and literary criticism to sacred scripture. That pulled the rug of theological certainty from under Europe’s feet, in Murray’s account, and then Darwin’s discoveries heightened the disorientation. Europeans next tried to substitute totalistic ideology for religion, with catastrophic results.
Finally, after World War II, they settled on human rights as the central meaning of Europe. But since Europeans could no longer believe, these rights were cut off from one of their main wellsprings: the Judeo-Christian tradition. The Catholic Church—having circumscribed the power of earthly kings across centuries and thereby “injected an anti-totalitarian vaccine into the European bloodstream,” as George Weigel has written in these pages—was scorned or ignored. Europeans forgot how they came to be free.
Somehow Europe must recover its vitality. But how? Murray is torn. On one hand, he sees how a rights-based civilization needs a theological frame, lest it succumb before a virile and energetic civilization like Islam. On the other, he thinks the leap of faith is impossible today. Murray can’t blame François, the professor-protagonist of Michel Houellebecq’s 2015 novel Submission. Faced with an Islamic takeover of France, François heads to a monastery, desperate to shake his spiritual torpor. But kneeling before the Virgin doesn’t do anything for him. Islam, with its simplicity and practicality (not least the offer of up to four nubile wives), is much harder to resist.
Murray wonders whether the answer lies in art. Maybe in beauty Europeans can recover the fulfillment and sense of mystery that their ancestors once found in liturgy—only without the cosmic truth claims. He laments that contemporary European art has “given up that desire to connect us to something like the spirit of religion,” though it is possible that the current period of crisis will engender a revival. In the meanwhile, Murray has suggested, even nonbelievers should go to church as a way to mark and show gratitude for Christianity’s foundational role in Europe.
He is onto something. Figure out the identity bit in the book’s subtitle—“Immigration, Identity, Islam”—and the other two will prove much easier to sort out.
A maestro’s morality
How is it possible that a man who made his conducting debut when Grover Cleveland was president should still be sufficiently well known and revered that most of his recordings remain in print to this day? Toscanini: Musician of Conscience, Harvey Sachs’s new biography, goes a long way toward defining what made Toscanini unique.1 A conductor himself, Sachs is also the author of, among other excellent books, a previous biography of Toscanini that was published in 1978. Since then, several large caches of important primary-source material, most notably some 1,500 of the conductor’s letters, have become available to researchers. Sachs’s new biography draws on this new material and other fresh research. It is vastly longer and more detailed than its predecessor and supersedes it in every way.
Despite its length and thoroughness, Toscanini: Musician of Conscience is not a pedant’s vade mecum. Clearly and attractively written, it ranks alongside Richard Osborne’s 1998 biography of Herbert von Karajan as one of the most readable biographies of a conductor ever published. For Toscanini, as Sachs shows us, had a volatile, immensely strong-willed character, one that in time caused him to clash not only with his colleagues but with the dangerous likes of Adolf Hitler and Benito Mussolini. The same fierce integrity that energized his conducting also led him to put his life at risk at a time when many of his fellow musicians were disinclined to go even slightly out of their way to push back against the Fascist tyrants of the ’30s.
Toscanini: Musician of Conscience does not devote much space to close analysis of Toscanini’s interpretative choices and technical methods. For the most part, Sachs shows us Toscanini’s art through the eyes of others, and the near-unanimity of the admiration of his contemporaries, whose praise is quoted in extenso, is striking, even startling. Richard Strauss, as distinguished a conductor as he was a composer, spoke for virtually everyone in the world of music when he said, “When you see that man conduct, you feel that there is only one thing for you to do: take your baton, break it in pieces, and never conduct again.”
Fortunately for posterity, Toscanini’s unflashy yet wondrously supple baton technique can be seen up close in the 10 concerts he gave with the NBC Symphony between 1948 and 1952 that were telecast live (most of which can now be viewed in part or whole on YouTube). But while his manual gestures, whose effect was heightened by the irresistible force of his piercing gaze, were by all accounts unfailingly communicative, Toscanini’s ability to draw unforgettable performances out of the orchestras that he led had at least as much to do with his natural musical gifts. These included an infallible memory—he always conducted without a score—and an eerily exact ear for wrong notes. Such attributes would have impressed orchestra players, a hard-nosed lot, even if they had not been deployed in the service of a personality so galvanizing that most musicians found it all but impossible not to do Toscanini’s musical bidding.
What he wanted was for the most part wholly straightforward. Toscanini believed that it was his job—his duty, if you will—to perform the classics with note-perfect precision, singing tone, unflagging intensity, and an overall feeling of architectural unity that became his trademark. When an orchestra failed to give of its best, he flew into screaming rages whose verbal violence would scarcely be believed had it not been preserved on secretly made tapes. In one of his most spectacular tantrums, which has been posted on YouTube, he can be heard telling the bass players of the NBC Symphony that “you have no ears, no eyes, nothing at all…you have ears in—in your feet!”
Toscanini was able to get away with such behavior because his own gifts were so extraordinary that the vast majority of his players worshipped him. In the words of the English bassoonist Archie Camden, who played under Toscanini in the BBC Symphony from 1935 to 1939, he was “the High Priest of Music,” a man “almost of another world” whose artistic integrity was beyond question. And while his personal integrity was not nearly so unblemished—he was, as Sachs reports with unsalacious candor, a compulsive philanderer whose love letters to his mistresses are explicit to the point of pornography—there is nonetheless a parallel between the passionate conscientiousness of his music-making and his refusal to compromise with Hitler and Mussolini, both of whom were sufficiently knowledgeable about music to understand what a coup it would have been to co-opt the world’s greatest conductor.
Among the most valuable parts of Toscanini: Musician of Conscience are the sections in which Sachs describes Toscanini’s fractious relations with the German and Italian governments. Like many of his fellow countrymen, he had been initially impressed by Mussolini, so much so that he ran for the Italian parliament as a Fascist candidate in 1919. But he soon saw through Mussolini’s modernizing rodomontade to the tyrant within, and by the late ’20s he was known throughout Italy and the world as an unswerving opponent of the Fascist regime. In 1931 he was beaten by a mob of blackshirted thugs, after which he stopped conducting in Italy, explaining that he would not perform there so long as the Fascists were in power. Mussolini thereupon started tapping his telephone line, and seven years later the conductor’s passport was confiscated when he described the Italian government’s treatment of Jews as “medieval stuff” in a phone call. Had public and private pressure not been brought to bear, he might well have been jailed or murdered. Instead he was allowed to emigrate to the U.S. He did not return to Italy until after World War II.
If anything, Toscanini’s hatred for the Nazis was even more potent, above all because he was disgusted by their anti-Semitism. A philo-Semite who referred to the Jews as “this marvelous people persecuted by the modern Nero,” he wrote a letter to one of his mistresses in the immediate wake of the Anschluss that makes for arresting reading eight decades later:
My heart is torn in bits and pieces. When you think about this tragic destruction of the Jewish population of Austria, it makes your blood turn cold. Think of what a prominent part they’d played in Vienna’s life for two centuries! . . . Today, with all the great progress of our civilization, none of the so-called liberal nations is making a move. England, France, and the United States are silent!
Toscanini felt so strongly about the rising tide of anti-Semitism that he agreed in 1936 to conduct the inaugural concerts of the Palestine Symphony (later the Israel Philharmonic) as a gesture of solidarity with the Jews. In an even more consequential gesture, he had already terminated his relationship with the Bayreuth Festival, where he had conducted in 1930 and 1931, the first non-German conductor to do so. While the founder of the festival, Richard Wagner, ranked alongside Beethoven, Brahms, and Verdi at the top of Toscanini’s pantheon of musical gods, he was well aware that many of the members of the Wagner family who ran Bayreuth were close friends of Adolf Hitler, and he decided to stop conducting in Germany—Bayreuth included—when the Nazis came to power. Hitler implored him to return to the festival in a personal letter that praised him as “the great representative of art and of a people friendly to Germany.” Once again, though, there was to be no compromise: Toscanini never performed in Germany again, nor would he forgive those musicians, Wilhelm Furtwängler among them, who continued to do so.
Implicit throughout Sachs’s book is the idea that Toscanini the man and Toscanini the musician were, as his subtitle suggests, inseparable—that, in other words, his conscience drove him to oppose totalitarianism in much the same way that it drove him to pour his heart and soul into his work. He was in every sense of the word a driven man, one capable of writing in an especially revealing letter that “when I’m working I don’t have time to feel joy; on the contrary, I suffer without interruption, and I feel that I’m going through all the pain and suffering of a woman giving birth.”
Toscanini was not striking a theatrical pose when he wrote these melodramatic-sounding words. The rare moments of ecstasy that he experienced on the podium were more than offset by his obsessive struggle to make the mere mortals who sang and played for him realize, as closely as possible, his vision of artistic perfection. That was why he berated them, why he ended his rehearsals drenched with sweat, why he flogged himself as unsparingly as he flogged his musicians. It was, he believed, what he had been born to do, and he was willing to move heaven and earth in order to do it.
To read of such terrifying dedication is awe-inspiring—yet it is also strangely demoralizing. To be sure, there are still artists who drive themselves as relentlessly as did Toscanini, and who pull great art out of themselves with the same iron determination. But his quasi-religious consecration to music inevitably feels alien to the light-minded spirit of our own age, dominated as it is by pop culture. It is hard to believe that NBC, the network of Jimmy Fallon and Superstore, maintained for 17 years a full-time symphony orchestra that had been organized in 1937 for the specific purpose of allowing Toscanini to give concerts under conditions that he found satisfactory. A poll taken by Fortune that year found that 40 percent of Americans could identify Toscanini as a conductor. By 1954, the year in which he gave up conducting the NBC Symphony (which was then disbanded), the number was surely much higher.
Will there ever again be a time when high art in general and classical music in particular mean as much to the American people as they did in Toscanini’s heyday? Very likely not. But at least there will be Harvey Sachs’s fine biography—and, far more important, Toscanini’s matchlessly vivid recordings—to remind us of what we once were, what we have lost, and what Arturo Toscanini himself aspired to be and to do.
1 Liveright, 923 pages. Many of Toscanini’s best commercial American recordings, made with the NBC Symphony, the New York Philharmonic, and the Philadelphia Orchestra, were reissued earlier this year in a budget-priced box set called Arturo Toscanini: The Essential Recordings (RCA Red Seal, 20 CD’s) whose contents were chosen by Sachs and Christopher Dyment, another noted Toscanini scholar. Most of the recordings that he made in the ’30s with the BBC Symphony are on Arturo Toscanini: The HMV Recordings (Warner Classics, six CD’s).
A blockbuster movie gets the spirit right and the details wrong
But enough about Brexit; what about Christopher Nolan’s new movie about Dunkirk?
Dunkirk is undoubtedly a blockbuster with a huge cast—Nolan has splendidly used thousands of extras rather than computer cartooning to depict the vast numbers of Allied troops trapped on the beaches—and a superb score by Hans Zimmer. Kenneth Branagh is a stiff-upper-lipped rear admiral whose rather clunking lines are all too obviously designed to tell the audience what’s going on; One Direction pop star Harry Styles is a British Tommy, and Tom Hardy is a Spitfire pilot who somehow shoots down two Heinkels while gliding, having run out of fuel about halfway through the movie. Mark Rylance, meanwhile, plays the brave skipper of a small boat taking troops off the beaches in the manner of Walter Pidgeon in Mrs. Miniver.
Yet for all the clichéd characterization, almost total lack of dialogue, complete lack of historical context (not even a cameo role for Winston Churchill), a ludicrous subplot in which a company of British soldiers stuck on a sinking boat do not use their Bren guns to defend themselves, problems with continuity (sunny days turn immediately into misty ones as the movie jumps confusingly through time), and Germans breaking into central Dunkirk whereas in fact they were kept outside the perimeter throughout the evacuation, Dunkirk somehow works well.
It works for the same reason that the 1958 film of the same name directed by Leslie Norman and starring Richard Attenborough and John Mills did. The story of the nine-day evacuation of the British Expeditionary Force from Dunkirk in late May and early June 1940 is a tale of such extraordinary heroism, luck, and intimate proximity to utter disaster that it would carry any film, even a bad one, and Nolan’s is emphatically not a bad one. Although the dogfights take place at ridiculously low altitudes, they are thrilling, and the fact that one doesn’t see a single German soldier until the closing scene, and then only two of them in silhouette, somehow works, too. See the film on the biggest screen you can, which will emphasize the enormity of the challenge faced by the Allies in getting over 336,000 troops off the beaches for the loss of only 40,000 killed, wounded and captured.
There is a scene when the armada of small boats arrives at the beaches that will bring a lump to the throat of any patriotic Briton; similarly, three swooping Spitfires are given a wonderfully evocative moment. The microcosm of the evacuation that Nolan concentrates on works well, despite another silly subplot in which a British officer with PTSD (played by Cillian Murphy) kills a young boy on Rylance’s small boat. That all the British infantry privates, not just Harry Styles, look like they sing in boy-bands doesn’t affect the power of seeing them crouch en masse under German attack in their greatcoats and helmets on the foam-flecked beaches.
On the tenth of May in 1940, Adolf Hitler invaded France, Belgium, and Holland, unleashing Blitzkrieg on the British and French armies—a new all-arms tactic of warfare that left his enemies reeling. He also sent tanks through the forests of the Ardennes, which were considered impassable, and by May 20, some panzer units had already reached the English Channel. With the British and French in full retreat, on May 24 the Führer halted his tanks’ headlong advance for various sound military reasons—he wanted to give his men some rest, did not want to over-extend the German army, needed to protect against counter-attack, and wanted his infantry to catch up. From May 26 to June 3, the Allies used this pause to throw up a perimeter around the French port of Dunkirk, from whose pleasure beaches more than a quarter of a million British and more than 80,000 French troops embarked to cross the Channel to safety in Britain.
Protected by the Royal Air Force, which lost 144 pilots in the skies over Dunkirk, and by the French air force (which plays no part in this movie), and transported by the Royal Navy (which doesn’t seem to be able to use its guns against the Luftwaffe in this film, but which luckily did in real life), British and French troops made it to Dover, albeit without any heavy equipment, which they had to destroy on the beach. An allusion is made to that when Tom Hardy destroys the Spitfire he has (I must say quite unbelievably) landed on a beach in order to prevent its falling into German hands.
In response to a call from the British government, more than 700 private vessels were requisitioned, including yachts, paddle steamers, ferries, fishing trawlers, packet steamers and lifeboats. Even today when boating down the Thames it is possible to see small pleasure vessels sometimes only fifteen feet long with the plaque “Dunkirk 1940” proudly displayed on the cabins. That 226 were sunk by the Luftwaffe, along with six destroyers of the 220 warships that took part, shows what it meant to rise to what was afterwards called “the Dunkirk Spirit.” It was a spirit of defiance of tyranny that one glimpses regularly in this film, even if Nolan does have to pay obeisance to the modern demands for stories of cowardice alongside heroism, and the supposedly redemptive cowardice-into-heroism stories that Hollywood did not find necessary when it made Mrs. Miniver in 1942.
Nolan’s Dunkirk implies that it was the small boats that brought back the majority of the troops, whereas in fact the 39 destroyers and one cruiser involved in Operation Dynamo brought back the huge majority while the little ships did the crucial job of ferrying troops from the beaches to the destroyers, six of which were sunk, though none by U-boats (which the film wrongly suggests were present).
Where Nolan’s film commits a libel on the British armed services is in its tin ear for the Anglo-French relations of the time. In the movie, a British beach-master prevents French infantrymen from boarding a naval vessel, saying “This is a British ship. You get your own ships.” The movie later alleges that no Frenchmen were allowed to be evacuated until all the Britons were safely back home. This was not what happened. The French were brought across the Channel in Royal Navy vessels and small boats when their units arrived on the beaches.
There was no discrimination whatsoever, and to suggest there was injects false nationalist tension into what was in truth a model of good inter-Allied cooperation. Only much later, when the Nazi-installed Vichy government in France needed to create an Anglophobic myth of betrayal at Dunkirk, did such lies emerge. It is a shame that Nolan is now propagating them—especially since this might be the only contact that millions of people will ever have with the Dunkirk story for years, perhaps even a generation. At a time when schools simply do not teach the histories of anything so patriotism-inducing as Dunkirk, it was incumbent on Nolan to get this right.
In a touching scene at the end, one of the Tommies is depicted reading from a newspaper Churchill’s famous “We shall fight on the beaches” speech of June 4, 1940, with its admonition: “We must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations.” Churchill made no attempt to minimize the scale of what he called a “colossal military disaster,” but he also spoke, rightly, of the fact that it had been a “miracle of deliverance.” That is all that matters in this story.
So despite my annoyance at how many little details are off here—for example, Tom Hardy firing 75 seconds’ worth of ammunition when he would really have only had 14.7, or choppy weather when the Channel was really like a mill pond—I must confess that such problems are only for military history pedants like me. What Nolan has gotten right is the superb spirit of the British people in overcoming hatred, resentment, and fury with calmness, courage, and good humor.
Which brings us back to Brexit.
The Swoon has several symptoms: extreme praise, a disinclination to absorb contrary facts, a weakness for adulation, and a willingness to project one’s own beliefs and dispositions onto an ill-suited target, regardless of evidence. The first thing to know about the Swoon, though, is that it is well rooted in reality. John McCain is perhaps the most interesting non-presidential figure in Washington politics since Daniel Patrick Moynihan. Any piece of journalism that aims to assess him objectively should be required to include, as a stipulation, a passage like this one from Robert Timberg’s masterful book about Vietnam, The Nightingale’s Song.
“Do you want to go home?”
“Now, McCain, it will be very bad for you.”
The [chief jailer] gleefully led the charge as the guards, at [another guard’s] command, drove fists and knees and boots into McCain. Amid laughter and muttered oaths, he was slammed from one guard to another, bounced from wall to wall, knocked down, kicked, dragged to his feet, knocked back down, punched again and again in the face. When the beating was over, he lay on the floor, bloody, arms and legs throbbing, ribs cracked, several teeth broken off at the gum line.
“Are you ready to confess your crimes?” asked [the guard].
The ropes came next . . .
This scene is, of course, from McCain’s five years in a North Vietnamese prisoner of war camp. It helps to know that before this gruesome episode began—there were many more to come—McCain’s arms had been broken and gone untreated. It helps, too, to know that the point of the torture was to force McCain to leave the prison and return home to his father, the highest ranking naval officer in the Pacific. In other words, they hung him by his broken arms because he refused to let them let him go.
Every reporter who’s done his homework knows this about McCain, and most civilians who meet him know it, too. This is the predicate for the Swoon. It began to afflict liberal journalists of the Boomer generation during the warm-up to his first run for president, against Governor George W. Bush, in the late 1990s. The reporter would be brought onto McCain’s campaign bus and receive a mock-gruff welcome from the candidate. No nervous handlers would be in evidence, like those who invariably attend other candidates during interviews.
And then it happens: In casual, preliminary conversation, McCain makes an indiscreet comment about a Senate colleague. “Is that off the record?” the reporter asks, and McCain waves his hand: “It’s the truth, isn’t it?” In a minute or two, the candidate, a former fighter pilot, drops the F bomb. Then, on another subject, he makes an offhanded reference to being “in prison.” The reporter, who went through four deferments in the late 1960s smoking weed with half-naked co-eds at an Ivy League school, feels the hot, familiar surge of guilt. As the interview winds down, the reporter sees an unexpected and semi-obscure literary work—the collected short stories of William Maxwell, let’s say—that McCain keeps handy for casual reading.
By the time he’s shown off the bus—after McCain has complimented a forgotten column the reporter wrote two years ago—the man is a goner. If I saw it once in my years writing about McCain, I saw it a dozen times. (I saw it happen to me!) Soon the magazine feature appears, with a headline like “The Warrior,” or “A Question of Honor,” or even “John McCain Walks on Water.” Those are all real headlines from his first presidential campaign. This really got printed, too: “It is a perilous thing, this act of faith in a faithless time—perilous for McCain and perilous for the people who have come to him, who must realize the constant risk that, sometimes, God turns out to be just a thunderstorm, and the gold just stones agleam in the sun.”
Judging from inquiries I’ve made over the years, the only person who knows what that sentence means is the writer of it, an employee of Esquire magazine named Charles Pierce. No liberal journalist got the Swoon worse than Pierce, and no one was left with a bitterer hangover when it emerged that McCain was, in nearly every respect, a conventionally conservative, generally loyal Republican—with complications, of course. The early Swooners had mistaken those complications (support for campaign-finance reform, for example, and his willingness to strike back at evangelical bullies like Jerry Falwell Sr.) as the essence of McCain. When events proved this not to be so, culminating in his dreary turn as the 2008 Republican presidential nominee—when he committed the ultimate crime in liberal eyes, midwifing the national career of Sarah Palin—it was only Republicans who were left to swoon.
So matters rested until this July, when McCain released the news that he suffers from a particularly aggressive form of brain cancer. Many appropriate encomiums rolled in, some from the original Swooners. But another complication arose. Desperate to pass a “motion to proceed” so that a vote could be taken on a lame and toothless “repeal” of Obamacare, Senate Republicans could muster only a tie vote. McCain announced he would rise from his hospital bed and fly to Washington to break the tie and vote for the motion to proceed.
Even conservatives who had long remained resistant to the Swoon succumbed. Even Donald Trump tweet-hailed McCain as a returning hero. His old fans from the left, those with long memories, wrote, or tweeted, more in sorrow than in anger. Over at Esquire, poor Charles Pierce reaffirmed that God had turned out to be just a thunderstorm again. “The ugliest thing to witness on a very ugly day in the United States Senate,” he wrote, “was what John McCain did to what was left of his legacy as a national figure.” A longtime Swooner in the Atlantic: “Senator McCain gave us a clearer idea of who he is and what he stands for.” Answers: a hypocrite, and nothing!
The old fans weren’t mollified by a speech McCain made after his vote, in which he sounded notes they had once thrilled to—he praised bipartisanship and cooperation across the aisle. Several critics in the press dismissed the speech with the same accusation that his conservative enemies had always leveled at McCain when he committed something moderate. He was pandering…to them! “McCain so dearly wants the press to think better of him for [this] speech,” wrote the ex-fan in the Atlantic. But the former Swooners were having none of it. Swoon me once, shame on me. Swoon me twice . . .
Then the next day in the wee hours, McCain voted against the actual bill to repeal Obamacare. Democrats were elated, and Republicans were forced to halt in mid-Swoon. His reasons for voting as he did were sound enough, but reasons seldom enter in when people are in thrall to their image of McCain. The people who had once loved him so, and who had suffered so cruelly in disappointment, were once more in love. Let’s let Pierce have the last word: “The John McCain the country had been waiting for finally showed up early Friday morning.” He had done what they wanted him to do; why he had done it was immaterial.
The condescension is breathtaking. Sometimes I think McCain is the most misunderstood man in Washington. True enough, he’s hard to pin down. He’s a screen onto which the city’s ideologues and party hacks project their own hopes and forebodings. Now, as he wages another battle in a long and eventful life, what he deserves from us is something simpler—not a swoon but a salute, offered humbly, with much reverence, affection, and gratitude.