William Powell was 44 when My Man Godfrey was filmed. He was 16 years older than his co-star, Carole Lombard, and 11 years older than Clark Gable had been two years earlier when Gable had starred in the seminal screwball comedy It Happened One Night. A trained stage actor who had begun making movies in 1922, he did not come into his own until he signed with MGM and appeared opposite Myrna Loy in The Thin Man in 1934. From then on, though, he was a star in every sense, so much so that he was billed above Lombard in My Man Godfrey.
Powell’s age, far from being an obstacle to his stardom, was central to it. Even though he specialized in light comedy, his sardonic screen persona had an underlying weight that allowed him to bring off dramatic roles no less convincingly. Without it, he couldn’t have essayed the tricky title role of My Man Godfrey. His contempt for the irresponsible frivolity of the rich eccentrics among whom he finds himself is made explicit in the screenplay (“I was curious to see how a bunch of empty-headed nitwits conducted themselves”). Yet he also comes across as a man to whom a beautiful young woman like Lombard might plausibly be attracted, and it is in no way dramatically unsatisfactory when they fall in love.
While Powell was older than most of the other male screen stars of his day, he was in no way uncharacteristic of them. With few exceptions, their screen personas were unequivocally adult. To compare such studio-era screen idols as Powell, Gable, Humphrey Bogart, Cary Grant, and Spencer Tracy to (say) Tom Cruise, Matt Damon, or Leonardo DiCaprio is to see at once how completely Hollywood has reoriented itself toward youth and how completely the male actors themselves strive to remain “relatable” to teenagers. (Cruise, for example, is now 54, yet he remains almost entirely boyish. He has a 21-year-old son but could never convincingly play the middle-aged father of a man in his majority.) Even such “action stars” as Errol Flynn and John Wayne carried themselves with a natural gravity that would be out of place in most of today’s movies.
It is this quality that explains one of the most striking phenomena of Hollywood’s golden age, the existence of mature male “character stars” who, while not quite popular enough to carry a film on their own, were nonetheless known to moviegoers of all ages. A few, like Powell and Charles Boyer, were treated by the system as full-fledged leading men and billed accordingly, while others, like Claude Rains, stuck to supporting roles. But all reflected the same expectation of maturity that was a defining element of prewar filmmaking, and their work gave a unique richness of texture to the casting of studio-era films that has vanished now that youth is so overwhelmingly dominant on screen.
The careers of Boyer and Rains are exemplary of the lost art of the mature character star. Both men were older European stage actors (Rains was born in England in 1889, Boyer in France a decade later) who moved to the U.S. in the 1930s and soon thereafter established themselves at the major studios. Between them, they made well over a hundred films in Hollywood and received four Oscar nominations each, though neither won the prize. While Rains is better known today, thanks to his having had the luck to appear in such well-remembered pictures as The Adventures of Robin Hood and Casablanca, Boyer was far more famous in his lifetime, so much so that he became the model for a cartoon character, Chuck Jones’s Pepé le Pew. Yet they were regarded by their peers with like admiration, and their best films show that they worked in a broadly similar way.
What David Thomson has written of Rains was no less true of Boyer: “Technically, he often filled roles that were leads, but he treated them as character parts.” He did this in part because he had no alternative. Short, stocky and balding, Boyer looked nothing like the proverbial romantic leading man. What made him a star was his uncanny ability to enhance the performances of the celebrated actresses whose lovers he played, among them Ingrid Bergman, Bette Davis, Irene Dunne, Greta Garbo, Olivia de Havilland, and Hedy Lamarr. It was Lamarr with whom he shared the screen in John Cromwell’s Algiers (1938), the film that made him a celebrity (in which, however, he did not in fact say, “Come wiz me to ze Casbah,” the apocryphal line with which, like Bogart’s “Play it again, Sam,” his name is linked). Instead of trying to upstage his scene partners, he uplifted and complemented them, knowing that his mellifluous voice and intelligent characterizations—he had studied philosophy at the Sorbonne before going on the stage—would still make an impression.
As a result, Boyer became Hollywood’s all-purpose French lover, capable of moving with ease from comedy-tinted dramas like Frank Borzage’s History Is Made at Night (1937) to the spectacularly villainous role he played in George Cukor’s Gaslight (1944). Even after happily abandoning such parts in late middle age, he never lost his reputation for being the onscreen ladies’ man whom he portrayed most memorably and poignantly in Love Affair (1939), Leo McCarey’s tale of a shipboard romance that is transformed by separation and suffering into the love of a lifetime.1
Two inches shorter than Boyer, Rains had an equally distinctive speaking voice. “His precise, fine voice can give a chisel edge to the flattest sentiments,” Graham Greene wrote in 1938.2 It was so striking that it brought him his first major Hollywood role, that of the mad scientist in James Whale’s 1933 screen version of H.G. Wells’s The Invisible Man, who is heard but not seen until the end of the film. In part because of his modest stature, he scarcely ever played screen lovers, though he was more than capable of providing Boyer-like romantic support for women stars, as he did to superlative effect when he co-starred with Bette Davis in Vincent Sherman’s Mr. Skeffington (1944).
Mostly, though, Rains left the lovemaking to his colleagues, more frequently portraying such villains as Prince John in The Adventures of Robin Hood (1938). Even more satisfying are his portraits of decent but morally compromised men like the district attorney who prosecutes an innocent man in Mervyn LeRoy’s They Won’t Forget (1937) and the corrupt senator in Frank Capra’s Mr. Smith Goes to Washington (1939). Like Boyer, he had a knack for light comedy but used it instead to hint at the unscrupulous charm of dramatic characters like Captain Renault in Casablanca (1942). He also had a special gift for suggesting weakness, that hardest of qualities to convey compellingly on screen or stage, which he used to unforgettable effect in his greatest film role, that of the lovesick Nazi in Alfred Hitchcock’s Notorious (1946) who adores Ingrid Bergman but cannot free himself from the influence of his domineering mother.
For the most part, Rains and Boyer specialized in Hollywood melodramas of varying quality, ennobling them with their presence. But whenever they had the opportunity to appear in films of greater dramatic complexity, as Rains did in Notorious and in Gabriel Pascal’s screen version of George Bernard Shaw’s Caesar and Cleopatra (1945) and as Boyer did in Max Ophüls’s The Earrings of Madame de . . . (1953), they rose effortlessly to the occasion, just as they distinguished themselves in their later stage appearances, for which they typically chose serious fare. Indeed, Boyer’s performance in Charles Laughton’s 1949 revival of Shaw’s Don Juan in Hell brought him an entirely new kind of réclame, and the recording of the show released in 1952 proves that in addition to being the quintessential French screen lover, he was also a classical actor of exceptional accomplishment.
What made the careers of such actors possible? I’d submit it was the nature of life during the Great Depression.
Then as now, teenagers flocked to the movies—but so, too, did their parents, who had no interest in films about the splendors and miseries of adolescent life. In any case, both generations had been scarred by the continuing economic trauma through which they were still living, which had a profound effect on what they wanted to see on screen. Most teenagers of the ’30s and ’40s assumed that they would go to work or find a husband immediately after graduating from high school. Unless they were rich, those who went to college did so to prepare themselves to enter the world of work—and after 1940, teenage boys expected to go to a war from which they might not return. As a result, they were drawn to films that were consistent with their experience, as well as to mature actors to whose own “adultness” they could aspire. Hence the success of the fully adult movie stars, as well as character stars like Boyer, who was 40 (and looked older) when Love Affair was filmed, and Rains, who was 53 when he appeared in Casablanca.
After the war, a new kind of star emerged, exemplified by Marlon Brando, Montgomery Clift, and James Dean. Visibly and unmistakably younger than their predecessors, these handsome but slightly androgynous men looked—and acted—their age. Yet the phenomenon of the character star did not die out as a result of their success. Indeed, it saw a brief effulgence with the subsequent rise to fame of such older actors as Walter Matthau (born in 1920), Gene Hackman (born in 1930), and Robert Duvall (born in 1931). None was a conventional leading-man type, yet they started appearing in full-fledged starring roles in the early ’70s, a time when commercial American film was striving for a darker, more cynical tone that necessitated the casting of actors capable of embodying the masculine toughness and disillusion that are a product of maturity alone.
All this changed when George Lucas and Steven Spielberg started making big-budget films like Jaws (1975) and Star Wars (1977) whose simplistic plots and elaborate special effects were specifically intended to titillate an adolescent audience, one whose collective experience was profoundly different from that of its parents and grandparents. Instead of going straight from high school to work, the children of the baby boom experienced a prolongation of youth arising from the desire of their parents to give them an easier life. This is what changed the nature of movies in postwar America—and ultimately put an end to the phenomenon of the character star. We live in a different world now, one in which early maturity is not merely undervalued but actively shunned. It is thus inconceivable that a forty-something actor who looks like William Powell could make any headway in Hollywood today, much less become a name-above-the-title star.
One inevitably wonders what the children of the millennials will make of films like My Man Godfrey and actors like Powell, Charles Boyer, and Claude Rains, not to mention the better-remembered contemporaries with whom they worked. Will the unabashedly adult demeanor of these men, and of such similarly inclined actresses of the period as Joan Crawford, Bette Davis, and Barbara Stanwyck, seem even more alien to them? Or will the harshness of life in the 21st century force them to stop crying for the moon of eternal youth? If so, they may well come to reject the shallowness of the movies that their parents loved. But they will also have the studio-era films of the ’30s and ’40s—and the ’70s—to show them how mature men and women grapple with the problems of adult life.
2 Rains’s diction was self-made. Born into near-poverty, he spoke with a Cockney accent that he did not lose until he became a stage manager and started listening to Herbert Beerbohm Tree, his employer, one of the great British stage actors of the Vicwardian era.
Preening doesn't work.
Donald Trump’s demagogic rhetoric on the media is dangerous and un-American. When he describes reporters and editors as “enemies of the people,” or when he chuckles at Rodrigo Duterte’s remark that the media are “spies,” the president wounds the dignity of his office and America’s already-infirm civic health. The question is what the media should do to check the president’s rhetorical excesses.
One answer is for America’s mainstream newsrooms to tell the American people that reporters are not, in fact, the enemy and the president should cut it out with his anti-media crusade. For all the commercial pressures they face today, American journalists are still perched by prominent windows overlooking the national public square, which means they still get a hearing from the people down below when they wish.
I suppose that was the bright idea behind Thursday’s simultaneous publication of pro-media editorials in the editorial pages of 350 newspapers across the country. Participating outlets included national papers like the New York Times (naturally) as well as scores of regional ones, plus a few magazines and professional societies. If you can bring yourself to wade through one dull editorial after another, by all means: CNN has links to all 350.
But our mostly liberal colleagues in the edit-page business are fooling themselves if they imagine that this latest national teach-in will foster greater trust in the media, especially among the millions of Americans who in 2016 registered their discontent with the country’s establishment in toto by sending a vulgarian from Queens to the Oval Office. Those Americans—and not all of them were die-hard Trumpians—had had it with a prestige press that too often saw itself as an adjunct to the liberal cause rather than the cause of truth. They had had it with the subtle and not-so-subtle biases, the servility to liberal politicians, the contempt for their cherished beliefs and condescension for their ways of life.
To regain that trust, it will not do to condemn and condescend some more.
Jesse Brenneman, a New York City radio producer, understands this perfectly. In an ongoing satirical video series posted to Twitter, Brenneman documents his mockumentary-style road trip in “Trump country.” The aim is supposedly to understand the frustrations that led to Trump. But Brenneman mostly ends up yelling at the Trumpians from his driver’s seat, as his car zooms through regions like Central Pennsylvania: “WHY DID YOU DO IT? WHY DID YOU DO IT? WHY DID YOU DO IT? WHY DID YOU VOTE FOR HIM? WHY? WHY? WHY? IS IT ABORTION? DO WE NEED TO HAVE FEWER ABORTIONS? WHY DO YOU WATCH FOX NEWS? IT’S NOT TRUE, MUCH OF IT!”
Brenneman is gently chiding his own media comrades. But the 350 editorials amount to a self-serious version of the same thing: Trust us. We’re the media. What’s wrong with you!
Regaining trust requires something else. It requires factually sound reporting and the pursuit of the truth wherever it leads. It calls for reporters who conduct themselves professionally on social media, who don’t give vent to their anti-conservative animosities at every turn. And pundits who don’t change their minds about an issue merely because they find themselves on the same side as Trump. In short, it requires a return to journalism basics.
That’s hard work. Hectoring editorials are easier.
Yesterday, today, and tomorrow.
Andrew Cuomo, the governor of New York, told a stunned crowd on Wednesday that the United States of America “was never that great.” He followed that flat-footed line with a series of bromides about how America will “reach greatness” when mankind ceases to stereotype, discriminate, and degrade one another, but the damage was done. Cuomo’s primary opponent, the progressive insurgent and former actress Cynthia Nixon, mocked the governor for failing in the attempt to mimic “what a progressive sounds like.” That is a telling admission. Presumably, Nixon’s idealized “progressive” would more adroitly explain why American greatness is overstated.
You might think that President Donald Trump would take the opportunity presented by Cuomo’s faceplant to wrap himself in the flag, but he opted only to mock the Empire State’s executive for “having a total meltdown.” The president’s instincts are equally revealing. After all, the phrase “Make America Great Again” concedes that America is, at present, not all that great. This is an earnest conviction on Trump’s part.
In accepting the GOP presidential nomination, Trump painted a portrait of a country that was weak and failing: shackled by political correctness, riddled with violent crime, beset by dangerous migrants and violent refugees, subverted by craven politicians, and plagued by a crisis of confidence in its mission. That vision of the country was best summed up in the most memorable line from his first inaugural address: “American carnage.” Just 19 months later, the president insists that the nation has been made whole again, which is more a function of his competence than the national character.
These two provisory expressions of patriotism share more commonalities than distinctions. Everyone has their own definition of patriotism, and love of country should not be blind. Unwavering reverence is an expression of faith, not gratitude. Patriotism must know prudent limits, or it may come to justify venality and violence. But patriotism is distinct from an understanding of what makes the United States a great and exceptional nation.
America’s greatness is established in its Constitution. The nation’s founding charter endures because of two conditions that prevailed at the close of the 18th century. First, the collection of sovereign states that hammered out a national government was careful to premise a prospective Union on decentralization and federalism. That diffusion preserves local social and legal customs and, thus, domestic harmony. Second, the Constitution’s framers operated on the assumptions espoused by the Enlightenment’s leading luminaries, among them Lockean notions of legitimacy derived from the consent of the governed. These two conditions led James Madison to conclude in Federalist 51 that “the rights of individuals, or of the minority, will be in little danger from interested combinations of the majority” even while “all authority in it will be derived from and dependent on the society.”
It was also in Federalist 51 that Madison articulated a truth about human nature that has vexed prideful technocrats since the dawn of time: Mankind is flawed. The species cannot be perfected. Thus, “Ambition must be made to counteract ambition.” The revolutionary movements that followed America’s founding held this capitulatory revelation in low esteem. They sought to create “ideal” societies in which mankind’s contradictions and baser impulses would dissolve into a new social consciousness. It is no coincidence that those “ideal” revolutionary societies eventually descended into bloodshed, oppression, and disunion while America endured.
The Constitution’s amendments are equally exceptional. With a few lamentable deviations, the amendments are a set of negative rights that proscribe governmental action rather than establish that which the government can do. That is a paradigmatic triumph; it established as America’s baseline ethos the idea that human freedoms not expressly enumerated in the Constitution are implied. They do not flow from the beneficence of some far-off potentate. They are God-granted. The concept of unenumerated rights is as revolutionary today as it was in the 18th century, and it remains an alien notion outside the Anglophonic world.
America is capable of astonishing violence and repression, but it is equally adept at reconciliation and renewal. That capacity is rooted in Americans’ remarkable facility for compromise. The story of the United States is, in many ways, a story of compromise, and not all of those compromises are worthy of celebration. The facility Americans have for negotiation and concession has, however, forged a government and kept it. It is what has made the United States the most successful experiment in cultural intermixing in human history. It is what fortifies its incredible capitalist dynamism. And its commerce remains the greatest vehicle for achieving equality, meritocracy, and human flourishing ever devised.
So much of what America’s critics lament about the country’s inherent flaws—its hostility toward collectivism, the ruthlessness of its entrepreneurial spirit, its manic bouts of isolationism and extroversion on the world stage, and the tensions between old and new immigrants—are outgrowths of the traits that make it extraordinary. The nation’s commitment to pluralism, egalitarianism, and unity around shared principles rather than cultural, tribal, or subnational bonds is what makes America unique among nations. It will never stop striving to achieve the ideals of its founding; ideals are, after all, often unattainable. But its shared creed is the North Star toward which the United States has looked for a quarter millennium.
All these things that make America great are hardly immutable traits, and some careless future generation may one day abandon them. But despite America’s weakness for fad and experimentation, those fundamental tenets have proven resistant to change. As Jonah Goldberg observed in Suicide of the West, Thomas Jefferson’s assertion that “all men are created equal” cannot be improved upon. Any effort to amend that claim would be a regression to a more primitive state. That and the many other gifts that the founding generation left behind ensured that the United States was a uniquely magnificent nation on day one. Don’t let any politician tell you otherwise.
The limits of religious liberty.
Jack Phillips once more finds himself on the sharp end of liberal “tolerance.” He was the Colorado baker at the center of the Masterpiece Cakeshop case, the one who in 2012 refused to bake a cake for a same-sex wedding. A state civil-rights commission censured Phillips and ordered him to undergo ideological retraining. But a 7-2 majority of the U.S. Supreme Court found that the commission had exhibited such overt hostility to Phillips’s religious views as to have violated the state’s “obligation of religious neutrality” under the First Amendment.
But it appears the commission didn’t get the message. The Alliance Defending Freedom, which represented Phillips in the original case, reports:
On June 26, 2017, the same day that the Supreme Court agreed to take up Masterpiece Cakeshop v. Colorado Civil Rights Commission, an attorney asked Phillips to create a cake designed pink on the inside and blue on the outside, which the attorney said was to celebrate a gender transition from male to female. Phillips declined the request because the custom cake would have expressed messages about sex and gender identity that conflict with his religious beliefs. Less than a month after the Supreme Court ruled for Phillips in his first case, the state surprised him by finding probable cause to believe that Colorado law requires him to create the requested gender-transition cake.
This time, however, Phillips and the ADF are taking the fight to the state. On Tuesday, the ADF filed a lawsuit against Colorado Governor John Hickenlooper and the members of the commission, alleging anti-religious bullying and harassment of Phillips aimed at ruining his business and livelihood.
Many religious conservatives see this new case as an opportunity to “firm up” the Court’s Masterpiece holding. If it makes it to the Supreme Court, especially one with a Justice Kavanaugh, there is a good chance that Americans will end up with sturdier protections against illiberal liberalism than former Justice Anthony Kennedy’s whimsical jurisprudence permitted.
But by my lights, the renewed persecution of Phillips also reveals the limits of “religious liberty” as a sword and organizing principle for the right. As I predicted when the original decision was handed down,
the inner logic of today’s secular progressivism puts the movement continually on the offensive. A philosophy that rejects all traditional barriers to individual autonomy and self-expression won’t rest until all “thou shalts” are defeated, and those who voice them marginalized. For a transgender woman to fully exercise autonomy, for example, the devout Christian, Muslim, or Jew must recognize her as a woman. People of faith and others who cling to traditional views must publicly assent to what they don’t believe.
And here we are. “Religious freedom,” without a substantive politics that offers a vision of the common good, can easily allow liberalism to frame traditional moral precepts as little more than superstitions best relegated to the private sphere of the mind. Under the banner of liberty, religious conservatives might win procedural victories here and there. But they will be cornered in the long term.
Whatever Donald wants, he's gonna get it.
What do Republicans believe? Whatever Donald Trump tells them they should believe, it seems.
In survey after survey, self-described Republicans—admittedly a severely truncated demographic in the Trump era—are surrendering not just principle but common sense to whatever Trump needs them to say at the moment. The positions Republicans adopt to prop up the president are often so outside the American right’s traditional credo that it’s hard to believe they’re being honest.
According to a June Axios-sponsored SurveyMonkey poll, a whopping 92 percent of Republicans believe the conventional press deliberately runs with false or misleading stories. That’s not especially surprising. Republicans have a long-standing grievance with the mainstream media, and nearly three-quarters of all respondents in this survey agree with them. What is unique and, frankly, disturbing is the apparent resolve of GOP voters to do something about it.
A Quinnipiac University survey released last week showed that a majority of Republicans agree with the Trump White House’s determination that the press is the “enemy of the people.” An Ipsos poll released around the same time confirmed that close to a majority of GOP voters believe “the news media is the enemy of the American people.” That same poll showed that a significant plurality of GOP voters—43 to 36 percent—think Trump should have the expressly unconstitutional authority to shutter media outlets with which he disagrees. Or, rather, “news outlets engaged in bad behavior,” whatever that means.
The GOP base also seems generally unfazed by Donald Trump’s bizarre rhetorical deference to Russian President Vladimir Putin. An Economist-backed YouGov poll in early July showed 56 percent of the GOP said that “Donald Trump’s relationship with Vladimir Putin is mostly a good thing for the United States,” while only 40 percent said that the United States should remain a member of the NATO alliance. In 2014, only 22 percent of Republicans thought of Russia as friendly toward or allied with the United States. Today, via Gallup, that’s up to 40 percent of Republicans.
Given that, it’s no surprise that 70 percent of self-identified Republicans broke with the vast majority of the public and gave the president high marks for his press conference alongside Putin, in which he disparaged his own Cabinet and intelligence officials and heaped praise upon the autocrat in the Kremlin.
Donald Trump’s rhetorical servility toward Putin contrasts greatly with his administration’s admirably hawkish posture toward Moscow, but don’t ask Republican voters to reconcile these contradictions. A July Fox News poll found that 57 percent of GOP voters think that Trump’s toughness toward Russia is “about right.” So, which is it?
“An attack on law enforcement is an attack on all Americans,” Donald Trump said to the applause of Republicans as he accepted the party’s presidential nomination. Ever since, the president has occupied his time attacking law enforcement, and Republicans are with him all the way.
Seventy-five percent of Republicans in a recent poll agree with the president that the special counsel’s office established by a Trump-appointed deputy attorney general is conducting a “witch hunt” targeting him and his allies. This position concedes that the 13 Russian nationals, 12 Russian intelligence officers, and five Americans who pleaded guilty to various crimes as a result of Robert Mueller’s work retain the president’s full faith and confidence. But perhaps that conclusion takes the average Republican voter literally and not seriously.
More seriously, six in ten Republicans tell pollsters that they believe the FBI is actively trying to frame the President of the United States for a crime. Logically, then, it stands to reason that most in the GOP believe that law enforcement is a politicized institution that is waging an underhanded campaign to de-legitimize an election and carry out something akin to a coup. In February, Reuters/Ipsos found that 73 percent of Republicans believe just that. But if the coup narrative were true and an existential threat to the foundations of the Republic had been uncovered, would Republicans really behave as they are—placidly allowing Democrats to out-raise, out-organize, and out-campaign GOP candidates consistently for over 18 months?
Even in trifling matters in which the stakes are so low that they hardly merit the effort it takes to lie—like the president’s baseless claim that “between 3 million and 5 million people voted illegally in the 2016 presidential election,” thus robbing Trump of a popular vote victory in 2016—a majority of Republican voters are willing to compromise themselves. And only to spare the president from the shame of trivial embarrassment.
Some contend that these results are an outgrowth of the fact that voters have deciphered the pollster’s game. Respondents are savvy enough to know when survey-takers are genuinely trying to take the public’s temperature on an issue and when they are merely seeking to exacerbate tensions within the GOP camp. Thus, this line of reasoning goes, respondents who support Trump are more likely to answer questions in a way that demonstrates their fealty toward the president even if they don’t necessarily hold that position. In other words, these are all lies. Maybe that’s true, but it’s cold comfort. The lies we tell ourselves become our truth if we tell them often enough.
Early in the morning of July 19, after eight hours of debate, the Knesset passed by a vote of 62–55 (with two abstentions) a law codifying Israel’s status as the national home of the Jewish people. First introduced in 2011 by the centrist Kadima Party, the so-called nation-state bill joined more than a dozen “Basic Laws” that now function as Israel’s unwritten constitution. Its 11 paragraphs mostly restate long-operative principles of Israeli democracy: Hebrew is the national language, “Hatikvah” is the national anthem, the menorah is the national emblem, Jerusalem is the nation’s capital, and Israel is where the self-determination of the Jewish nation is exercised.
One might find it surprising that such generalities would provoke a global outcry. Then again, Israel and selective indignation seem to go together like peanut butter and jelly. Criticisms run the gamut from saying the law is unnecessary and provocative to saying it’s racist and anti-democratic. The Israeli left, in alliance with Israel’s minority Arabs and Druze, has marched in the streets. Institutions of the Jewish Diaspora have called for the law’s repeal. They have found themselves, rather uneasily, on the same side of the debate as anti-Zionists and Israel-haters in the West Bank and Gaza Strip, in Muslim capitals, and in the EU and UN. “The spirit of Hitler, which led the world to a great catastrophe, has found its resurgence among some of Israel’s leaders,” said Turkey’s Recep Tayyip Erdogan.
Leaving aside anti-Semites such as Erdogan, reasonable people and friends of Israel may disagree about the necessity and utility of the nation-state law. Such disagreement, however, ought to be based on facts. And facts have been sorely lacking in recent discussions of Israel—thanks to an uninformed, biased, and one-sided media. Major journalistic institutions have become so wedded to a pro-Palestinian, anti–Benjamin Netanyahu narrative, in which Israel is part of a global trend toward nationalist authoritarian populism, that they have abdicated any responsibility for presenting the news in a dispassionate and balanced manner. The shameful result of this inflammatory coverage is the normalization of anti-Israel rhetoric and policies and widening divisions between Israel and the Diaspora.
For example, a July 18, 2018, article in the Los Angeles Times described the nation-state law as “granting an advantageous status to Jewish-only communities.” But that is false: The bill contained no such language. (An earlier version might have been interpreted in this way, but the provision was removed.) Yet, as I write, the Los Angeles Times has not corrected the piece that contained the error.
On July 19, in the New York Times, David M. Halbfinger and Isabel Kershner wrote that the Knesset’s “incendiary move” had been “denounced by centrists and leftists as racist and anti-democratic.” Why? Because the law “omits any mention of democracy or the principle of equality.” But that is because other Basic Laws already have codified the democratic and egalitarian character of Israel, including two laws dealing specifically with human rights.
The nation-state law, the Times continued, also “promotes the development of Jewish communities, possibly aiding those who would seek to advance discriminatory land-allocation policies.” Put the emphasis on possibly, because there’s nothing in the law to provide such aid.
Indeed, the nation-state law contains no additional rights for Jews; nor does it promulgate fewer rights for Arabs. Halbfinger and Kershner went on to say that the law “downgrades Arabic from an official language to one with a ‘special status.’” But then, far into the piece, the writers also acknowledged that “it is largely a symbolic sleight since a subsequent clause says, ‘This clause does not harm the status given to the Arabic language before this law came into effect.’”
A July 22 front-page article in the Times by Max Fisher was headlined “Israel Picks Identity Over Democracy. More Nations May Follow.” This was a funny way to characterize a law that had won majority support, following parliamentary procedure, of a democratically elected legislative body. Such through-the-looking-glass analysis riddled this piece, as well as the additional four news articles and four op-eds the Times has published on the matter at the time of this writing. In these pieces, “democracy” is defined as “results favored by the New York Times editorial board,” and Israel’s national self-understanding is in irrevocable conflict with its democratic form of government.
Fisher’s “Interpreter” column began with an anecdote recalling how David Ben-Gurion “emerged from retirement in July 1967” and “insisted that Israel give up the territories it had conquered” after repelling the invasion of three Arab armies a month earlier. Unfortunately for Fisher, this dramatic episode seems to be apocryphal. Historian Martin Kramer, after exhaustive research, concluded, “There’s no evidence that Ben-Gurion warned Israelis that their victory ‘had sown the seeds of self-destruction,’ either in July 1967 or later.” Fisher stands by his story.
The questionable claims did not stop there. “The quality of Israeli democracy has been declining steadily since the early 2000s,” Fisher continued, an era that just happens to have coincided with the rise of Israeli statesmen whose politics he and the political scientists he cites find detestable.
Fisher also mentioned a “wave of horrific violence known as the Second Intifada, which killed far more Palestinians than Israelis, [and] included shocking terrorist attacks in previously safe Israeli enclaves.” But where did this violence come from? Who committed the shocking terrorist acts? It’s left unsaid.
Denying Arab agency is a longstanding habit of Israel’s critics. And that is what’s noteworthy about these often-hysterical reactions to the nation-state law: The stories use the legislation merely as a jumping-off point for larger complaints about Israel’s Jewish character. For these writers, this isn’t a debate over the Israeli flag. It’s a debate over Jewish nationalism and a proxy for the Israeli–Palestinian conflict.
In a July 24 “Ideas” piece for Time, Ilene Prusher wrote, “It’s not clear that the equality outlined in the founders’ vision statement”—that’s progressive-speak for “Declaration of Independence”—“remains a goal. It’s certainly far from reality.” Prusher continued, “The new law provides legal teeth for discrimination that is currently de facto” and, citing a left-wing law professor at Hebrew University, “essentially makes discrimination constitutional.”
No, it doesn’t, actually. Rather than speculate, the nation-state bill’s opponents might try examining the actual text, which says absolutely nothing about discrimination. As Eugene Kontorovich of Northwestern University said during a recent episode of the Jewish Leadership Conference podcast, “Anything can be perverted—but that does not mean everything is perverse.”
The truth is that democracy is thriving in Israel. So are many of the values one normally associates with (egad!) the New York Times. Last I checked, Israel is the one country in the Middle East where you can attend an LGBT Pride parade. Noah Efron, a critic of the nation-state law, points out that the proportion of women serving in the Knesset is higher than in the U.S. Congress or the average EU parliament. There is universal health care. “Alone among Western democracies,” Efron adds, “labor unions have grown bigger and stronger in Israel over the past decade.” Minority citizens are guaranteed the same rights as Jewish ones. And it is precisely these achievements that are sustained by Israel’s Jewish character and traditions.
The Times quoted Avi Shilon, a historian at Ben-Gurion University, who said dismissively, “Mr. Netanyahu and his colleagues are acting like we are still in the battle of 1948, or in a previous era.” Judging by the fallacious, paranoid, fevered, and at times bigoted reaction to the nation-state bill, however, Bibi may have good reason to believe that Israel is still in the battle of 1948, and still defending itself against assaults on the very idea of a Jewish State.