The real difference in the latter-day reputations of Rodgers’s principal collaborators is that while he and Hart wrote nearly 30 musicals together, none of them has held the stage. Only two of their shows, On Your Toes (1936) and Pal Joey (1940), have ever been successfully revived—one time apiece—on Broadway. Conversely, five of the nine stage musicals written by Rodgers and Hammerstein continue to be mounted around the world and produced on Broadway every few seasons. Compared with Oklahoma!, Carousel, South Pacific, The King and I, and The Sound of Music, the shows of Rodgers and Hart are mere footnotes to the history of musical comedy.
Not so, however, their 800-odd songs. “The Lady Is a Tramp,” “My Funny Valentine,” “Spring Is Here,” “Where or When”: These songs, and many others like them, remain ubiquitous and beloved. Why have they outlived the musicals for which they were created—and what manner of man was capable of writing lyrics for them that were so widely varied in emotional tone?
Larry, Teddy’s older brother, was no less stagestruck and was seriously interested in Shakespeare from boyhood onward. But American playwrights had little to offer the serious-minded theatergoer at the turn of the 20th century, so he embraced the Broadway musical, which had just started to take shape as a unique genre. Uninterested in the flowery Central European operettas that then dominated the American musical-comedy stage, Hart looked for inspiration to W.S. Gilbert’s immaculately crafted lyrics and the small-scale musicals of Jerome Kern, P.G. Wodehouse, and Guy Bolton. Richard Rodgers, seven years Hart’s junior, was equally impressed by the Kern-Wodehouse-Bolton shows, recalling in 1975 that they “tried to deal in a humorous way with modern, everyday characters,” and the two men launched a writing partnership of their own shortly after they met in 1919.
The most widely remarked aspect of Hart’s style was and is its humor. The wittiest lyricist of the golden age of American popular song, he emulated Gilbert’s virtuosic use of rhyme for comic purposes, deftly translating it into the American vulgate: “When love congeals / It soon reveals / The faint aroma of performing seals, / The double-crossing of a pair of heels.” He was also a master of the twin arts of compression and surprise, and much of his humor lies in the economical way in which he makes his dramatic points. In “Bewitched, Bothered and Bewildered,” for example, he suggests in just 12 well-chosen words (all but one a monosyllable) that the singer’s attraction to her loutish partner is purely sexual: “He’s a laugh, but I love it / Because the laugh’s on me.”
Unlike Gilbert, Hart could be slipshod about fitting his words to Rodgers’s tunes, enough so that Stephen Sondheim called him “the laziest of the pre-eminent lyricists.”1 He was also inclined to cram too many rhymes into his lyrics, which can sound cluttered and self-conscious as a result. But Hart could also write plainly and directly about love, and whenever he did so, his straightforwardness was an ideal verbal complement to Rodgers’s broad, sweeping melodies.
All this notwithstanding, the most distinctive quality of his work is its deep-dyed disillusion, which not infrequently comes across as harsh cynicism: “Caring too much is such / A juvenile fancy. / Learning to trust is just / For children in school.” Just as often, though, it manifests itself in the melancholy romanticism of “Glad to Be Unhappy” (“Unrequited love’s a bore, / And I’ve got it pretty bad”) or the hopeless despair of “This Funny World” (“If you’re beaten, conceal it! / There’s no pity for you”). In all these modes, Hart stands out from every other songwriter of the ’20s and ’30s, and the fact that his saddest lyrics were invariably set to major-key tunes by Rodgers makes the contrast still more striking.
Why did Hart write such bleak lyrics? Because he took a bleak view of his own life. Rodgers’s account of their first meeting hints at one of the sources of his partner’s dissatisfaction, which was that he believed himself to be physically unattractive:
The total man was hardly more than five feet tall. He wore frayed carpet slippers, a pair of tuxedo trousers, an undershirt and a nondescript jacket. His hair was unbrushed, and he obviously hadn’t had a shave for a couple of days. . . . Feature for feature he had a handsome face, but it was set in a head that was a bit too large for his body and gave him a slightly gnomelike appearance.
Dark and balding, Hart was so painfully self-conscious about his height that he wore elevator shoes. He was also homosexual, and there is no evidence that he ever formed a lasting romantic relationship with anyone.2 Hart was discreet about his sexuality, making it impossible to speak with assurance about his interior life. Still, everyone who knew him agreed that (in the words of Alan Jay Lerner) “the joy of his professional success became drowned in the lost misery of his handicapped life,” and that he drank to increasingly perilous excess for that reason. But there was another compelling reason for Hart’s chronic disappointment: He found it all but impossible to write for Broadway in the theatrically serious way to which he had aspired from the start of his career.
Hart believed deeply in the potential of the musical as a dramatic art form, going so far as to assure Alec Wilder that “all his lyrics were concerned with character delineation and plot.” He also believed in the storytelling power of dance, so much so that he and Rodgers worked with George Balanchine, the greatest choreographer of the 20th century, on four shows: On Your Toes, Babes in Arms (1937), I Married an Angel (1938), and The Boys from Syracuse (1938). But with the sole exception of Pal Joey, the books of their musicals were not nearly so distinguished as their scores. Unable to write his own books, Hart elected instead to collaborate with journeymen like Herbert Fields on weightlessly silly “original” stories, and he knew that he was shortchanging his own potential by doing so.
In this respect, of course, Hart was no different from anyone else writing for Broadway in the ’20s and ’30s. Only Show Boat (1927) and Porgy and Bess (1935) had aspired to do more than provide light entertainment to tired businessmen and their wives. But he longed in vain to aim higher, and Hollywood failed to offer him a more challenging alternative, though two of the first movies for which he and Rodgers wrote songs did more than hint at the untapped promise of the new medium: Love Me Tonight (1932, directed by Rouben Mamoulian) and Hallelujah, I’m a Bum (1933, directed by Lewis Milestone) are film musicals whose staging and scores were strikingly innovative. Alas, the creative decisions in Hollywood were made by producers, not writers, and Rodgers and Hart never again worked on films of like quality, nor were the watered-down screen versions of their stage shows at all satisfactory.
Then came Pal Joey (1940), written in collaboration with John O’Hara, who had suggested to Rodgers that his popular New Yorker stories might be turned into a musical. The result was an edgy urban fable about a heartless nightclub hoofer (played by Gene Kelly on Broadway) who catches the eye of a man-hungry socialite who is prepared to stake him to a club of his own in return for his services in bed. O’Hara put Joey Evans’s sleazy world on the stage so vividly that even his stage directions are memorable: “A cheap night club, on the South Side of Chicago. Not cheap in the whorehouse way, but strictly a neighborhood joint.”
But it was only because of the redeeming presence of such beautiful ballads as “Bewitched, Bothered and Bewildered” and “I Could Write a Book” that Broadway audiences were willing to stomach the plot of Pal Joey, which even now can come across as joltingly hard-boiled. And while the show ran for 374 performances, Brooks Atkinson’s priggish New York Times review devastated Hart: “If it is possible to make an entertaining musical comedy out of an odious story, Pal Joey is it . . . . Although [it] is expertly done, can you draw sweet water from a foul well?” Atkinson was the most perceptive of all the major newspaper critics, and his inability to appreciate the show’s originality dealt a fatal blow to its lyricist’s shaky self-confidence.
Hart had long since become a full-fledged alcoholic, but his drinking had not yet started to affect the quality of his work. After Pal Joey, though, he went off the rails, and he did so just as Rodgers became interested in writing a musical based on a 1930 play, Lynn Riggs’s Green Grow the Lilacs, whose subject matter, life in the Oklahoma Territory at the turn of the 20th century, was unsympathetic to Hart. Not only was Rodgers determined to proceed with the project, but he was tired of putting up with Hart’s drunkenness, so he decided to do the show with Oscar Hammerstein, who had already suggested to Rodgers that it was time for him to write a musical with “a book of ‘substance.’” Hart told his partner to go ahead without him. “I don’t know how you put up with me all these years,” he added. “The best thing would be for you to forget about me.”
The success in 1943 of what came to be known as Oklahoma! was the beginning of the end for Hart. Hoping against hope that work would give his old friend a reason to live, Rodgers asked him to help revise one of their earlier shows, a 1927 musical version of Mark Twain’s A Connecticut Yankee in King Arthur’s Court. But it was too late: The revival opened in November of 1943, and Hart was hospitalized the next morning, dying four days later. His last words were “What have I lived for?”
The 1952 Broadway revival of Pal Joey proved to be unexpectedly successful—it ran for 540 performances, 166 more than the original production—and Atkinson acknowledged no less unexpectedly in his New York Times review that he had failed to properly appreciate the show 12 years earlier:
There was a minority, including this column, that was not enchanted. But no one is likely now to be impervious to the tight organization of the production, the terseness of the writing, the liveliness and versatility of the score, and the easy perfection of the lyrics.
By then, though, it was clear that the musicals of Rodgers and Hammerstein were far better constructed than anything that Rodgers and Hart had written, Pal Joey excepted, and Pal Joey itself gradually faded from sight. The unsuccessful 2008 Broadway revival made use of a completely (and ineptly) rewritten book by Richard Greenberg, and while On Your Toes was revived with commercial success in 1983, its popularity was substantially due to the show’s choreography and dancing.
On the other hand, the songs of Rodgers and Hart finally came into their own in the ’50s, when pop and jazz singers like Frank Sinatra, Ella Fitzgerald, and Peggy Lee started to perform and record them regularly. These performances, especially Sinatra’s, were chiefly responsible for establishing pre-rock American popular song as a recognized genre whose sophistication set it apart from other, more plebeian styles of pop music. The tribute to Hart and his composing partner that Cole Porter had penned in 1939 became a byword: “It’s smooth, it’s smart. / It’s Rodgers, it’s Hart!” As much as postwar theatergoers delighted in the wholesome musicals of Rodgers and Hammerstein, no one ever called them smooth or smart.
Unlikely as it would have seemed at the time of his death three-quarters of a century ago, Hart’s lyrics are even more widely admired today than they were in his lifetime. Uprooted from their flimsy theatrical contexts, they have turned out to be uncommonly hardy. If anything, their dark disillusion speaks even more powerfully to postmodern listeners than it did to those who first heard them in the ’20s and ’30s, and for those who know the double-edged feeling that Hart described when he wrote in “Glad to Be Unhappy” that “for someone you adore, / It’s a pleasure to be sad,” no other popular songs are quite so poignant.
1 A notorious example of Hart’s careless prosody is the opening line of “There’s a Small Hotel,” in which he accents the first two words of the song’s title, throwing away “small hotel.”
2 On one of the rare occasions that Hart is known to have spoken to Rodgers (or anyone else) about his sex life, he hinted that he favored cash-and-carry encounters with the lower-class men who are known in the gay world as “rough trade.”
Good intentions, tragic consequences.
Chicago, Illinois — Andy has little time to chitchat. There are hundreds of hot towels to sort and fold, and when that’s done, there are yet more to wash and dry. The 41-year-old is one of half a dozen laundry-room workers at Misericordia, a community for people with disabilities in the Windy City. He and his colleagues, all of whom are intellectually disabled and reside on the Misericordia “campus,” know that their work has purpose, and they delight in each task and every busy hour.
In addition to his job at the laundry room, Andy holds two others. “For two days I work at Sacred Heart”—a nearby Catholic school—“and at Target. Target is a store, a big super-store. At Sacred Heart, I sweep floors and tables.”
“Ah, so you’re the janitor there?” I follow up.
“No, no! I just clean. I love working there.”
Andy’s packed schedule is typical for the higher-functioning residents at Misericordia, many of whom juggle multiple jobs. Their work at Misericordia helps meet real community needs—laundry, recycling, gardening, cooking, baking, and so on—while preparing residents for the private labor market. Andy has already found competitive employment (at Target), but many others rely on Misericordia’s own programs to stay active and employed.
Yet if progressive lawmakers and minimum-wage crusaders have their way, many of these opportunities will disappear, along with the Depression-era law that makes them possible.
The law, Section 14(c) of the Fair Labor Standards Act, permits employers to pay people with disabilities a specialized wage based on their ability to perform various jobs. It thus encourages the hiring of the disabled while ensuring that they are paid a wage commensurate with their productivity. The law safeguards against abuse by, among other things, requiring employers to regularly review and adjust wages as disabled employees make productivity gains. Many of these employers are nonprofit entities that exist solely to provide meaningful work for the disabled.
Only 20 percent of Americans with disabilities participate in the labor force. The share is even smaller among those with intellectual and developmental disabilities. For this group, work isn’t about money—most of the Misericordia residents are oblivious to how much they get paid—so much as it is about purpose and community. What the disabled seek from work is “the feeling of safety, the opportunity to work alongside friends, and an atmosphere of kindness and understanding,” says Scott Mendel, chairman of Together for Choice, which campaigns for freedom of choice for the disabled and their families. (Mendel’s daughter, who has cerebral palsy, lives and works at Misericordia.)
Abstract principles of economic justice, divorced from economic realities and the lived experience of people with disabilities, are a recipe for disaster in this area. Yet that’s the approach taken by too many progressives these days.
Last month, for example, seven Senate progressives led by Elizabeth Warren of Massachusetts wrote a letter to Labor Secretary Alexander Acosta denouncing Section 14(c) for setting “low expectations for workers with disabilities” and relegating them to “second-class” status. The senators also took issue with so-called sheltered workshops, like those at Misericordia, which are specifically designed to help the disabled find pathways to market employment. Activists at the state level, meanwhile, continue to press for the abolition of such programs, and they have already succeeded in restricting them in a number of jurisdictions, most notably in Pennsylvania, where such settings have been all but eliminated.
While there have been a few notorious cases of 14(c) and sheltered-workshop abuse over the years, existing law provides mechanisms for punishing firms for misconduct. Getting rid of 14(c) and sheltered workshops, however, could leave hundreds of thousands of disabled people unemployed. Activists have yet to explain what it is they expect these newly jobless to do with their time.
Competitive employment simply isn’t an option for many of the most disabled. And even those like Andy, who are employed in the private economy, tend to work at most 20 hours a week at their competitive jobs. What would they do with the rest of their time if sheltered workshops didn’t exist? Most likely, they would “veg out” in front of a television. Squeezing out 14(c) programs and forcing private employers to pay the minimum wage to workers whose productivity falls far short of the norm wouldn’t improve the lot of the disabled; it would leave them jobless.
Economic reality is reality no less for the disabled.
Nor have progressives accounted for the effects on the lives of the disabled in jurisdictions that have restricted sheltered workshops. “None of these states have done an adequate job of ascertaining whether these actions actually enhanced the quality of life for the individuals affected,” a study in the Social Improvement Journal concluded last year. Less time in sheltered workshops, the study found, “was not replaced with a corollary increase in the use of more integrated forms of employment.” Rather, “these individuals were essentially unemployed, engaging in made-up day activities.”
Make-work is not what Andy and his colleagues are up to today at Misericordia. They complete real tasks, which benefit their fellow residents in concrete ways. “This work is training, but it also gives them meaning,” one Misericordia director told me. “It’s not just doing meaningless work, but it’s going toward something. We’re not setting them up to do something that someone else takes apart. This is something that’s needed.” Yet, in the name of economic justice, progressives are on the verge of depriving men and women like Andy of the dignity of work and the freedom of choice that non-disabled Americans take for granted.
Reminding voters what Democratic governance means.
To paraphrase New York Times columnist Ross Douthat (with apologies), the less Republicans do in office, the more popular they generally become. That is, when the GOP exists solely in voters’ minds as a bulwark against cultural and political liberalism, it can cobble together a winning coalition. Likewise, Democrats regain the national trust when they serve only as an obstacle to Republican objectives. It’s when both parties begin to talk about what they want to do with their power that they get into trouble.
That is an over-simplification, but the core thesis is an astute one. In an age of negative partisanship and without an acute foreign or domestic crisis to focus the national mind, it’s not unreasonable to presume that both parties’ chief value is defined in negative terms by the public. Considering how little of the national dialogue has to do with policy these days, general principles and heuristics are probably how most marginal voters navigate the political environment.
Somewhere along the way, though, Democrats managed to convince themselves that they cannot just be the anti-Donald Trump party. Their most influential members have become convinced that the party needs to articulate a positive agenda beyond a set of vague principles. For the moment, Democrats who merely want to present themselves as unobjectionable alternatives to Trumpism without going into much broader detail appear to be losing the argument.
According to a study of campaign-season advertisements released on Friday by the USA Today Network and conducted by Kantar Media’s Campaign Marketing Analysis Group, Democrats are not leaning into their opposition to Trump. While over 44,000 pro-Trump advertisements from Republican candidates have aired on local broadcast networks, only about 20,000 Democratic ads have highlighted a candidate’s anti-Trump bona fides. “Trump has been mentioned in 27 [percent] of Democratic ads for Congress, overwhelmingly in a negative light,” the study revealed. In the same period during the 2014 midterm election cycle, by contrast, 60 percent of Republican advertisements featured President Barack Obama in a negative light.
There are plenty of caveats that should prevent observers from drawing too many broad conclusions about what this means. First, comparing the political environment of 2018 with that of 2014 is comparing apples and oranges. Recall that 2014 was Barack Obama’s second midterm election, and enthusiasm among the incumbent party’s base to rally to the president’s defense naturally wanes at that point while the “out-party’s” anxiety over the incumbent president grows. If Donald Trump’s job-approval rating is still anemic in September, it is reasonable to expect that Republican candidates will soft-pedal their support for the president just as Democrats did in 2010. Second, Democrats running against Democrats in a Democratic primary race may not feel the need to emphasize their opposition to the president, since that doesn’t create a stark enough contrast with their opponent.
And yet, the net effect of the primary season is the same. Democrats aren’t just informing voters of their opposition to how Trump and the Republican Party have managed the nation’s affairs; they’re describing what they would do differently. By and large, the Democratic Party’s agenda consists of “doubling” spending on social-welfare programs, education, and infrastructure, and promising a series of five-year-plan prestige projects. But Democratic candidates are also leaning heavily into divisive social issues.
The themes that Democratic ads have embraced so far range from support for new gun-control measures (“f*** the NRA,” was one New Mexico candidate’s message), to protecting public funding for Planned Parenthood, to promoting support for same-sex marriage rights, to attacking Sinclair Broadcasting (which happened to own the network on which that particular ad ran). A number of Democratic candidates are running on their support for a single-payer health-care system, including the progressive candidate in Nebraska’s GOP-leaning 2nd Congressional District who narrowly defeated an establishment-backed former House member this week, putting that seat farther out of the reach of Democrats in November.
In the end, messages like these animate the Democratic Party’s progressive base, but they have the potential to alienate swing voters. That may not be enough to overcome the electorate’s tendency to reward the “out-party” in a president’s first midterm election. And yet, the risk Democrats run by being specific about what they actually want to do with renewed political power cannot be dismissed. Democrats in the activist base are convinced that embracing conflict-ridden identity politics is a moral imperative, and the party’s establishmentarian leaders appear to believe that being anti-Trump is not enough to ensure the party’s success in November. All the while, the Democratic Party’s position in the polls continues to deteriorate.
Meritocracy is in the eye of the beholder.
A running theme in Jonah Goldberg’s fantastic new book, Suicide of the West, is the extent to which those who were bequeathed the blessings associated with classically liberal capitalist models of governance are cursed with crippling insecurity. Western economic and political advancement has followed a consistently upward trajectory, albeit in fits and starts. Yet, the chief beneficiaries of this unprecedented prosperity seem unaware of that fact. In boom or bust, the verdict of many in the prosperous West remains the same: the capitalist model is flawed and failing.
Capitalism’s detractors are as likely to denounce the exploitative nature of free markets during a downturn as they are to lament the displacement and disorientation that follow when the economy roars. The bottom line is static; only the emphasis changes. Though this tendency is a bipartisan one, capitalism’s skeptics are still more at home on the left. With the lingering effects of the Great Recession all but behind us, the liberal argument against capitalism’s excesses has shifted from mitigating its effects on low-skilled workers to warning about the pernicious effects of prosperity.
Matthew Stewart’s expansive piece in The Atlantic this month is a valuable addition to the genre. In it, Stewart attacks the rise of a permanent aristocracy resulting from the plague of “income inequality,” but his argument is not a recitation of the Democratic Party’s 2012 election themes. It isn’t just the mythic “1 percent” (or, in the author’s estimation, the “top 0.1 percent”) but the top 9.9 percent that has not only accrued unearned benefits from capitalist society but has fixed the system to ensure that those benefits are hereditary.
Stewart laments the rise of a new Gilded Age in America, which is anecdotally exemplified by his own comfort and prosperity—a spoil he appears to view as plunder stolen from the blue-collar service providers he regularly patronizes. You see, he is a member of a new aristocracy, which leverages its economic and social capital to wall itself off from the rest of the world and preserve its influence. He and those like him have “mastered the old trick of consolidating wealth and passing privilege along at the expense of other people’s children.” This corruption, like Stewart’s insecurity, is, he contends, a product of consumerism. “The traditional story of economic growth in America has been one of arriving, building, inviting friends, and building some more,” Stewart wrote. “The story we’re writing looks more like one of slamming doors shut behind us and slowly suffocating under a mass of commercial-grade kitchen appliances.”
Though he diverges from the kind of scientistic Marxism reanimated by Thomas Piketty, Stewart nevertheless appeals to some familiar Soviet-style dialectical materialism. “Inequality necessarily entrenches itself through other, nonfinancial, intrinsically invidious forms of wealth and power,” he wrote. “We use these other forms of capital to project our advantages into life itself.” In this way, Stewart can have it all. The privilege enjoyed by the aristocracy is a symptom of Western capitalism’s sickness, but so, too, are the advantages bestowed on the underprivileged. Affirmative action programs in schools, for example, function in part to “indulge rich people in the belief that their college is open to all on the basis of merit.”
It goes on like this for another 13,000 words and thus has the strategic advantage of being impervious to a comprehensive rebuttal outside of a book. Stewart does make some valuable observations about entrenched interests, noxious rent-seekers, and the perils of empowering the state to pick economic winners and losers. Where his argument runs aground is his claim that meritocracy in America is an illusion. The notion, echoed by Stewart, that capitalism is a brutal zero-sum game in which true advancement is rendered unattainable by unseen forces is a foundational plank of the liberal American ethos. This is not new. Not new at all.
Much of Stewart’s thesis can be found in a 2004 report in The Economist, which alleges that the American upper middle class has created a set of “sticky” conditions that preserve its status and result in what Teddy Roosevelt warned could become an American version of a “hereditary aristocracy.” In 2013, the American economist Joseph Stiglitz warned that the American dream is dead and that the notion that the United States is a place of opportunity is a myth. “Since capitalism required losers, the myth of the melting pot was necessary to promote the belief in individual mobility through hard work and competition,” read a line from a 1973 edition of a handbook for teachers issued by the National Council for the Social Studies. The Southern Poverty Law Center, which for some reason produces a curriculum for teachers, has long recommended that educators advise students that poverty is a result of systemic factors and not individual choices. Even today, a cottage industry has arisen around the notion that Western largess is decadence, that meritocracy is a myth, and that arguments to the contrary are acts of subversion.
The belief that American meritocracy is a myth persists despite wildly dynamic conditions on the ground. As the Brookings Institution noted, 60 percent of employed black women in 1940 worked as household servants, compared with just 2.2 percent today. Between 1940 and 1970, “black men cut the income gap by about a third,” wrote Abigail and Stephan Thernstrom in 1998. The black professional class, ranging from doctors to university lecturers, exploded in the latter half of the 20th century, as did African-American home ownership and life expectancy rates. The African-American story is not unique. The average American income in 1990 was just $23,730 annually. Today, it’s $58,700—a figure that well outpaces inflation and that outstrips most of the developed world. The American middle class is doing just fine, but that experience has not come at the expense of Americans at or near the poverty line. As the economic recovery began to take hold in 2014, poverty rates declined precipitously across the board, though the effect was more keenly felt by minority groups, which recovered at faster rates than their white counterparts.
As National Review’s Max Bloom pointed out last year, 13 of the world’s top 25 universities and 21 of the world’s 50 largest universities are located in America. The United States attracts substantial foreign investment, inflating America’s much-misunderstood trade deficit. The influx of foreign immigrants and legal permanent residents streaming into America looking to take advantage of its meritocratic system rivals or exceeds immigration rates at the turn of the 20th century. You could be forgiven for concluding that American meritocracy is self-evident to all who have not been informed of the general liberal consensus. Indeed, according to an October 2016 essay in The Atlantic by Victor Tan Chen, the United States so “fetishizes” meritocracy that it has become “exhausting” and ultimately “harmful” to its “egalitarian ideals.”
Stewart is not wrong that there has been a notable decline in economic mobility in this decade. That condition is attributable to many factors, ranging from the collapse of the mortgage market to the erosion of the nuclear family among lower- to middle-class Americans (a charge supported by none-too-conservative venues like the New York Times and the Brookings Institution). But Stewart will surely rejoice in the discovery that downward economic mobility is alive and well among the upper class. National Review’s Kevin Williamson observed in March of this year that the Forbes billionaires list includes remarkably few heirs to old money. “According to the Bureau of Labor Statistics, inherited wealth accounts for about 15 percent of the assets of the wealthiest Americans,” he wrote. Moreover, that list is not static; it churns, and that churn reflects America’s economic dynamism. In 2017, for example, “hedge fund managers have been displaced over the last two years not only by technology billionaires but by a fish stick king, meat processor, vodka distiller, ice tea brewer and hair care products peddler.”
There is plenty to be said in favor of America’s efforts to achieve meritocracy, imperfect as those efforts may be. But few seem to be touting them, preferring instead to peddle the notion that the American ideal of success is a hollow simulacrum designed to fool citizens into toiling toward no discernible end. Stewart’s piece is a fine addition to a saturated marketplace in which consumers are desperate to reward purveyors of bad news. Here’s to his success.
Podcast: Donald Trump Jr. moves the ball forward.
We try, we really do try, to sort through the increasingly problematic “Russian collusion” narrative and establish a timeline of sorts—and figure out what’s real and what’s nonsense. Do we succeed? Give a listen.
An immigrant from Italy, Morais had taught himself English using the King James Bible. Few Americans spoke in this manner, and Abraham Lincoln was not yet among them. Three days later, the president himself reflected before an audience: “How long ago is it?—eighty-odd years—since on the Fourth of July for the first time in the history of the world a nation by its representatives assembled and declared as a self-evident truth that ‘all men are created equal.’” Not until several months later, at the dedication of the Gettysburg cemetery, would Lincoln refer to the birth of our nation in Morais’s manner, making “four score and seven years ago” one of the most famous phrases in the English language and thereby endowing his address with a prophetic tenor and scriptural quality.
This has led historians, including Jonathan Sarna and Marc Saperstein, to suggest that Lincoln may have read Morais’s sermon, which had been widely circulated. Whether or not this was so, the Gettysburg Address parallels Morais’s remarks in that it, too, joins mourning for the fallen with a recognition of American independence, allowing those who had died to define our appreciation for the day that our “forefathers brought forth a new nation conceived in liberty.” Lincoln’s words stressed that a nation must always link civic celebration of its independence with the lives given on its behalf. Visiting the cemetery at Gettysburg, he argued, requires us to dedicate ourselves to the unfinished work that “they who fought here have thus far so nobly advanced.” He went on: “From these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion,” thereby ensuring that “these dead shall not have died in vain.”
The literary link between Morais’s recalling of Jerusalem and Lincoln’s Gettysburg Address makes it all the more striking that it is the Jews of today’s Judea who make manifest the lessons of Lincoln’s words. Just as the Battle of Gettysburg concluded on July 3, the day before Independence Day, Israelis hold their Memorial Day commemorations on the day before their Independence Day celebrations. On the morning of the Fourth of Iyar, a siren sounds throughout the land, with all pausing their everyday activities in reverent memory of those who died. There are few more stunning images of Israel today than those of highways on which thousands of cars grind to a halt, all travelers standing at the roadside, and all heads bowing in commemoration. Throughout the day, cemeteries are visited by the family members of those lost. Only in the evening does the somber Yom Hazikaron give way to the joy of the Fifth of Iyar’s Yom Ha’atzmaut, Independence Day. For anyone who has experienced it, the two days define each other. Those assembled in Israel’s cemeteries facing the unbearable loss of loved ones do so in the knowledge that it is the sacrifice of their beloved family members that makes the next day’s celebration of independence possible. And the celebration of independence begins with the acknowledgement by millions of citizens that those who lie in those cemeteries, who gave their “last full measure of devotion,” obligate the living to ensure that the dead did not die in vain.
The American version of Memorial Day, like the Gettysburg Address itself, began as a means of decorating and honoring the graves of Civil War dead. It is unconnected to the Fourth of July, which takes place five weeks later. Both holidays are observed by many (though not all) Americans as escapes from work, and too few ponder the link between the sacrifice of American dead and the freedom that we the living enjoy. There is thus no denying that the Israelis’ insistence on linking their Independence Day celebration with their Memorial Day is not only more appropriate; it is more American, a truer fulfillment of Lincoln’s message at Gettysburg.
In studying the Hebrew calendar of 1776, I was struck by the fact that the original Fourth of July, like that of 1863, fell on the 17th of Tammuz. It is, perhaps, another reminder that Gettysburg and America’s birth must always be joined in our minds, and linked in our civic observance. It is, of course, beyond unlikely that Memorial Day will be moved to adjoin the Fourth of July. Yet that should not prevent us from learning from the Israeli example. Imagine if the third of July were dedicated to remembering the battle that concluded on that date. Imagine if “Gettysburg Day” involved a brief moment of commemoration by “us, the living” for those who gave the last full measure of devotion. Imagine if tens—perhaps hundreds—of millions of Americans paused in unison from their leisure activities for a minute or two to reflect on the sacrifice of generations past. Surely our observance of the Independence Day that followed could not fail to be affected; surely the Fourth of July would be marked in a manner more worthy of a great nation.