The cautionary tale of Battlestar Galactica.
“Either you are with me, or you are my enemy!” shouted a young Darth Vader in 2005’s Star Wars: Revenge of the Sith, one of the execrable prequels to the original films by George Lucas. In response to this all-or-nothing provocation, a disgusted Obi-Wan Kenobi replied, “Only a Sith deals in absolutes!”
Siths are Jedi Knights who have given themselves over to the Dark Side by embracing the evil emotions of anger, envy, and revenge. Readers of Commentary can be forgiven for neither knowing nor caring about this. But it is worth noting that for millions of Star Wars enthusiasts, it was very serious stuff indeed. Lucas revived, if not reinvented, the entire genre of science fiction in the 1970s by embracing bold and mythic depictions of good and evil and the heroic battle of the former against the latter. For decades, the established premise of the Star Wars franchise was that the universe is divided into the Dark Side and the Light Side of the “Force.” Jedi Knights—champions of all that is noble and virtuous—were warned never to give in, even a little, to the Dark Side, lest they lose their souls. If all that is not about “absolutes,” then what on earth (or in a galaxy far, far away) is? And Lucas threw it all away to get in a dig at George W. Bush.
His swipe at Bush’s famous iteration of the doctrine that would bear his name—“You are either with us or against us”—in a few seconds unraveled the entire moral superstructure of the Star Wars franchise. Such gratuitous political self-indulgence saturated the popular culture during the Bush years, in fare that had absolutely nothing to do with the policies of the White House.
In the two (awful) sequels to The Matrix, a science-fiction hit about humans being used as a fuel source by a world overtaken by machines, Bush is visually compared to Adolf Hitler. In the Pixar film Wall-E, the “global CEO” of an environmentally devastated Planet Earth apes Bush’s “stay the course” line. In X-Files: I Want to Believe, Bush and J. Edgar Hoover are paired. On television, Bush hatred or liberal antiwar paranoia suffused the NBC series Law and Order like a metastasizing cancer. The hospital show Grey’s Anatomy, the attorney show Boston Legal, the cop show Bones, and even the mother-daughter show Gilmore Girls included notable and needless instances, some playful and others less so, of what Charles Krauthammer dubbed Bush Derangement Syndrome.
In most of these cases, political asides can be shrugged off. Hollywood is a very liberal place, Bush and the war were indeed very unpopular, so expecting producers and actors to escape the temptation to get their shots in would be like expecting them to treat global warming with skepticism. Denouncing the ideological intrusion into the dialogue of Grey’s Anatomy as a corruption of artistic integrity offers such televised junk more respect than it deserves. After all, few can look upon Harold & Kumar Escape from Guantanamo Bay and wistfully ponder what might have been.
That is not the case with a cable-television series called Battlestar Galactica, a remarkable piece of work that nonetheless committed artistic and creative suicide when the political beliefs of its creator and writers intruded on the storytelling and eventually made a complete hash of their own show.
A remake of a campy 1970s science-fiction series made in the wake of the box-office receipts of the original Star Wars, the gritty, intelligent, and pensive Battlestar Galactica came as a startling surprise upon the premiere of the six-hour miniseries that began its run in 2003. The story line involves a futuristic human civilization spanning 12 planetary colonies. Robots (called Cylons) originally invented to serve as slaves evolve into sentient enemies bent on destroying their former masters. In the original series, the Cylons were depicted as fairly absurd tin men. In the new version, the evolved Cylons are human doppelgängers capable of infiltrating human society (the tin men, far more frightening this time, are still around but serve as shock troops). The doppelgängers are also essentially immortal—if one is killed, his or her consciousness is instantly transmitted into a new, identical body.
In the debut miniseries, we are introduced to a civilization very much like our own: open, decent, democratic. In fulfillment of a supposedly divine plan, the Cylons spread out among humanity’s 20 billion people, taking advantage of that openness and decency, as well as society’s boredom with military preparedness (memories of the last Cylon war have faded away). They orchestrate a 9/11 on a genocidal scale, murdering the vast majority of humanity in a perfectly timed nuclear cataclysm. An aging battlestar called Galactica—essentially a space-borne aircraft carrier poised to become a museum exhibit—narrowly escapes the Armageddon with a tiny ragtag convoy of humanity’s survivors. Outmatched, outgunned, and outstrategized, they must all try to survive against a foe that needs no rest and has no conscience.
These premises gave Battlestar Galactica an ideal foundation to play off the headlines of the day. Indeed, as Newsweek’s Joshua Alston noted in December 2008, Battlestar Galactica captured “better than any other TV drama of the past eight years the fear, uncertainty and moral ambiguity of the post-9/11 world.” The tensions between security and freedom, between civilian and military leadership, and between healthy fear and debilitating phobia were explored brilliantly. The series won Program of the Year from the Television Critics Association, as well as numerous other awards. Time hailed it as the best thing on television in 2005, and the series earned a ranking in its top 100 TV shows of all time. From National Review to Rolling Stone, the series was justifiably hailed for its gritty realism, superb acting, and deft direction.
Originally, the series was very difficult to pigeonhole ideologically. An avid student of martial culture, Ron Moore, its guiding creative hand, treated the military with deep respect. William Adama, Galactica’s commander, is not a coffeehouse philosophe indulging his cosmopolitan sensibilities (the way Patrick Stewart’s Jean-Luc Picard often did in the second iteration of the Star Trek franchise in the 1980s), but a gruff and stalwart leader. Laura Roslin (played by Mary McDonnell) is a saccharine liberal do-gooder accidentally thrust into the position of president who achieves a flinty toughness—and makes an unexpected ideological journey of her own when she decides that abortion cannot be tolerated with the human population reduced to a mere 50,000 souls.
Inevitably and justifiably, the show dealt with various “enemy within” themes, but unlike countless rehashes of The Crucible, Battlestar Galactica conceded that there actually was an enemy within. The enemy was very real, literally an existential foe guilty of murdering 20 billion people, not just the hobgoblin of alleged McCarthyite paranoia. Peace activists are depicted, at times, as deluded, dangerous, and even vaguely traitorous, giving the impression that at least some of the writers were familiar with Orwell’s writings on wartime pacifists. And the frightening nature of the relentless suicide-bomber-attack machine was indelibly captured by the sensational concept that any Cylon killed in battle could simply be resurrected to fight another day.
Though the show received raves from writers and critics associated with the Right, Battlestar Galactica was in no way a conservative document. Numerous subplots were congenial to liberal sensibilities, as when President Roslin’s breast cancer is cured with embryonic stem cells. But hawkish arguments and assumptions were portrayed with integrity. The regrettable trade-offs implicit in any war, particularly a war to prevent total extinction, were treated as real.
The original miniseries was written and filmed in 2002, when the war on terror was a nearly universal cause. The show’s first season was written and filmed in 2003, and the second in 2004. When it came time to make the third season, in 2005, the war on terror had become old hat, and the war in Iraq had become a grinding controversy. Moore and his colleagues felt compelled to move on from their analogical portrait of the war on terror to the occupation of Iraq—a decision that upended the direction the show had been heading over the previous 32 hours and that led inexorably to its self-destruction.
The third season opens with most of humanity—exhausted by war, deprivation, and internal divisions—settling on a bleak, barely habitable planet. Suddenly the Cylons, after annihilating all but 0.00025 percent of humanity, decide they want to live in peace. But rather than leave humans alone, they conclude the best way to achieve this goal would be to invade this last tiny outpost of humanity and forcibly convert them to the one true god (in the series, the Cylons are monotheists, while the humans are polytheists) . . . or something.
The truth is that the audience was never given a remotely decipherable, never mind plausible, explanation for this radically bizarre and nonsensical turn of events. Rather, it was simply asserted in a hodgepodge of babbling dialogue. Almost immediately, the show’s protagonists are transformed into “insurgents” who have little or no compunction about becoming suicide bombers. The Cylons, for their part, are finding the human colony very troublesome. In one particularly ham-fisted scene, one of the Cylon leaders mocks his colleagues: “How did you think the humans would greet us? With—‘Oh, never mind’?” This is, of course, a naked reference to the idea, expressed before the American invasion, that the war in Iraq would be a “cakewalk.”
Most egregiously, the human suicide bombers are not young men brainwashed in a madrassa and promised eternal life with 72 virgins, nor are they threatened with the murder of their families—the tactics used by jihadists to create their human bombs. Rather, they are decent, calm, and composed men and women fighting in a noble cause. Taken seriously, this romanticization of suicide bombers and “insurgency” has a cascade of revolting implications. The insurgency in Iraq was not an authentic resistance like the Warsaw Ghetto uprising or De Gaulle’s Free French forces. The ranks of terrorists in Iraq were overwhelmingly made up of Baathist remnants of the Hussein regime and al-Qaeda interlopers with their own imperialist ambitions for a worldwide umma.
The extent of the show’s political and ideological corruption is best exemplified by the fact that one of the central pillars of the series had to be yanked: the notion that the Cylons had a grand, complex, conspiratorial plan involving their human doppelgängers that was unfolding inexorably over the course of the show’s run, one that humans needed to uncover in order to secure a victory in the war for the survival of their species. Indeed, every episode of the first three seasons began with an opening sequence in which the viewer is explicitly told that the Cylons “have a plan.” But in the third season, a Cylon leader explains that “plans change,” whereupon the Cylon quest to exterminate the human race simply evaporates so the show can riff on the evils of “occupation.” By the premiere of the fourth season, the Cylon plan was no longer mentioned during the opening credits. And every other seed of plot that had been planted over the previous years was left untended and forgotten as well.
Thus, a show marked by gritty realism about how a decent but flawed civilization modeled on our own tries to cling to its decency while fighting an existential war against an implacable enemy veered wildly off course. The humans were no longer analogized to Americans; rather Americans were analogized to genocidal occupiers. In other words, we are no longer the inspiration for the futuristic Israelites trying to survive. We are now the Nazis.
With this turnabout, left-wing writers suddenly fell in love with the show. Battlestar Galactica had “morphed into a stinging allegorical critique of America’s three-year occupation of Iraq,” cheered a writer in the liberal American Prospect. Spencer Ackerman, then an employee of the New Republic, wrote a piece for Slate titled “Battlestar: Iraqtica—Does the hit television show support the Iraqi insurgency?” His unequivocal conclusion: “In unmistakable terms, Battlestar: Galactica is telling viewers that insurgency (like, say, the one in Iraq) might have some moral flaws, such as the whole suicide bombing thing, but is ultimately virtuous and worthy of support. Wow.” That “wow” is celebratory.
After the Iraq story line, Battlestar Galactica deteriorated rapidly over the course of its final two seasons. The plot shift led the show’s writers and producers into a bizarre and meandering world of visiting angels, pseudo-scientific mumbo-jumbo, and deus-ex-machina literary devices. Human and Cylon fell in love; robots killed themselves; a key character’s death and resurrection were never explained; and in the end it turned out that everything we were watching had led to the peopling of our Earth 150,000 years ago and that we were heading in a similar direction because we have some robots now too. The disappointment among the show’s fans was palpable, and its final episode provoked widespread rage—there is no other word for it—among those who had followed the series passionately for the previous five years and felt they had been tricked by its conclusion.
No doubt the producers believe it was all worth it. For having the “bravery” to tackle the occupation of Iraq, the producers and lead actors were invited to a panel at the United Nations to dilate on the war on terror. It is hard to imagine that would have happened if the series had held to its original course.
Ron Moore told Salon in 2007 that “the show’s mission is not to present answers to what I think are really complicated, difficult questions. One of the mistakes TV often makes is that it tries to tackle complicated moral and legal issues and wrap them up in an hour and give you a neat, tidy message by the end: ‘And here’s the way to solve Iraq!’ I don’t think that’s helpful, and I don’t think that’s good storytelling or great to watch. Our mission is more about asking questions, asking the audience to think about things, to think about uncomfortable things, to question their own assumptions.”
It’s been said that the difference between the truth and fiction is that fiction has to make sense. After its third season, Battlestar Galactica steadily failed on both counts.
These failures are attributable not just to the allure of ideology and the desire to stay “relevant” but also to Moore’s fraudulent notion that merely “asking questions” isn’t itself a form of ideological commitment. Indeed, propaganda is often posed in the form of invidious questions. A merely loaded question—have you stopped beating your wife yet?—is one thing. An invidious question is one in which evil fictions are given parity with truth: “I’m not saying the Holocaust didn’t happen, I’m just raising important questions.”
Joshua Alston’s conclusion that Battlestar Galactica best captures the fear, uncertainty, and ambiguity of the post-9/11 world still holds up, but with a thick layer of irony. For the series’s story arc demonstrates that Moore and Company were not immune to the pressures of the post-9/11 world; indeed, it reveals that they could not handle those pressures.
Must-Reads from Magazine
Good intentions, tragic consequences.
Chicago, Illinois — Andy has little time to chitchat. There are hundreds of hot towels to sort and fold, and when that’s done, there are yet more to wash and dry. The 41-year-old is one of half a dozen laundry-room workers at Misericordia, a community for people with disabilities in the Windy City. He and his colleagues, all of whom are intellectually disabled and reside on the Misericordia “campus,” know that their work has purpose, and they delight in each task and every busy hour.
In addition to his job at the laundry room, Andy holds two others. “For two days I work at Sacred Heart”—a nearby Catholic school—“and at Target. Target is a store, a big super-store. At Sacred Heart, I sweep floors and tables.”
“Ah, so you’re the janitor there?” I follow up.
“No, no! I just clean. I love working there.”
Andy’s packed schedule is typical for the higher-functioning residents at Misericordia, many of whom juggle multiple jobs. Their work at Misericordia helps meet real community needs—laundry, recycling, gardening, cooking, baking, and so on—while preparing residents for the private labor market. Andy has already found competitive employment (at Target), but many others rely on Misericordia’s own programs to stay active and employed.
Yet if progressive lawmakers and minimum-wage crusaders have their way, many of these opportunities will disappear, along with the Depression-era law that makes them possible.
The law, Section 14(c) of the Fair Labor Standards Act, permits employers to pay people with disabilities a specialized wage based on their ability to perform various jobs. It thus encourages the hiring of the disabled while ensuring that they are paid a wage commensurate with their productivity. The law safeguards against abuse by, among other things, requiring employers to regularly review and adjust wages as disabled employees make productivity gains. Many of these employers are nonprofit entities that exist solely to provide meaningful work for the disabled.
Only 20 percent of Americans with disabilities participate in the labor force. The share is even smaller among those with intellectual and developmental disabilities. For this group, work isn’t mainly about money—most of the Misericordia residents are oblivious to how much they get paid—so much as it is about purpose and community. What the disabled seek from work is “the feeling of safety, the opportunity to work alongside friends, and an atmosphere of kindness and understanding,” says Scott Mendel, chairman of Together for Choice, which campaigns for freedom of choice for the disabled and their families. (Mendel’s daughter, who has cerebral palsy, lives and works at Misericordia.)
Abstract principles of economic justice, divorced from economic realities and the lived experience of people with disabilities, are a recipe for disaster in this area. Yet that’s the approach taken by too many progressives these days.
Last month, for example, seven Senate progressives led by Elizabeth Warren of Massachusetts wrote a letter to Labor Secretary Alexander Acosta denouncing Section 14(c) for setting “low expectations for workers with disabilities” and relegating them to “second-class” status. The senators also took issue with so-called sheltered workshops, like those at Misericordia, which are specifically designed to help the disabled find pathways to market employment. Activists at the state level, meanwhile, continue to press for the abolition of such programs, and they have already succeeded in restricting or limiting them in a number of jurisdictions, most notably in Pennsylvania, where such settings have been all but eliminated.
While there have been a few notorious cases of 14(c) and sheltered-workshop abuse over the years, existing law provides mechanisms for punishing firms for misconduct. Getting rid of 14(c) and sheltered workshops, however, could leave hundreds of thousands of disabled people unemployed. Activists have yet to explain what it is they expect these newly jobless to do with their time.
Competitive employment simply isn’t an option for many of the most disabled. And even those like Andy, who are employed in the private economy, tend to work at most 20 hours a week at their competitive jobs. What would they do with the rest of their time if sheltered workshops didn’t exist? Most likely, they would “veg out” in front of a television. Squeezing out 14(c) programs and forcing private employers to pay the minimum wage to workers whose productivity falls far short of the norm wouldn’t improve the lot of the disabled; it would leave them jobless.
Economic reality is no less real for the disabled.
Nor have progressives accounted for the effects on the lives of the disabled in jurisdictions that have restricted sheltered workshops. “None of these states have done an adequate job of ascertaining whether these actions actually enhanced the quality of life for the individuals affected,” a study in the Social Improvement Journal concluded last year. Less time in sheltered workshops, the study found, “was not replaced with a corollary increase in the use of more integrated forms of employment.” Rather, “these individuals were essentially unemployed, engaging in made-up day activities.”
Make-work is not what Andy and his colleagues are up to today at Misericordia. They complete real tasks, which benefit their fellow residents in concrete ways. “This work is training, but it also gives them meaning,” one Misericordia director told me. “It’s not just doing meaningless work, but it’s going toward something. We’re not setting them up to do something that someone else takes apart. This is something that’s needed.” Yet, in the name of economic justice, progressives are on the verge of depriving men and women like Andy of the dignity of work and the freedom of choice that non-disabled Americans take for granted.
Reminding voters what Democratic governance means.
To paraphrase New York Times columnist Ross Douthat (with apologies), the less Republicans do in office, the more popular they generally become. That is, when the GOP exists solely in voters’ minds as a bulwark against cultural and political liberalism, it can cobble together a winning coalition. Likewise, Democrats regain the national trust when they serve only as an obstacle to Republican objectives. It’s when both parties begin to talk about what they want to do with their power that they get into trouble.
That is an over-simplification, but the core thesis is an astute one. In an age of negative partisanship and without an acute foreign or domestic crisis to focus the national mind, it’s not unreasonable to presume that both parties’ chief value is defined in negative terms by the public. Considering how little of the national dialogue has to do with policy these days, general principles and heuristics are probably how most marginal voters navigate the political environment.
Somewhere along the way, though, Democrats managed to convince themselves that they cannot just be the anti-Donald Trump party. Their most influential members have become convinced that the party needs to articulate a positive agenda beyond a set of vague principles. For the moment, Democrats who merely want to present themselves as unobjectionable alternatives to Trumpism without going into much broader detail appear to be losing the argument.
According to a study of campaign-season advertisements released on Friday by the USA Today Network and conducted by Kantar Media’s Campaign Marketing Analysis Group, Democrats are not leaning into their opposition to Trump. While over 44,000 pro-Trump advertisements from Republican candidates have aired on local broadcast networks, only about 20,000 Democratic ads have highlighted a candidate’s anti-Trump bona fides. “Trump has been mentioned in 27 [percent] of Democratic ads for Congress, overwhelmingly in a negative light,” the study revealed. In the same period during the 2014 midterm election cycle, by contrast, 60 percent of Republican advertisements featured President Barack Obama in a negative light.
There are plenty of caveats that should prevent observers from drawing too many broad conclusions about what this means. First, comparing the political environment in 2018 to that of 2014 is comparing apples and oranges. Recall that 2014 was Barack Obama’s second midterm election, so naturally enthusiasm among the incumbent party’s base to rally to the president’s defense wanes while the “out-party’s” anxiety over the incumbent president grows. If Donald Trump’s job-approval rating is still anemic in September, it is reasonable to expect that Republican candidates will soft-pedal their support for the president just as Democrats did in 2010. Second, Democrats running against Democrats in a Democratic primary race may not feel the need to emphasize their opposition to the president, since that doesn’t create a stark enough contrast with their opponent.
And yet, the net effect of the primary season is the same. Democrats aren’t just informing voters of their opposition to how Trump and the Republican Party have managed the nation’s affairs; they’re describing what they would do differently. By and large, the Democratic Party’s agenda consists of “doubling” spending on social-welfare programs, education, and infrastructure, and promising a series of five-year-plan prestige projects. But Democratic candidates are also leaning heavily into divisive social issues.
The themes that Democratic ads have embraced so far range from support for new gun-control measures (“f*** the NRA,” was one New Mexico candidate’s message), to protecting public funding for Planned Parenthood, to promoting support for same-sex marriage rights, to attacking Sinclair Broadcasting (which happened to own the network on which that particular ad ran). A number of Democratic candidates are running on their support for a single-payer health-care system, including the progressive candidate in Nebraska’s GOP-leaning 2nd Congressional District who narrowly defeated an establishment-backed former House member this week, putting that seat farther out of the reach of Democrats in November.
In the end, messages like these animate the Democratic Party’s progressive base, but they have the potential to alienate swing voters. That may not be enough to overcome the electorate’s tendency to reward the “out-party” in a president’s first midterm election. And yet, the risk Democrats run by being specific about what they actually want to do with renewed political power cannot be dismissed. Democrats in the activist base are convinced that embracing conflict-ridden identity politics is a moral imperative, and the party’s establishmentarian leaders appear to believe that being anti-Trump is not enough to ensure the party’s success in November. All the while, the Democratic Party’s position in the polls continues to deteriorate.
Meritocracy is in the eye of the beholder.
A running theme in Jonah Goldberg’s fantastic new book, Suicide of the West, is the extent to which those who were bequeathed the blessings associated with classically liberal capitalist models of governance are cursed with crippling insecurity. Western economic and political advancement has followed a consistently upward trajectory, albeit in fits and starts. Yet, the chief beneficiaries of this unprecedented prosperity seem unaware of that fact. In boom or bust, the verdict of many in the prosperous West remains the same: the capitalist model is flawed and failing.
Capitalism’s detractors are as likely to denounce the exploitative nature of free markets during a downturn as they are to lament the displacement and disorientation that follows when the economy roars. The bottom line is static; only the emphasis changes. Though this tendency is a bipartisan one, capitalism’s skeptics are still more at home on the left. With the lingering effects of the Great Recession all but behind us, the liberal argument against capitalism’s excesses has shifted from mitigating the effects on low-skilled workers to warnings about the pernicious effects of prosperity.
Matthew Stewart’s expansive piece in The Atlantic this month is a valuable addition to the genre. In it, Stewart attacks the rise of a permanent aristocracy resulting from the plague of “income inequality,” but his argument is not a recitation of the Democratic Party’s 2012 election themes. It isn’t just the mythic “1 percent” (or, in the author’s estimation, the “top 0.1 percent”) but the top 9.9 percent that has not only accrued unearned benefits from capitalist society but has fixed the system to ensure that those benefits are hereditary.
Stewart laments the rise of a new Gilded Age in America, which is anecdotally exemplified by his own comfort and prosperity—a spoil he appears to view as plunder stolen from the blue-collar service providers he regularly patronizes. You see, he is a member of a new aristocracy, which leverages its economic and social capital to wall itself off from the rest of the world and preserve its influence. He and those like him have “mastered the old trick of consolidating wealth and passing privilege along at the expense of other people’s children.” This corruption and Stewart’s insecurity are, he contends, products of consumerism. “The traditional story of economic growth in America has been one of arriving, building, inviting friends, and building some more,” Stewart wrote. “The story we’re writing looks more like one of slamming doors shut behind us and slowly suffocating under a mass of commercial-grade kitchen appliances.”
Though he diverges from the kind of scientistic Marxism reanimated by Thomas Piketty, Stewart nevertheless appeals to some familiar Soviet-style dialectical materialism. “Inequality necessarily entrenches itself through other, nonfinancial, intrinsically invidious forms of wealth and power,” he wrote. “We use these other forms of capital to project our advantages into life itself.” In this way, Stewart can have it all. The privilege enjoyed by the aristocracy is a symptom of Western capitalism’s sickness, but so, too, are the advantages bestowed on the underprivileged. Affirmative action programs in schools, for example, function in part to “indulge rich people in the belief that their college is open to all on the basis of merit.”
It goes on like this for another 13,000 words and, thus, has the strategic advantage of being impervious to a comprehensive rebuttal outside of a book. Stewart does make some valuable observations about entrenched interests, noxious rent-seekers, and the perils of empowering the state to pick economic winners and losers. Where his argument runs aground is in his claim that meritocracy in America is an illusion. Capitalism, he says, is a brutal zero-sum game in which true advancement is rendered unattainable by unseen forces. That belief is a foundational plank of the liberal American ethos. This is not new. Not new at all.
Much of Stewart’s thesis can be found in a 2004 report in The Economist, which alleges that the American upper middle class has created a set of “sticky” conditions that preserve its status and result in what Teddy Roosevelt warned could become an American version of a “hereditary aristocracy.” In 2013, the American economist Joseph Stiglitz warned that the American dream is dead and that the notion that the United States is a place of opportunity is a myth. “Since capitalism required losers, the myth of the melting pot was necessary to promote the belief in individual mobility through hard work and competition,” read a line from a 1973 edition of a National Council for the Social Studies-issued handbook for teachers. The Southern Poverty Law Center, which for some reason produces a curriculum for teachers, has long recommended that educators advise students that poverty is a result of systemic factors and not individual choices. Even today, a cottage industry has arisen around the notion that Western largess is decadence, that meritocracy is a myth, and that arguments to the contrary are acts of subversion.
The belief that American meritocracy is a myth persists despite wildly dynamic conditions on the ground. As the Brookings Institution noted, 60 percent of employed black women in 1940 worked as household servants, compared with just 2.2 percent today. Between 1940 and 1970, “black men cut the income gap by about a third,” wrote Abigail and Stephan Thernstrom in 1998. The black professional class, ranging from doctors to university lecturers, exploded in the latter half of the 20th century, as did African-American home ownership and life expectancy rates. The African-American story is not unique. The average American income in 1990 was just $23,730 annually. Today, it’s $58,700—a figure that well outpaces inflation and that outstrips most of the developed world. The American middle class is doing just fine, but that experience has not come at the expense of Americans at or near the poverty line. As the economic recovery began to take hold in 2014, poverty rates declined precipitously across the board, though that effect was more keenly felt by minority groups, which recovered at faster rates than their white counterparts.
As National Review’s Max Bloom pointed out last year, 13 of the world’s top 25 universities and 21 of the world’s 50 largest universities are located in America. The United States attracts substantial foreign investment, inflating America’s much-misunderstood trade deficit. The influx of foreign immigrants and legal permanent residents streaming into America looking to take advantage of its meritocratic system rivals or exceeds immigration rates at the turn of the 20th Century. You could be forgiven for concluding that American meritocracy is self-evident to all who have not been informed of the general liberal consensus. Indeed, according to an October 2016 essay in The Atlantic by Victor Tan Chen, the United States so “fetishizes” meritocracy that it has become “exhausting” and ultimately “harmful” to its “egalitarian ideals.”
Stewart is not wrong that there has been a notable decline in economic mobility in this decade. That condition is attributable to many factors, ranging from the collapse of the mortgage market to the erosion of the nuclear family among lower- to middle-class Americans (a charge supported by none-too-conservative venues like the New York Times and the Brookings Institution). But Mr. Stewart will surely rejoice in the discovery that downward economic mobility is alive and well among the upper class. National Review’s Kevin Williamson observed in March of this year that the Forbes billionaires list includes remarkably few heirs to old money. “According to the Bureau of Labor Statistics, inherited wealth accounts for about 15 percent of the assets of the wealthiest Americans,” he wrote. Moreover, that list is not static; it churns, and that churn is reflective of America’s economic dynamism. In 2017, for example, “hedge fund managers have been displaced over the last two years not only by technology billionaires but by a fish stick king, meat processor, vodka distiller, ice tea brewer and hair care products peddler.”
There is plenty to be said in favor of America’s efforts to achieve meritocracy, imperfect as those efforts may be. But so few seem to be touting them, preferring instead to peddle the idea that the ideal of success in America is a hollow simulacrum designed to fool its citizens into toiling toward no discernable end. Stewart’s piece is a fine addition to a saturated marketplace in which consumers are desperate to reward purveyors of bad news. Here’s to his success.
Podcast: Donald Trump Jr. moves the ball forward.
We try, we really do try, to sort through the increasingly problematic “Russian collusion” narrative and establish a timeline of sorts—and figure out what’s real and what’s nonsense. Do we succeed? Give a listen.
Don’t forget to subscribe to our podcast on iTunes.
An immigrant from Italy, Morais had taught himself English utilizing the King James Bible. Few Americans spoke in this manner; at first, not even Abraham Lincoln did. Three days later, the president himself reflected before an audience: “How long ago is it?—eighty-odd years—since on the Fourth of July for the first time in the history of the world a nation by its representatives assembled and declared as a self-evident truth that ‘all men are created equal.’” Only several months later, at the dedication of the Gettysburg cemetery, would Lincoln refer to the birth of our nation in Morais’s manner, making “four score and seven years ago” one of the most famous phrases in the English language and thereby endowing his address with a prophetic tenor and scriptural quality.
This has led historians, including Jonathan Sarna and Marc Saperstein, to suggest that Lincoln may have read Morais’s sermon, which had been widely circulated. Whether or not this was so, the Gettysburg address parallels Morais’s remarks in that it, too, joins mourning for the fallen with a recognition of American independence, allowing those who had died to define our appreciation for the day that our “forefathers brought forth a new nation conceived in liberty.” Lincoln’s words stressed that a nation must always link civic celebration of its independence with the lives given on its behalf. Visiting the cemetery at Gettysburg, he argued, requires us to dedicate ourselves to the unfinished work that “they who fought here have thus far so nobly advanced.” He went on: “From these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion,” thereby ensuring that “these dead shall not have died in vain.”
The literary link between Morais’s recalling of Jerusalem and Lincoln’s Gettysburg Address makes it all the more striking that it is the Jews of today’s Judea who make manifest the lessons of Lincoln’s words. Just as the Battle of Gettysburg concluded on July 3, Israelis hold their Memorial Day commemorations on the day before their Independence Day celebrations. On the morning of the Fourth of Iyar, a siren sounds throughout the land, with all pausing their everyday activities in reverent memory of those who have died. There are few more stunning images of Israel today than those of highways on which thousands of cars grind to a halt, all travelers standing at the roadside, and all heads bowing in commemoration. Throughout the day, cemeteries are visited by the family members of those lost. Only in the evening does the somber Yom Hazikaron give way to the joy of the Fifth of Iyar’s Yom Ha’atzmaut, Independence Day. For anyone who has experienced it, the two days define each other. Those assembled in Israel’s cemeteries facing the unbearable loss of loved ones do so in the knowledge that it is the sacrifice of their beloved family members that makes the next day’s celebration of independence possible. And the celebration of independence is begun with the acknowledgement by millions of citizens that those who lie in those cemeteries, who gave “their last full measure of devotion,” obligate the living to ensure that the dead did not die in vain.
The American version of Memorial Day, like the Gettysburg Address itself, began as a means of decorating and honoring the graves of Civil War dead. It is unconnected to the Fourth of July, which takes place five weeks later. Both holidays are observed by many (though not all) Americans as escapes from work, and too few ponder the link between the sacrifice of American dead and the freedom that we the living enjoy. There is thus no denying that the Israelis’ insistence on linking their Independence Day celebration with their Memorial Day is not only more appropriate; it is more American, a truer fulfillment of Lincoln’s message at Gettysburg.
In studying the Hebrew calendar of 1776, I was struck by the fact that the original Fourth of July, like that of 1863, fell on the 17th of Tammuz. It is, perhaps, another reminder that Gettysburg and America’s birth must always be joined in our minds, and linked in our civic observance. It is, of course, beyond unlikely that Memorial Day will be moved to adjoin the Fourth of July. Yet that should not prevent us from learning from the Israeli example. Imagine if the third of July were dedicated to remembering the battle that concluded on that date. Imagine if “Gettysburg Day” involved a brief moment of commemoration by “us, the living” for those who gave the last full measure of devotion. Imagine if tens—perhaps hundreds—of millions of Americans paused in unison from their leisure activities for a minute or two to reflect on the sacrifice of generations past. Surely our observance of the Independence Day that followed could not fail to be affected; surely the Fourth of July would be marked in a manner more worthy of a great nation.