It is by now taken for granted that the nation is about to turn to the Left in domestic policy. “Reaganism is finished, bankrupt, used up, over,” wrote Arthur M. Schlesinger, Jr., in the Washington Post this past May. At the time he foresaw a squeaker in the November presidential election and then a liberal phase of the political cycle coming “full flood” in the early 1990’s.
Schlesinger is one of many saying roughly the same thing, although not all liberals share his anti-Reagan animus in so acute a form. The liberal conventional wisdom, as I read it, goes something like this: perhaps it is just as well that Reagan came along. We needed a breather to recognize some of the mistakes we made by the overzealous prosecution of liberal social policy in the 60’s and 70’s. But now something needs to be done about education, drugs, homelessness, children in poverty, and so forth, and that “something” must take the form of new federal programs. The American people are once again ready for “affirmative government.” And we are wiser now. This time, under Democratic leadership, we will get it right.
Indeed, the urge to try again (albeit with somewhat different nostrums) is as strong among many Republicans as it is among Democrats, which means that some sort of son-of-the-Great-Society seems very likely even if George Bush wins the presidency.
In either case, however, what we will be observing is far from a return to the way things were. Rather, a kind of simultaneous movement seems to be taking place within the body politic, and specifically within the political attitudes of American liberals. It is as if one geological plate, near the surface, were moving in one direction, while beneath it another and much larger plate, with much greater momentum, were moving in a different direction. I hesitate to use the terms “liberal” and “conservative” to describe the two plates: the direction of the upper one is unmistakably to the Left, just as Schlesinger says, but the direction of the other is not to the traditional Right. It is moving off at another angle altogether.
The consequences of this divergence can be specified with some precision. Although we may well be seeing a traditional liberal cycle in policies that affect middle-class and upper-class interests—education, the environment, industrial regulation, and the like—policies aimed at the black underclass in the inner city are likely to reflect a different, and much more ominous, set of dynamics. In my judgment these policies will turn out to be custodial in nature, and their effect will be to make the underclass into wards of the state.
Let me put forward some theses, each of which is based on current or incipient realities, each of which seems to me to have a high probability of coming true in the next decade, and all of which together support my prediction of the coming of custodial democracy.
First, the baby-boom generation is moving into its forties. The leading edge of this age cohort turned forty last year, and the rest will be doing so through the first half of the 90’s.
This simple statement of fact has pervasive implications. The baby-boomers, going through personal changes that tend to occur universally from the late thirties to the early forties, are coming out the other side more acutely aware of their own mortality, more attached to family and community, more reconciled to the world as it is and less optimistic about making it better.
Will this help the Republicans? In the past, the electorate has tended to vote more conservatively with age. Although the landslide election of 1984 was something of an exception—voters under thirty voted for Reagan in almost the same proportions as those forty-five and older—in 1980, with a closer election, the contrast was striking: 44 percent of voters under thirty went for Reagan, compared with 56 percent of those forty-five and older. In 1976, in a very close election, Gerald Ford lost among voters under thirty (48 percent) but won among those forty-five and older (53 percent). If this pattern holds, the Republicans have some grounds for optimism, since the number of voters aged thirty-five and older is rising (from two out of three in the 1980 presidential election to about three out of four in the late 1990’s).
These facts notwithstanding, however, it is unlikely that formerly liberal baby-boomers will rush to register as Republicans, or even that they will regard themselves as having grown more conservative. As Ben Wattenberg has observed, people tend to be “imprinted” early in their voting lives; that being so, baby-boomers who were liberal in their twenties are likely to go on identifying themselves with liberalism in their forties and fifties.
But party politics are in any case not the issue here. Whether they vote Democratic or Republican, the passage into middle age will cause the baby-boomers’ liberalism to be filtered through an altered sensibility. Indeed, it is precisely because the baby-boomers are likely to continue thinking of themselves as liberals that they will, I believe, be led to support custodial policies, confident that this is the liberal, “compassionate” thing to do.
If history is any guide, the nation itself will go through a midlife crisis along with the baby-boomers. This age cohort has repeatedly demonstrated a critical-mass effect. Just as, in the 1960’s, the radicalism of the baby-boomers, then in their teens and twenties, became the national fashion, and just as, in the 1970’s, they made the upwardly-mobile acquisitiveness of their thirties a national watchword, so their preoccupation with the novel perspectives of middle age as they reach their forties is also likely to become the rage. Indeed, it has already begun to do so, as is seen in the rash of recent articles on the decline of yuppieism. The yuppies are simply outgrowing that phase of life, as they were bound to do; and as they shift their attention elsewhere, so does and will the nation.
The aging of the baby-boomers will have its own, “conservatizing” effect on social policy. But it is not happening in a vacuum. With regard to policy affecting the inner-city poor, it is coinciding with a variety of other trends.
As I indicated above, there is no doubt that in the next years new social programs will be tried, whether enthusiastically by a President Dukakis or doggedly by a President Bush. There can also be little question, however, that these programs will fail to affect appreciably the size, or the misery, of the underclass.
Take the new welfare-reform law, recently adopted by Congress amid much fanfare over its promise to lead us at last out of the trap of welfare dependency by imposing stricter work expectations on recipients of federal aid. In fact, the new law does not do anything more than a variety of state-enacted work programs have already tried to do, with only modest and scattered success. No one has proposed a jobs program that will sidestep or overcome the problems encountered in such previous failures as CETA. No one is proposing programs for delinquents or drug addicts or the homeless or any similarly afflicted group that are different from earlier ones.
The truth about our federal social programs is unpalatable, and the attempts to wiggle away from it have been many and resourceful. But here it is, in the words of Peter Rossi, one of our most highly respected experts on this subject:
A review of the history of the last two decades of efforts to evaluate major social programs in the United States sustains the proposition that over this period the American establishment of policy-makers, agency officials, professionals, and social scientists did not know how to design and implement social programs that were minimally effective, let alone spectacularly so.
This is not the same thing as saying that nothing works. Local programs, especially if they are run by the people who had the idea for them in the first place, do sometimes work. Programs to help those who are helping themselves—the displaced worker with a history of employment, the drug-abuser who is motivated to quit, the dropout who is motivated to study—do sometimes work.
But two categories of programs have failed conspicuously and consistently: large federal programs intended to change behavior (as opposed to dealing out commodities or cash), and any program, small or large, local or federal, trying in particular to change the behavior of a clientele that is not already socialized into norms of working-class and middle-class society. Federal programs to help the underclass fall into both categories. We have spent vast amounts of money on these programs over the past decades, but no matter how much we spend, we still do not know how—let me repeat and emphasize these words, we do not know how—to change the behavior of significant proportions of the urban underclass through social engineering.
Now we have a new welfare-reform law, and there will undoubtedly be more to come. In another five or ten years, the new wave of programs to help the underclass escape its condition will have been tried and evaluated in its turn. One can plausibly predict the result: programs aimed at welfare mothers will occasionally show small positive effects—so small, however, that the aggregate numbers of chronic welfare recipients will not have changed. And as for the most important programs, those aimed at the men of the underclass, they will show either no effects at all, or negative ones.
My conclusion is that these results will first be accompanied by a sense of disillusionment and then by a search for scapegoats. Liberals will still be unwilling to accept that the social engineering itself has expanded the underclass. Increasingly, the scapegoats will be members of the underclass themselves: “You just can’t do anything with those people.”
Among the considerations leading liberals to that position will be the vanishing plausibility of a longstanding shibboleth of social reform, namely, the belief that the problem of unemployment among the underclass has to do mainly with a lack of jobs.
According to this argument, chronic unemployment among the members of the underclass is predominantly “structural” in nature. That is, there are not enough jobs to begin with, and those that exist require qualifications which members of the underclass do not have, or are located in places where they do not live, or offer such dispiritingly low wages that they are not worth the effort. Therefore, the argument concludes, since the root of the problem is the lack of jobs, the provision of jobs is its ultimate solution.1
A natural test of this hypothesis, which amounts to a conventional wisdom of American liberalism, has been under way during the latest economic recovery, the longest peacetime expansion in our history. For two, three, and four years now, some major American cities have been experiencing a full-employment labor market. There is reason to believe, moreover, that those markets will remain tight over the next several years as (because of demographic factors) the number of new job entrants drops.
What are the results of this natural experiment? Although we do not yet have detailed studies of the effects of the economic recovery on the urban poor, the aggregate data from the Bureau of Labor Statistics tell a two-part story.
From 1982 to 1987, while overall white unemployment was dropping from 8.6 percent to 5.3 percent, black unemployment fell from 18.9 percent to 13.0 percent. This was a major improvement, even though the news is obviously not all good: black rates of unemployment still remain very high by white norms, and the black-white ratio of unemployment has continued to rise even during the recovery, as it has tended to do now for more than thirty years, in good times and bad. But there is no question that blacks looking for jobs have done better during the recovery. In the period 1982-87, in all age groups and for both sexes, blacks in the labor market have made large gains.
The catch lies in the phrase, “in the labor market.” For the second part of the story is that during the same five-year period when unemployment dropped so dramatically, the percentage of black adults not in the labor market changed very little. The only consistent reductions, and they were modest ones, occurred among women. Among men, the proportions remained nearly unchanged.
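The two-part story above turns on the difference between two official measures: the unemployment rate is a share of the labor force (those working or actively seeking work), while non-participation is a share of the whole population. A minimal sketch with invented numbers (not BLS data) shows how a recovery can cut unemployment sharply while leaving the share of men outside the labor market untouched:

```python
# Hypothetical cohort of 1,000 men, illustrating the two official measures.
# (All figures here are invented for illustration, not taken from BLS data.)
pop = 1000

# Recession-year snapshot: 220 men out of the labor force entirely,
# 780 in it, of whom 145 are unemployed (seeking work but jobless).
out_82, unemployed_82 = 220, 145
labor_force_82 = pop - out_82
u_rate_82 = unemployed_82 / labor_force_82   # unemployed / labor force
nonpart_82 = out_82 / pop                    # not in labor force / population

# Recovery-year snapshot: the expansion moves jobless men into jobs,
# but draws no one back into the labor force.
out_87, unemployed_87 = 220, 100
labor_force_87 = pop - out_87
u_rate_87 = unemployed_87 / labor_force_87
nonpart_87 = out_87 / pop

print(f"unemployment rate: {u_rate_82:.1%} -> {u_rate_87:.1%}")
print(f"not in labor force: {nonpart_82:.1%} -> {nonpart_87:.1%}")
```

The unemployment rate falls from about 18.6 percent to about 12.8 percent, while non-participation stays fixed at 22 percent: a falling unemployment rate says nothing, by itself, about the men who never enter the labor market.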
The labor-market behavior of the men of the underclass is especially crucial, for only by changes in their behavior—in work and in marriage—can the size of the underclass be reduced. The reality that is going to become harder to evade as time goes by is that a large core of young black men is not in the labor market, and the size of that core seems to be extremely resistant to improvements in the economy. The only sign of progress in the aggregate statistics is among black male teen-agers, where the proportion not in the labor market fell from 60 to 56 percent during the years 1982-87. But this can be considered progress only if one ignores the huge size of the base. In 1982, for the critical age period twenty-to-twenty-four, the years when experience and a work record are being (or need to be) established, 21 percent of black males were not in the labor market. In 1987, after five years of economic expansion and falling unemployment, that figure had risen, to 22 percent. (The comparable white figures were 14 percent and 13 percent respectively.)
It must be borne in mind, moreover, that these figures refer to all young black males, including middle-class and working-class blacks with high labor-force participation rates and high levels of enrollment in school. If we were to factor out those elements, in an attempt to isolate figures for black males in the underclass, the results would be still worse. Although we do not know precisely what percentage of black males living in the inner city are not in school and are not in ill health, but also not in the labor force, the figure appears to be very high and (at the least) holding steady.
It may be that more detailed data from the cities with tight labor markets will mitigate these results—full-employment economies surely draw some young males into the labor market. But the figures we already do have from Boston, one of the nation’s most touted labor markets, are not encouraging. Boston had reached nearly full employment for blacks by 1987 (the black unemployment rate was a remarkably low 4.1 percent) but, contrary to any reasonable expectation, the proportion of blacks not in the labor market increased, from 35 percent in 1982 to 39 percent in 1987. It has yet to be determined whether the increase occurred among the working class or the underclass, among the young or the old, among men or women; but once again, and stating the case very conservatively, it is difficult to conclude that anything is happening in Boston to change the overall picture.
The argument that “there just aren’t any jobs,” then, is becoming increasingly hollow, and at the same time a large and extremely visible subpopulation of poor black young males is remaining out of the labor market. For many years, academics and policy-makers alike have found it convenient to dismiss reports of problems in the labor-market behavior of inner-city males as the product of racism or “structural” barriers. A different reality has been out there all along, it has been resolutely ignored, and it is finally beginning to intrude. Liberals will increasingly be hearing themselves saying, “Those people just don’t want to work.”
Still another critical factor in the formation of the new liberal consensus has to do with AIDS. There is a clear danger that over the course of the next ten years, this disease will come increasingly to be identified with the black urban underclass, which will thus increasingly become an object of public fear and dislike, including on the part of liberals.
Now, if AIDS cases are examined in absolute terms, the trendlines for blacks and whites both show steep upward curves, with blacks experiencing substantially fewer cases than whites. Thus, from September 1982 to June 1988, the number of AIDS cases among whites rose from 263 to about 37,000 and among blacks from 94 to about 16,500. But when AIDS is considered in terms of the number of cases per 100,000 population, a very different picture emerges, with whites currently at 18 per 100,000, blacks at 55.
In other words, from 1982 to the present, blacks have acquired new cases at a rate per 100,000 population averaging three times that of whites. This ratio has not been getting smaller; indeed, since a change in diagnostic requirements in September 1987, it has been growing, and now stands at 3.8 times the white rate.
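The per-100,000 arithmetic in the two preceding paragraphs can be checked directly from the case counts and rates quoted there. The implied population bases are my own back-calculation, not figures given in the text, but they land near plausible late-1980’s totals, which is a useful sanity check:

```python
# Figures quoted above: cumulative AIDS cases through June 1988,
# and current prevalence per 100,000 population.
white_cases, black_cases = 37_000, 16_500
white_rate, black_rate = 18, 55   # cases per 100,000 population

# Ratio of black to white prevalence -- roughly "three times the rate".
ratio = black_rate / white_rate
print(f"black/white rate ratio: {ratio:.2f}")

# Implied population bases (cases / rate * 100,000) -- my inference,
# not a figure appearing in the text.
implied_white_pop = white_cases / white_rate * 100_000
implied_black_pop = black_cases / black_rate * 100_000
print(f"implied bases: {implied_white_pop/1e6:.0f}M white, "
      f"{implied_black_pop/1e6:.0f}M black")
```

The rate ratio comes out just over 3, and the implied bases (about 206 million whites, 30 million blacks) are consistent with the U.S. population of the period, so the quoted rates and case counts hang together.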
Just how much worse the situation is in the inner city than in the population at large is only beginning to be measured. A study of 2,000 emergency-room patients at Johns Hopkins University hospital in Baltimore, which serves a predominantly inner-city clientele, found that 5 percent were infected with the AIDS virus. The highest infection rate was found among young black males aged twenty to thirty-four: 11 percent.
The main reason for the differential rates of increase is thought to be drugs. At present, 78 percent of all AIDS cases among whites have been homosexual or bisexual males who do not use intravenous drugs. Homosexuals tend to be well-informed, and in many places well-organized, and have demonstrated their readiness to change behavior. Among blacks, by contrast, only 38 percent of AIDS cases involve homosexual, non-IV-drug-abusing males. Almost all of the rest are associated with drug abuse, heterosexual transmission, or transmission from mother to infant. Most of these cases occur in populations that are notoriously difficult to reach with educational campaigns: not only drug addicts, but the homeless, the chronically unemployed, those living in large inner cities.
It is possible, of course, that a cure for AIDS will be developed soon and thus remove the problem altogether. It is also possible that the increase in AIDS in the inner city will produce, within that environment itself, a major shift of attitudes and, eventually, behavior, rendering drug dealers, drug users, and prostitutes into social pariahs and positively affecting patterns of employment and family formation alike. Realistically, however, both of these possibilities are remote.
If they fail to occur, AIDS in the black inner city will take on a new relevance for the future of liberalism. So far, the reaction to AIDS victims on the part of the general American population has been driven primarily by two considerations. The first has been sympathy, stemming from the widespread recognition that individuals infected with the virus before its effects became publicized had no way of knowing that the behavior in which they were engaged was exposing them to a fatal disease. The second has been empathy: ordinary, non-drug-using heterosexuals, engaging in what heretofore had been considered normal levels of sexual activity, had to worry that they too might get the disease.
But envision this scenario five years from now. By 1993, both sources of compassion have shriveled. The movement toward a more traditional sexual and marital morality, itself encouraged by the presence of AIDS, has continued. A large proportion of the increasingly middle-aged electorate now knows that it is safe from AIDS. Empathy is gone.
Sympathy has been drastically reduced as well. For what will be the attitude of the increasingly monogamous, abstemious, self-righteous, white electorate toward new AIDS victims, who will by this time have become a major tax burden? People who get AIDS at that point, after more than a decade of being told how the disease is transmitted, will more and more be seen as having brought it on themselves—as, indeed, almost all of them will have done.
Against whom will these reactions be directed? Numerically, the majority of AIDS victims will continue to be white, but “numerically” is not the key factor in determining emotional reactions; “prevalence” is the key. That is why, for example, illegitimacy has been seen as a predominantly black problem in the United States even though historically the majority of illegitimate births have been to white women. And that is why, five years from now, AIDS will also be seen as a predominantly black problem. No matter how one plays out the projections, the prevalence of AIDS among inner-city blacks will dwarf its prevalence among whites.
We may add another aggravating circumstance: because behavior changes so much more rapidly among the homosexual population (the source of most of the white problem) than it does or will among IV drug users (the source of most of the black problem), it is even possible that at some point in the foreseeable future the absolute number of active cases among whites will begin to dwindle while the number of active cases among blacks continues to increase, thereby inviting an even starker racial comparison.
Understanding and patience are going to dwindle across the political spectrum, to be replaced by animosity and/or indifference. Perhaps I am underestimating the compassion of liberals, but I do not believe that they will form a significant exception.
The reason they will not be an exception is to be found partly in developments within liberalism itself.
Twenty years ago, American liberals began to cut themselves loose from what seemed to many of them at the time an outmoded ideal of political and social equality. Before the mid-1960’s, mainstream liberals adhered to a political ideal, grounded in the Constitution itself, according to which we were all equal before God and should be equal before the law. This belief was intimately linked to a social ethic of individual merit and individual achievement; the goal was a society in which each person would be judged, in the famous words of Martin Luther King, Jr., by the content of his character and not by the color of his skin.
But in the course of the 1960’s, equality for liberals came to refer to a much more complex and ambitious egalitarian idea. Perhaps, the reasoning went, in some future America the old principles could be reinstituted. But at least until then, and maybe forever, social justice required something else: economic redistribution, equalization of societal outcomes, government regulation of what had hitherto been seen as matters of private association, and laws providing for the redress of historical grievances even if that meant treating whole groups of people unequally (i.e., preferentially).
Over the last twenty years, the new liberal vision, implemented by means of the policy of preferential treatment and quotas, has led to a number of perfectly disastrous and perfectly predictable results—predictable, that is, to anyone willing to think about what it means to apportion the goods of society according to the color of one’s skin. Hardly a policy-maker or academic anywhere wants to examine these results, and fewer still wish to speak of them, but snippets keep slipping out.
Thus, for example, in the area of undergraduate education it is revealed that at MIT, the average incoming black, although with math scores that put him in the top 10 percent of all college-bound students, finds himself in the bottom 10 percent of all MIT-bound students—and that the subsequent dropout rate for blacks is 24 percent, compared with 14 percent among whites. In the area of job performance, a study of black electricians hired under an affirmative-action court order in Seattle finds that blacks quit or are fired 4.1 times as often as whites, take 3.1 times as many days to sign up for new jobs at the union hall, turn down jobs 2.7 times as often, and lose 1.7 times as many hours leaving the job site early. In the area of professional training, another study reveals that of blacks admitted to medical school, the average score on the Medical College Admission Test (MCAT) is well below the average of whites who are rejected.
As I have argued elsewhere, racists given a free hand to promote racism among liberals could not have concocted a more efficient scheme than strong affirmative action, otherwise known as preferential discrimination.2 By demanding that universities and employers have “enough” blacks no matter what, this scheme systematically maximizes the likelihood that blacks hired at a given job site, or admitted to a given school, will be less capable than the whites beside them. Years of mismatching have facilitated, if they have not actually caused, the recent racial incidents at the Universities of Massachusetts, Michigan, and elsewhere, and these incidents are just the thin leading edge of what we may expect in the coming years.
This does not mean that liberals, having seen the effects of preferential treatment and racial quotas, will openly retreat from them. On the contrary, allegiance to “affirmative action” has become, if anything, a litmus test of one’s continuing liberalism among people who know they are backsliding on other issues. Besides, too many liberal policies continue to be bound up in the premise that social justice requires special treatment for groups, and too many powerful liberal constituencies have an interest in seeing those policies continue, whether or not they serve the goal of equality that was once their stated purpose.
So the next decade will witness a dangerous combination: a continuing belief, no longer even questioned, that it is legitimate for government to treat people as members of groups, but without the moral passion which undergirded that belief twenty years ago and even without any faith that the original goal is capable of achievement. In short, liberalism is becoming unmoored: liberals may still want to “do good,” but they have long since traded away their only coherent framework for deciding what “doing good” means. In the name of doing good, they will in a few years be ready to undertake race-based measures of a nature that once would have appalled them.
Let us draw together the various strands—the aging of the baby-boomers, the failure of the new wave of liberal programs, the demise of “structural unemployment,” the racialization of AIDS, the unmooring of liberalism itself.
For years, the black inner city has been the symbol both of America’s past failures and of its obligation to admit blacks to full equality—and it has also been an object of fear, anger, and guilt. Over the next few years, specific and quite powerful trends will effectively diminish the guilt and increase the fear and anger—especially among liberals. By the mid-1990’s, what is now a more or less hidden liberal condescension toward blacks in general, and toward the black underclass in particular, will have worked its way into a new consensus.
The particular form the new liberal consensus will take depends on circumstances, but in general mainstream liberal intellectuals and policy-makers will have become comfortable believing something like this: (1) inner-city blacks are really quite different from you and me, and the rules that apply to us cannot be applied to them; (2) it is futile to seek solutions that aim at bringing them into participation in American life, because we have seen that it cannot be done; and (3) the humane course is therefore to provide generously, supplying medical care, food, housing, and other social services—much as we currently do for American Indians who live on reservations. And so we will have arrived in the brave new world of custodial democracy, in which a substantial portion of our population, neither convicted as criminal nor adjudged to be insane, will in effect be treated as wards of the state.
To be sure, some such set of beliefs is already abroad in the land, discussed, when it is discussed, sotto voce. It is one thing, however, to have views whispered on the outer limits of respectability, and quite another to have them become an intellectual consensus among mainstream liberals, a consensus which large numbers of other Americans, for reasons of their own, might be happy to join. Such views would then become the baseline from which other, still more extreme measures to segregate the underclass could be contemplated. And then we would be in danger of witnessing the unmooring not just of liberalism, but of American democracy itself.
Arthur Schlesinger is right: intellectual vitality goes in cycles, and the liberals’ turn has probably come around again even if the Republicans should keep the White House. The tragedy is that the liberals’ turn is coming just at a time when they are about to reap the consequences of a two-decades-old heritage in which they have lost faith but which they cannot bring themselves to disavow.
1 For a recent and eloquent statement of this case, see William Julius Wilson, The Truly Disadvantaged: The Inner City, the Underclass, and Public Policy.
2 “Affirmative Racism,” New Republic, December 31, 1984.
The Coming of Custodial Democracy
The course the West followed has been a disaster.
The West has squandered the last, best opportunity to rid the world of the criminal regime in Syria.
Damascus was designated a state sponsor of terrorism in 1979, and it has lived up to that title every year since. Syria’s descent into civil war presented several opportunities to dispense with the despot in Damascus and avert a crisis in the process, but they were all ignored. As I wrote for National Review, Syria is a case study in the perils of ideological non-interventionism. The results of the West’s over-reliance on covert action, outsourcing, and diplomacy in Syria are arguably the worst-case scenario.
Had Barack Obama not abandoned his infamous “red line” in 2013, the U.S. might have preserved the 100-year prohibition on the battlefield use of chemical weapons. The collapse of that taboo has been rapid and terrifying. In the years that followed, chemical arms have been regularly deployed in Syria, and rogue powers have been using complex nerve agents on foreign (even allied) soil in reckless state-sponsored assassination campaigns.
Ideological adherence to non-interventionism well after it had proven an untenable course of action allowed the flourishing of terrorist organizations. Some parties in the West with a political interest in isolationism deliberately confused these terrorist groups with secularist movements led by Assad regime defectors. In the years that followed, those moderate rebel factions were crushed or corrupted while Islamist terror networks, which provided a politically valuable contrast to the “civilized” regime in Damascus, were patronized and nurtured by Assad.
The incubation of terrorist organizations eventually necessitated the kind of American military intervention Obama had so desperately sought to avoid, but at a time and place not of America’s choosing and with a footprint too small to achieve any permanent solution to the crisis. All the while, a great human tide poured out from Syria in all directions, but especially into Europe. There, an influx of unassimilated migrants eroded the continent’s postwar political consensus and catalyzed the rise of illiberal populist factions.
Even as late as the summer of 2015, there was still time for the West to summon the courage to do what was necessary. In a stunning speech that summer, Assad himself admitted that Syrian forces suffered from “a lack of human resources” amid Western estimates that nearly half the 300,000-strong Syrian army had been killed or captured or had deserted. “Based on current trend lines, it is time to start thinking about a post-Assad Syria,” an intelligence source told the Washington Post’s David Ignatius. But Obama dithered still. Just a few short weeks later, Vladimir Putin, upon whom Obama had relied to help him weasel out of his pledge to punish Assad for his crimes, intervened in Syria on Damascus’s behalf. That was when the greatest crimes began.
Russian intervention in Syria began not with attacks on “terrorists,” as Moscow claimed, but with attacks on covert CIA installations and arms depots; a dangerous campaign that continued well into the Trump era. The Syrian regime and its Iranian and Russian allies then embarked on a scorched-earth campaign. They bombed civilian neighborhoods and hospitals and maternity wards. They surrounded the liberated cities of Homs and Aleppo, barraging and starving their people into submission. They even targeted and destroyed a United Nations aid convoy before it could relieve the famine imposed by Damascus. All the while, Moscow’s propagandists mocked reports of these atrocities, and the children who stumbled bloodied and ashen from the ruins of their homes were deemed crisis actors by Russian officials and their Western mouthpieces.
America’s strategic obligations in Syria did not diminish with Russian intervention. They increased, but so too did the danger. Early on, Russian forces concentrated not just on attacking Assad’s Western-backed enemies but on harassing NATO-aligned forces that were already operating in the Syrian theater. Russian warplanes harassed U.S. drones, painted allied assets with radar, conducted near-miss fly-bys of U.S. warships and airplanes in the region, and repeatedly violated Turkish airspace. This conduct was so reckless that, in November of 2015, NATO-allied Turkish anti-aircraft fire downed a Russian jet. On the ground, Moscow and Washington engaged in the kind of proxy fighting unseen since the collapse of the Soviet Union, as U.S.-manufactured armaments were routinely featured in rebel-made films of successful attacks on Russian tanks and APCs.
In the years that followed this intensely dangerous period, the Syrian state did not recover. Instead, Syrian forces withdrew to a narrow area along the coast and around the capital and left behind a vacuum that has been filled by competing great powers. Iran, Russia, Turkey, Jordan, Saudi Arabia, Qatar, the United Arab Emirates, Canada, the United Kingdom, France, Australia, and the United States, to say nothing of their proxy forces, are all competing to control and pacify portions of the country. Even if the terrorist threat is one day permanently neutralized in Syria—a prospect that today seems far off, considering these nations’ conflicting definitions of what constitutes a terrorist—the state of competition among these powers ensures that the occupation of Syrian territory will continue for the foreseeable future.
And now, the final battle is upon the rebels. On Friday, hundreds of Syrians waving the “independence flag” poured into the streets of Idlib, the last of the country’s free cities, begging the international community to spare them from the onslaught that has already begun. The United Nations has warned that up to 800,000 people could be displaced in Damascus’s efforts to retake the rebel-held enclave, and the worst of the seven-year war’s humanitarian disasters may be yet to come.
Over the last two weeks, the United States has issued some ominous warnings. Senior American officials have begun telling reporters that evidence is mounting that Damascus is moving chemical munitions toward the frontlines with the intent of using them on civilians. Trump administration officials announced in no uncertain terms that they would respond to another chemical attack with disproportionate force.
In response to these threats, Moscow deployed its biggest naval task force off the Syrian coast since 2015. Simultaneously, Russia has warned of its intent to strike “militant” positions in the country’s southwest, where U.S. soldiers routinely patrol. American forces are holding firm, for now, and the Pentagon insists that uniformed personnel are at liberty to defend themselves if they come under assault. If there is a conflict, it wouldn’t be the first time Americans and Russians have engaged in combat in Syria.
In February, Russian mercenaries and Syrian soldiers reinforcing columns of T-72 tanks and APCs armed with 125-millimeter guns engaged a position just east of the Euphrates River held by American Green Berets and Marines. The four-hour battle that ensued resulted in hundreds of Russian fatalities, but it may only have been a terrible sign of things to come.
Of course, a Western-led intervention in the Syrian conflict would have been accompanied by its own set of setbacks. What’s more, the political backlash and dysfunction that would have accompanied another difficult occupation in the Middle East perhaps presented policymakers with insurmountable obstacles. But the course the West followed instead has been a disaster.
The lessons of the Syrian civil war are clear: The U.S. cannot stay out of destabilizing conflicts in strategically valuable parts of the world, no matter how hard it tries. The humanitarian and political disasters that resulted from Western indifference to the Syrian plight are a grotesque crime that posterity will look upon with contempt. Finally, the failure to enforce prohibitions against chemical-weapons use on the battlefield has emboldened those who would use them recklessly. American soldiers will suffer the most in a world in which chemical warfare is the status quo of the future battlefield.
American interventionists are often asked by their opponents to reckon with the bloodshed and geopolitical instability their policies encourage. If only non-interventionists would do the same.
And the demands of realpolitik.
Earlier this week, my housekeeper, Mary, arrived to work decked out in a bright red T-shirt emblazoned with a photo of Philippine President Rodrigo Duterte, who came to Israel last Sunday for a three-day official visit.
Mary was at the Knesset on Monday, one of several hundred of the approximately 28,000 Filipino workers in Israel, enthusiastically cheering her strongman president.
I asked her what she thought of Duterte, a leader who makes President Trump seem eloquent and measured by comparison, and I was taken aback by her effusive, unhesitating endorsement: “Oh,” she enthused, “he is a very good president! The best!”
“But,” I suggested, carefully, “he says and does some pretty extreme, crazy things. Does that concern you at all?”
“Oh, no!” she collapsed in laughter. “He doesn’t mean that. It’s just his style.”
Indeed, Duterte has “style”: bragging of his intent to kill millions of Filipino drug addicts, and approvingly invoking Hitler and his genocidal rampage in the same breath; referring to President Obama as a “son of a whore”; boasting of his parsimony in keeping multiple mistresses available in low-end hotels; approving of the sexual assault of women, particularly attractive ones. And then there was the outburst during the Pope’s 2015 visit to the very Catholic Philippines, when Duterte called him a “son of a bitch” for causing a traffic jam while in Manila.
Mary is not a simple woman. She is university educated, hard-working, pleasant, and respectful. And whatever makes her overlook Duterte’s thuggish tendencies should interest us all, because there are many Marys the world over supporting populist leaders and governments. Mary admires Duterte’s strength of conviction in dealing with drug dealers, addicts, corruption and Islamic extremists.
Human rights activists and journalists, of course, see only a brute who visited Israel to shop for weapons and defense capabilities, which would be put to questionable use. Then again, Duterte is hardly the first and far from the only unsavory ruler to come shopping in Israel, America, or elsewhere, for arms.
Israel deftly managed the visit and its optics. While many were disgusted that the PM and President Rivlin gave Duterte an audience, according him a legitimacy and respect that is undeserved, the meetings were brief and the remarks carefully calibrated.
In addition to acknowledging his personal gratitude to the Filipino caregiver who was a companion to his father in his final years, Bibi reminded us all of the enduring friendship the Philippines has shown Israel, and Jews, for decades. Prior to WWII, then-President Manuel Quezon made available 10,000 visas as part of an “open door” policy to accommodate European Jewish refugees. Only 1,300 were ultimately used, due to the Japanese invasion, which closed off escape routes.
In 1947, the Philippines was the only Asian country to vote in support of the UN Partition Plan, providing critical support for the momentum building toward the creation and international acceptance of the Jewish state one year later. These are important historical events about which Bibi, quite rightly, chose to remind us all.
I am no cheerleader of dictators and thugs, but I do wonder why the morality of many objectors to the Duterte visit is so selective. Israel, like all Western nations, has relations and ties with many countries led by dictators and rulers far more brutal than the democratically elected Duterte.
Much ado has been made in recent months of Bibi’s meetings with a number of right-wing populists and worse. Some link it to what they see as disturbing, anti-democratic tendencies in his own leadership of late. Others, myself included, would read it as a careful effort to maintain and cultivate as many international relationships as possible that may enhance Israel’s strategic and economic interests, particularly in this period of extreme global political, economic, and institutional instability.
The paradox of success.
The monthly jobs report from the Bureau of Labor Statistics released Friday morning shows that the economy continues to flourish. The economy added 201,000 new jobs last month, while the unemployment rate held steady at a very low 3.9 percent.
Unemployment rates for African-Americans and teenagers continued their decline to historic lows, while US factory activity was at a 14-year high and new unemployment claims were at their lowest point since the 1960s. The number of long-term unemployed (those out of work for 27 weeks or longer) has fallen by 24 percent in the last year. The number of part-time workers who want full-time work has gone down by 16 percent over the last 12 months. Wages are rising at a faster pace than they have in years, a sign of a tightening jobs market.
Corporate profits are robust (thanks partly to the cut in the corporate income tax) and consumer spending has been rising. GDP has been growing at a rate of more than 4 percent in recent months. In short, the American economy has rarely been this good, and it certainly wasn’t during the long, sluggish recovery from the 2008-2009 recession under the Obama administration.
In an ordinary year, one would expect that with economic numbers this good, the party controlling both houses of Congress and the White House would be looking forward to doing well in the upcoming midterm election, even though the party holding the White House usually loses seats in midterms. But, of course, no year is an ordinary political year with Donald Trump in the White House and the Democratic Party moving ever more to the left.
November 6 will be an interesting night.
We deserve better.
You could be forgiven for thinking that everyone active in American politics has lost their minds.
What we’re witnessing is not, however, collective madness. The political class in the United States has adapted to a constant atmosphere of high drama, and they’ve adopted the most theatrical poses possible if only to maintain the attention of their fickle audiences. What might look to dispassionate observers like mass hysteria is just overwrought performance art.
This week was a case study in our national insanity, which began aptly enough on Capitol Hill. There, confirmation hearings for Judge Brett Kavanaugh got underway, but Judge Kavanaugh’s presence was barely noticed. The hearings soon became a platform for some familiar grandstanding by members of the opposition party, but the over-acting to which the nation was privy was uniquely embarrassing.
New Jersey Senator Cory Booker chewed the scenery, as is his habit, by declaring himself Spartacus and demanding that he be made a “martyr” via expulsion from the Senate for releasing one of Kavanaugh’s emails to the public, supposedly in violation of Senate confidentiality rules. But there was no violation, said Bill Burck, the private attorney who led the review of Kavanaugh’s former White House records in the Senate. “We cleared the documents last night shortly after Senator Booker’s staff asked us to,” he said in a statement. Perhaps by engaging in what he called “an act of civil disobedience,” Booker was only following the lead of his colleague, Senator Sheldon Whitehouse, who declared the committee’s process illegitimate, thereby supposedly rendering the rules of the United States Senate unworthy of recognition.
Outside another congressional committee’s chamber, the crazy really ramped up to absurd proportions. Following a hearing on alleged bias in Silicon Valley, Senator Marco Rubio was confronted by the rabble-rousing conspiracy theorist Alex Jones, an encounter that rapidly devolved to the point that Senator and agitator were soon threatening to fight one another. “I know you’ve got to cover them, but you give these guys way too much attention,” Rubio later told reporters. “We’re making crazy people superstars. So we’re going to get crazier people.” He’s right.
The Trump era has provided the press with fertile soil in which a thousand manic flowers have bloomed.
Amplified by the president himself, Jones has become one of the right’s favorite grifters. Unfortunately, he’s in plentiful company. The press has discovered a sudden interest in conspiracy theorists like Jack Posobiec, Mike Cernovich, and Laura Loomer partly because they make for such compelling television but also because they’re willing to confirm the pro-Trump right’s most paranoid suspicions.
The “Resistance” has been a valuable vehicle for the unscrupulous and under-medicated. Congresswoman Maxine Waters has been feted in the press and in apolitical venues such as the MTV Movie Awards not despite but because of her penchant for radicalizing the left and feeding them fantasies about a coming anti-Trump putsch. Former British MP Louise Mensch, “D.C. technocrat” Eric Garland, and Teen Vogue columnist Lauren Duca spent most of 2017 basking in attention and praise from respectable quarters of the Washington political and media class. Their manifest unfitness for such elevated status somehow evaded drama addicts in mainstream political and media quarters.
And whether you’re pandering to the pro-Trump right or the anti-Trump left, there’s plenty of cash to go around for those who are willing to indoctrinate children or undermine the integrity of apolitical American institutions.
The week’s most hysterical moment belongs to the president and his aides—specifically, their reaction to an anonymous op-ed published by the New York Times purportedly revealing the existence of a cabal in the administration dedicated to thwarting the president’s worst impulses. Now, some have expressed perfectly reasonable reservations about the Times’s decision to publish an anonymous op-ed. Others have fretted about the pernicious effects this disclosure might have on the already mercurial president’s approach to governance. But lost in the over-the-top reactions this piece inspired among political observers is the hackneyed nature of the revelations it contained.
In sum, the author disclosed that many members of this Republican administration are movement conservatives dedicated to conservative policy prescriptions that are antithetical to the policies on which Trump campaigned. As such, they have often successfully lobbied the president to adopt their positions over his own preferences.
The admittedly dangerous “two-track presidency” has been observable for some time and is the frequent subject of reporting and opinion. For example, the op-ed highlighted the discrepancy between Trump’s conciliatory rhetoric toward Russia and his administration’s admirably hawkish posture, a discrepancy so glaringly conspicuous that Trump has recently begun trumpeting his contradictory record as though it were a unique species of competence. There’s nothing wrong with taking issue with the way in which the obvious was stated in this op-ed, but the statement of the obvious should not itself be a source of special consternation.
But was it ever. The Drudge Report dubbed the author a “saboteur,” despite the op-ed failing to describe even one action that was taken on the part of this so-called “resistance” against the president’s expressed wishes. “Sedition,” former White House Aide Sebastian Gorka echoed. Sarah Huckabee Sanders attacked the anonymous columnist as a “coward.” The president himself pondered whether the op-ed constituted “treason” against the United States and demanded the Times “turn over” this “gutless” columnist to the proper authorities, whoever they are. This is certainly one way to refute the charge that Trump’s “impulsiveness results in half-baked, ill-informed and occasionally reckless decisions,” but it’s not a good one.
It’s hard to fault politicians and the press for selling drama. Banality doesn’t sell papers, drive up advertising rates, or turn out the vote. At a time without an urgent crisis, when the economy is strong and the fires abroad are relatively well-contained, it serves the political and media classes to turn up the temperature on mundanities and declare all precedents portentous. But radicalizing voters for such purposes is both trite and irresponsible. In America, healthy and productive politics is boring politics. And who would tune in for that?