Commentary Magazine


Posts For: August 18, 2009

Robert Novak’s Memoir

Terry Teachout wrote this post in August 2007.

Say what you will about Robert Novak—and some contributors to COMMENTARY have said plenty—he remains one of America’s most important newspaper columnists. In addition, Novak is also one of the last of a dying breed of opinionmongers whose columns are reported rather than merely spun out of the parchment-thin air of their prejudices (which doesn’t mean he’s not prejudiced!). Thus, The Prince of Darkness: 50 Years Reporting in Washington, despite its monstrous length and penny-plain prose style, is significant by definition, just as a candid memoir by Walter Lippmann or Drew Pearson would have been similarly significant. Henceforth anyone who writes about journalism in postwar Washington will have to cite The Prince of Darkness as a primary source, just as anyone who reads it will learn from it—though certain of its revelations are, like those of most memoirists, unintended.

One of the things that has already struck many reviewers of The Prince of Darkness is the way in which its author has coddled his resentments throughout the course of a long, busy life. It seems to me noteworthy that a man as successful as Novak should still be capable of writing with such raw resentment of having been passed over as sports editor of his college newspaper, or that he should go out of his way repeatedly to make glowering mention of his unpopularity in Washington. Some anonymous wag once called John O’Hara “the master of the fancied slight.” I doubt that many of Novak’s slights are fancied, but they give much the same impression when consumed in bulk.

Fortunately, there are more compelling autobiographical revelations to be gleaned from The Prince of Darkness. It is hugely interesting, for instance, to read of how a youthful reading of Whittaker Chambers’s Witness turned a moderate-to-liberal Republican into the hardest of anti-Communists, or how a secular Jew should have felt moved to embrace Roman Catholicism late in life. Most interesting of all, though, is the black cynicism with which Novak writes of the politicians among whom he has moved for virtually the whole of his adult life. A few escape his contempt—he was impressed, for instance, by the depth of Ronald Reagan’s reading in the history of economics—but for the most part he views them as shallow power-seekers who use everyone around them, and are themselves used in turn.

A handful of Washington journalists have written of the inhabitants of their milieu with comparable candor, most notably Meg Greenfield in Washington, her posthumous memoir: “These are people who don’t seem to live in the world so much as to inhabit some point on graph paper, whose coordinates are (sideways) the political spectrum and (up and down) the latest overnight poll figures.” But Novak’s honesty about the mutual manipulativeness of his relationships with the politicians he has covered exceeds anything I have hitherto seen in print. Among other things, he acknowledges that he’s more likely to trash you in print if you won’t talk to him off the record:

Am I suggesting a news source could buy off Novak with a hamburger in the White House? No government official or politician can secure immunity from a reporter by helping him out. Even my most important sources—such as Mel Laird and Wilbur Mills—were not immune from an occasional dig. Still, Bob Haldeman was treated more harshly because he refused any connection with me. He made himself more of a target than he had to be by refusing to be a source.

Even more revealing is Novak’s description of his relationship with Karl Rove:

What you did not find in my columns was criticism of Karl Rove. I don’t believe I would have found much to criticize him about even if he had not been a source, but reporters—much less columnists—do not attack their sources. . . . In four decades of talking to presidential aides, I never had enjoyed such a good source inside the White House. Rove obviously thought I was useful for his purposes, too. Such symbiotic relationships, built on self-interest, are the rule in high-level Washington journalism.

Perhaps I’m not enough of a cynic to appreciate fully Novak’s point of view—I’ve spent little time in Washington and less, thank God, in the company of politicians—but even so, I find that last sentence chillingly bleak. Imagine spending a half-century working in a town where the naked pursuit of self-interest governs all your personal relationships! Seen in that lurid light, the title of The Prince of Darkness, though it is Novak’s well-known nickname, ended up putting me in mind of The Screwtape Letters, C.S. Lewis’s fictional portrayal of the ceaseless backstabbing engaged in by Satan’s staff of tempters. Small wonder that Novak finally got religion. No doubt a day came when he looked around him and found himself echoing the terrible words of Christopher Marlowe’s Mephistophilis: “Why, this is hell, nor am I out of it.”

The E-Word

My former White House colleague Bill McGurn, in his Wall Street Journal column today, recounts liberal apoplexy when the word evil was used by Ronald Reagan (to describe the Soviet Union) and George W. Bush (to describe Saddam Hussein’s Iraq, Iran, and North Korea). He proceeds to write this:

With all this history, you would think Harry Reid (D., Nev.) had ample warning. Nevertheless, the Senate majority leader invoked the e-word himself last week at an energy conference in Las Vegas, where he accused those protesting President Barack Obama’s health-care proposals of being “evil mongers.” So proud was he of this contribution to the American political lexicon that he repeated it to a reporter the next day and noted the phrase was “an original.”

And then . . . nothing. No thundering rebuke from the New York Times. No outburst from Mr. Carter. In fact, it’s hard not to notice that the good and gracious people who instinctively recoil at words like “evil” or “un-American” (the preferred term of Mr. Reid’s counterpart in the House, Speaker Nancy Pelosi) have all been silent.

It would be easy to read something dark into Mr. Reid’s characterization, and the yawn with which it has been greeted. In fact, what we have here is really the logical extension of the liberal assumption that they have a monopoly on brain power. In such a world, anyone who dissents, almost by definition, has to be stupid or evil or both.

It’s a wonderful point, made by a wonderful writer.

Self-Censorship Watch

I’m not afraid of Islamists. As long as we in the liberal West don’t browbeat ourselves into submission, and as long as we take sensible measures to defend ourselves from the threat of mass terrorism—especially from WMDs—there is no reason for free, modern, and enterprising societies to fear losing to medieval Islamism. What makes me nervous is that there are too few leaders today willing to follow Reagan’s example from the Cold War in saying this, and that there are too many “opinion leaders” and officials who are eager to split the difference between good and evil.

Consider Yale University Press, which decided not to publish the Danish cartoons of Muhammad in a forthcoming book entitled The Cartoons That Shook the World. Not only that: it also pulled “a drawing for a children’s book; an Ottoman print; and a sketch by the 19th-century artist Gustave Doré of Muhammad being tormented in Hell” from the same volume. The director of the Yale University Press, John Donatich, followed the usual line of saying that the decision to censor his own press’s book was “difficult”—but that when “it came between that and blood on my hands, there was no question.”

So the press accepted a book on the cartoon controversy, then decided at the last moment not to reproduce the actual cartoons on the grounds that if any riots resulted, the press would be responsible for the results. That’s a messed-up editorial process and a silly mutilation of the resulting book. But note who is not held responsible: the potential rioters, who are apparently automatons and spring uncontrollably and unstoppably into action whenever a Westerner dares to offend them. Thus, as part of our relentless quest for a quiet life, the press must avoid causing offense to anyone. If that is not appeasement, I don’t know what is.

As usual, though, Britain leads the way when it comes to this sort of dangerous foolishness. Earlier this month, three female police officers in South Yorkshire participated in an “In Your Shoes Day” exercise. They spent a day walking around in public in hijab, headscarves, and niqab veils, accompanied by four Muslim women, who were then treated to a tour of the custody suite and the CCTV office in headquarters. The point of all this: to teach the police officers about the pervasive racism of British society by revealing to them all the oppression and insults they would encounter when wearing a burka.

Actually, nothing much happened, so the point of the experience was rather lost. Or perhaps it wasn’t: one of the police officers reported that the day “had given her a greater appreciation of how Muslim women feel when they walk out in public in ‘clothing appropriate to their beliefs.’ ” Yes indeed, the police in South Yorkshire are being taught—and happen to accept the lesson—that the most conservative style of female dress is the one “appropriate” to Islam, and that anyone who protests this is illiberal at best and racist at worst.

Thus, explicitly, the police are taught to sympathize with the most radical Islamists and to regard with suspicion any Muslim liberal who dares to resist their intimidation. And, as Yale University Press has shown, intimidation persists for a simple reason: it works.

A Sign of the Times

Over at Politico’s “The Arena,” which features a daily debate between policymakers and opinion shapers (full disclosure: I participate in those debates from time to time), the questions of the day are:

The right ticked off. The left ticked off. A muddle in the middle. How did Obama get into this mess? How does he get out of it?

Call it a sign of the times.

Financial Times Gets Afghanistan Wrong

The Financial Times has an op-ed today entitled “We Must Face Reality in Afghanistan.” But the picture painted by its author, Gilles Dorronsoro, a visiting scholar at the Carnegie Endowment for International Peace who was previously a professor at the Sorbonne, doesn’t reflect reality. It’s like a dispatch from Bizarro Afghanistan—an alternate universe.

He begins: “The Taliban is winning the war in Afghanistan, a fact even the top U.S. commander in the country reluctantly concedes.” Apparently he missed the fact—which I noted in a previous COMMENTARY article—that General McChrystal said no such thing; his position was twisted by a Wall Street Journal headline writer.

It doesn’t take long to stumble across another odd claim: “Gen McChrystal was named commander of U.S. and Nato forces in Afghanistan in June, reportedly after he convinced Robert Gates, secretary of defence, that he could do more with less.” I have never heard from any credible source that McChrystal told Gates he “could do more with less,” and I very much doubt it’s true. Instead, the first thing McChrystal did upon arriving in Afghanistan was launch a review of current operations to see if there were enough troops in the theater, which there weren’t.

Dorronsoro claims that “Gen McChrystal persuaded him [Secretary Gates] that he could turn the tide in its favour with only a change in strategy.” Again, I very much doubt that McChrystal said any such thing.

To add to the errors, Dorronsoro goes on to write off U.S. counterinsurgency operations, which are barely beginning, in southern Afghanistan as a failure already:

The U.S. chose Helmand as a test case, and while operations are not yet finished, they are clearly not working as planned.

“Clearing” has become almost impossible. The insurgents are part of the population, and there is no way to distinguish them from ordinary villagers. In the Pashtun south, where xenophobic feelings are common, many if not most Afghans support the insurgents more readily than the international coalition. As a consequence, the area targeted by coalition forces remains unsafe, and in view of the weakness of the Afghan army, there is no way to withdraw without allowing the Taliban to regain control.

Keep in mind that although there are plans to deploy 68,000 U.S. troops to Afghanistan, not all of them have arrived yet. The operations in Helmand that Dorronsoro refers to were launched by the Marines in June, utilizing plans that had been drawn up before McChrystal even arrived. It’s absurd to claim (a) that the Helmand operation is a referendum on McChrystal’s strategy and (b) that it has already failed, when all successful counterinsurgency efforts take a substantial period of time to make an impact. Writing off operations in Afghanistan today is reminiscent of armchair analysts in the summer of 2007, or even earlier, who were claiming that the surge was a failure in Iraq when it had barely begun.

Dorronsoro makes another questionable claim when he writes:

The U.S. made a significant mistake in offering the insurgents a “historic” battle—comparable to the failed Soviet offensive in the Panjshir Valley in the 1980s. The insurgents chose not to fight U.S. troops frontally in southern Helmand, instead regrouping in the northern area, where the terrain is more favourable to them, and fighting hard against the British.

It is the height of folly to compare the population-centric counterinsurgency operations being undertaken today by coalition forces in Afghanistan—operations resulting in minimal civilian casualties—with the scorched-earth tactics practiced by the Red Army in the 1980s. There is little in common between the two, and the Afghan people know it. Most of them still welcome the presence of coalition forces, whereas almost none of them were in favor of the Soviet presence in the 1980s.

To cap his fallacious analysis, Dorronsoro concludes:

Though public officials seldom state it as clearly, the US still has a limited and achievable objective in Afghanistan: securing a viable state and preventing al-Qaeda from regaining its lost base. To have a reasonable chance of accomplishing that, US planners must recognize that the American public will not countenance more troops in Afghanistan in the same year troops are withdrawing from Iraq. They should focus their efforts on cities and key roads and build up the Afghan army.

Two more errors. First, he claims that the public will not countenance more troops, whereas public support for the war in Afghanistan is considerably higher today than was the support for the Iraq war in 2007, when President Bush mounted a major troop increase. If Bush could pull that off given the climate of opinion at the time, it is ridiculous to suggest that public opinion would prevent President Obama from sending more troops to Afghanistan today.

The second fallacy is his suggestion that troops “should focus their efforts on cities and key roads and build up the Afghan army.” There is nothing wrong with protecting cities and key roads and building up the Afghan army—all three must be pillars of a successful counterinsurgency strategy. But his implication that the coalition should leave the countryside to the enemy is foolhardy. Talk about repeating the Soviets’ mistakes—if NATO were to do that, it would indeed be consigning itself to the fate of the Red Army, which wound up getting besieged in the cities. Luckily, General McChrystal is too smart to repeat that error.

It is hard to imagine a more misbegotten piece of analysis about Afghanistan. Normally, I would let it pass, but in this case I couldn’t, because it appeared in a reputable publication and was written by an author from a reputable institution. Moreover, it does seem to reflect a certain mindset increasingly prevalent in our capital and the capitals of our allies, where too many people are willing to concede defeat in Afghanistan before our troops have begun to fight properly.

How the U.S. Army Evolved

The Washington Post’s Rajiv Chandrasekaran wrote an interesting story about the ouster of General David McKiernan as head of the U.S. command in Afghanistan, and the decision to replace him with General Stanley McChrystal. It was, according to the Post, “the first sacking of a wartime theater commander since President Harry S. Truman dismissed Gen. Douglas MacArthur in 1951 for opposing his Korean War policy.”

General McKiernan was not removed because of insubordination; by all accounts, he was a loyal general and admired by his troops. But in the opinion of Secretary of Defense Robert Gates and Admiral Mike Mullen, chairman of the Joint Chiefs of Staff, McKiernan was too conventional and old school, too deferential to NATO (which was no longer viewed as the solution to our problems in Afghanistan), and not sufficiently forceful or innovative.

General McChrystal, on the other hand, is “regarded as a leader in the Petraeus mold: able to nimbly run the troops on the ground as well as the traps in Washington.” In sending McChrystal and Lt. General David Rodriguez—who had been Gates’s chief military assistant and served as one of the regional commanders in Afghanistan—to oversee the war, Gates is promoting two skilled practitioners of counterinsurgency strategy.

What interests me most in this story is that it underscores one of the most significant developments of this decade: the reform of the American military. Anyone who has served in government can tell you that reforming institutions is a very difficult task; you run into bureaucratic obstacles and inertia, old habits and conventions, people who agree to reform in theory and undermine it in practice. Sometimes the only way reform succeeds is through action-forcing events.

In the case of the military in general, and the Army in particular, the action-forcing event was the Iraq war—and more specifically, the so-called surge. It is commonly assumed that the key to the surge was sending more troops to Iraq; in fact, the change in counterinsurgency strategy—a multipronged approach that included everything from securing and serving the population to promoting reconciliation and pursuing the enemy relentlessly—was far more important. More troops pursuing the same unproductive strategy would not have changed the trajectory of the war. More troops doing the right thing did. And the person most responsible for this shift is, of course, General David Petraeus, now commander of the U.S. Central Command.

It’s hard to recall now, but in the early years of the Iraq war, Lt. General Petraeus was viewed as something of a freelancer and an outlier. He was (thankfully) pursuing, more or less on his own, the counterinsurgency strategy now embraced by almost everyone and currently applied in Afghanistan. Petraeus literally wrote (with help) the counterinsurgency manual, and he applied it in a war that we were on the verge of losing.

The success of the surge reoriented the mindset of the military. A premium is now placed on “adaptive leadership” and on young officers challenging core assumptions. The United States focuses on winning the wars we are waging rather than on potential future conflicts. Counterinsurgency lessons that had been forgotten or discarded are being relearned. And a new generation of Army leaders is being promoted, evidence of the Army’s commitment to encouraging innovation and original thinking in what is, by its nature, a very hierarchical institution. According to one senior Defense Department official: “[Petraeus] redefined during his tour in Iraq what it means to be a commanding general. He broke the mold. The traditional responsibilities were not enough anymore. You had to be adroit at international politics. You had to be a skilled diplomat. You had to be savvy with the press, and you had to be a really sophisticated leader of a large organization.”

The United States military is one of the largest organizations in the world, and therefore resistant to change of any kind. That is doubly so when it comes to far-reaching reforms. But a shift in strategy in the war in Iraq, embraced by America’s 43rd president and overseen by a Princeton Ph.D. and a handful of others, has brought about real change, as opposed to the rhetorical kind. It will be a legacy of lasting importance.

Fear-Mongering at Yale

Martin Kramer’s post about the decision by Yale University Press to remove the Danish Muhammad cartoons from a book about the Danish Muhammad cartoons is very much worth your time. Hugh Fitzgerald’s post is also excellent.

As Martin reveals, one of the central figures who ensured the censorship of the cartoons is Prof. Marcia Inhorn, the head of Yale’s Middle East Studies department, which I wrote about for this website last year. Martin links to a piece Inhorn wrote in 2006 that is really a model of the genre:

I recently returned from a trip to Lebanon, the UAE and Iran—what most Americans would consider a journey into the heart of darkness, a veritable “axis of evil”. In fact, the trip was far from perilous, and I was treated as an honoured guest in every setting. . . .

I have travelled widely and lived with my family for extended periods of time in Egypt, Lebanon, and the UAE. It saddens me that so few Americans will ever come to know the delights of the Middle East as my family and I have.

What saddens me, by contrast, is how important it is for leftist world travelers to be treated as royalty by their hosts, and how they respond to Potemkin Village–style tours of repressive and dysfunctional countries with hoary tropes about the nobility of the Orient. Because she was treated as an “honoured guest in every setting” in Iran, the fact that the regime promotes war and terrorism around the globe is irrelevant; the fact that it strings up homosexuals from cranes in downtown Tehran doesn’t matter; the fact that it brutally tortures its own dissenters is barely of any concern and neither is the prison rape of young girls before their executions.

Inhorn is not simply agnostic on the question of the Iranian regime—she actually admires it because the mullahs allow in vitro fertilization and birth control:

These developments convince me of the need to recognise the “high-modern” nature of Iran, which is currently on the “cutting edge” of developments in reproductive science and technology. It also bespeaks the need to de-vilify—indeed, de-demonise—the Shiite Muslim clergy, who are condoning these various innovations, but who are generally represented as backward and fanatical in the Western media.

This isn’t scholarship, diplomacy, basic politeness, or even Radical Chic. It is something else entirely. If you’re some variety of terrorist, thug, or authoritarian and you want to get good press in Western academic circles—well, give some lavish toasts to the timorous professor at your table, make her feel important, and then watch the apologetics and accolades pour in. This is the petty narcissism of mediocre academics who are desperate for the celebrity and adulation that they are so deservedly denied in America.

Plan B Time Is Here

“God bless him, bless his heart, president of the United States—a total failure, losing all credibility with the American people on the economy, on the war, on energy, you name the subject.” That was Nancy Pelosi’s assessment of George W. Bush last July.

A little harsh, if you ask me. It’s not like Bush burned through a near trillion dollars in a stillborn stimulus, scolded America while the economy pitched, tried to close the country’s most important maximum-security facility without a plan, alienated allies from England to Israel, emboldened the world’s bad actors from Hugo Chavez to the Burmese junta, repeatedly trashed his predecessor around the globe, heralded America’s indifference to human-rights abuses, insulted police in a botched attempt to reignite a fading grievance culture, and frittered away the dregs of his political capital on a socialist health-care hodgepodge that neither he nor any other American could explain, let alone embrace.

Nope, accomplishing all that—and doing so between Valentine’s Day and Labor Day—takes a visionary, a new Lincoln, a “sort of God,” as a sort of apostle from Newsweek put it.

It turns out there is nothing too big to fail, including the grand plans of Barack Obama. Rep. Allen Boyd (D-Fla.) acknowledged as much yesterday in regard to the health-care meltdown. Faced with enraged Americans at town-hall meetings, Boyd threw in the towel, saying he would “be willing to scrap everything” and start from square one. First health care; next—you name it. The whole Obama horizon is ready for a reset.

Forget domestic policy, look abroad: As Robert Kagan generously put it in April, “[President Obama’s] policy toward Iran makes sense, so long as he is ready with a serious Plan B if the negotiating track with Tehran fails.” In September, “if” becomes “when.” That’s the month the administration picked as the deadline for Iran to show good faith and discuss its nuclear ambitions. The open-hand approach to autocrats may be a nifty idea in university classrooms, but in reality it has given Tehran room to boost global economic ties, build weapons, enrich uranium, and claim the U.S. as a silent accomplice in the regime’s human-rights abuses. The open hand has also failed in Burma, where talk of eased American sanctions was met with a nuclear partnership with North Korea, as well as in North Korea itself—which has ratcheted up missile launches and nuclear tests in response to Hillary Clinton’s imitation of Condoleezza Rice.

The president also has egg on his face over his delusional Middle East tack. Touting the Saudi peace initiative from day one, Obama leaned on Benjamin Netanyahu to acknowledge a two-state solution and pressed Israel on the settlement canard. Meanwhile, when the president went back to the Saudis with the fruits of his prodding, they told him to get lost.

At least part of Obama’s problem is that few of his plans are actually “his” or “plans.” The president thought he could close Guantánamo Bay with the flourish of a pen and a few words denouncing the previous administration; the stimulus was a Pelosi-Reid work of magical realism; and the health-care hash was both ill-conceived and intellectually outsourced.

It’s been a strange start for a visionary president. With unprecedented levels of global sympathy and a Congress stacked in his favor, Obama attempted to will his worldview into actuality. But he never brought more than his own PR to the big launch. And now he’s suddenly stumped, going tone-deaf, and sinking in the polls. Assessments like this one are often made foolish by history, and I hope this one is as well. For if Obama finds his footing he will have done so by allowing reality, not ego and indifference, to guide policy. That is what he’s done and continues to do (for now) in regard to Iraq and Afghanistan, the two admirable areas of his foreign policy.

And Bush? He did some things wrong; others he did right. Among the latter was keeping America safe for eight years, turning a quagmire into a victory, and executing an economic bailout that actually worked as planned. All of which seem frighteningly high bars to clear these days.

Re: Obama and the “Death Panel” Issue

John, my main criticism of Sarah Palin’s “death panel” remarks is that they narrow the case against the proposed health-care bill to a single concern that, while not wholly unfounded, as you pointed out, sounds exaggerated and rhetorically ill-pitched. Rightly fearing a backlash from senior voters, Democrats yielded to the opposition on the provisions for end-of-life counseling and removed them from the bill—and some consider this a victory for Palin.

But her outrage seems to have been misallocated, as now more must be mustered for countering the rest of the bill, which remains chock-full of problematic stipulations, some much more deleterious to both the health-insurance industry and the interests of health-care consumers than what Sarah Palin chose to focus on. The natural resources of rhetoric can be depleted by the overuse of incendiary language, to which the public gradually grows insensitive.

While as a private citizen Sarah Palin is entitled to express her criticism of the bill however she sees fit, Michael Steele, as a leader of the opposition party, showed questionable judgment in backing remarks likely to court gratuitous controversy. Focusing on end-of-life consultations leaves the opposition vulnerable to the rejoinder that such services are already covered by existing private-insurance plans. It also derails the argument from one about socialized health care, whose most objective merits or lack thereof are grounded in economics, into one about controversial social issues such as the right to die. Involving euthanasia in this debate may agitate mixed loyalties among the socially liberal but fiscally conservative—a needless risk for the opposition.

Obama’s statements that you quoted are suggestive of the extreme utilitarian mindset that permeates the bill. To be sure, its architects do intend to ration care to the elderly and the chronically ill, but such rationing would be implemented not through any “death panels” but through the perverse actuarial calculus known as comparative effectiveness research. This is a formula that divides the cost of a treatment by the number of “quality-adjusted life years” that the patient is likely to enjoy—a cost-benefit quotient to guide bureaucratic boards on allocating medical resources. In Britain, the formula leads to denying treatments for older patients who have fewer years to benefit from care than do younger patients: until recently, older patients with macular degeneration, which causes blindness, were told that they had to go totally blind in one eye before they could get an expensive new drug to save the other eye.

As Betsy McCaughey notes at the Wall Street Journal: “The House bill shifts resources from specialty medicine to primary care based on the misconception that Americans overuse specialist care and drive up costs in the process (pp. 660-686). In fact, heart-disease patients treated by generalists instead of specialists are often misdiagnosed and treated incorrectly. They are readmitted to the hospital more frequently, and die sooner.”

This is just another corollary of the utilitarian ethics motivating the bill, concerned with allocating communal resources for the greatest benefit to the greatest number. In such a context, it’s hardly a misconception that Americans overuse specialty care. Indeed, however grave a disease may be, if it ails only an unlucky few, the medical resources tied up in treating it could instead help a greater number of people stricken by more common ailments. If such considerations dictate the allocation of scarce resources on a large scale, the result will be generic health care for all and specialized treatments—those needed the most—for few or none.

Winning the debate against socialized health care requires educating the public on what it entails for them, to which end plenty of facts, statistics, and case studies can be marshaled, many of them from countries that have already adopted systems similar to the one America is contemplating. But the public is more boggled than enlightened by talk of “death panels.” Why resort to bombastic rhetorical devices when facts—cool, objective, dispassionate facts—are already on our side?

First They Cling to Guns and Religion–and Now This

Spiegel Online reports today on the health-care debate in the United States. German “media commentators” across the board appear to be a little unclear on the concept:

German media commentators say Obama has lost his near-messianic status in the course of the health-care debate. Many Europeans, they say, can’t understand why so many Americans are clinging on to a health-care system that is less efficient, and provides worse care for average citizens, than European systems.

The Center-Left Süddeutsche Zeitung writes:

The 44th president, once revered as a messiah, is shrinking back to human proportions. This normalization, which is healthy, has been caused by the row over how to improve America’s health-care system. Obama has given up core elements of his most important reform plan in the face of sometimes aggressive, even fired-up protests from a right-wing mob. . . .

The Left-leaning Die Tageszeitung writes:

The entire reform debate suffers from the problematic conviction that has never been questioned, namely that health is an asset from which it’s OK to make money. A public health insurance system that doesn’t need to make money will end up ruining the private companies in the sector—that’s the argument the opponents of the reform are seriously making. . . .

The Center-Left Berlin daily Der Tagesspiegel writes:

The resistance comes from citizens who are genuinely outraged at what they see as state interference and business groups that are earning good money from the existing system. They are abusing the founding myth for their own purposes. The US was created as a “land of the free” against the dictatorship of monarchies in Europe. Opponents of the reform are going so far as to cite Thomas Jefferson: The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants.

At least the German media commentators did not denigrate as well-dressed Nazis those Americans who showed up at town halls to talk to their representatives.

In other news today, Spiegel Online reports that the German health minister is in political trouble for repeatedly flying to Spain for vacations and having her chauffeur drive her official limousine there:

German Health Minister Ulla Schmidt has come under pressure again over her use of an official limousine while on vacation after she admitted having done so not just this year, but on several occasions since 2004.

The Health Ministry told the German parliament’s budget committee that Schmidt had the limousine driven from Germany to her holiday location in Spain and back in the years 2006 through 2008.

In Europe, they gave up clinging to guns, religion, and a private health-care system a long time ago. Hardworking government ministers now provide many things in which the private sector used to be involved. Difficult to understand why anyone would want it any other way.

Robert Novak, RIP

Robert D. Novak, the controversialist whose combination of hard-line conservatism and hard-charging reporting made his column essential reading for nearly four decades, has died. Bob went to his grave a Catholic, though he had been born a Jew, and passed through mainline Protestantism on the way. He had, as they say, “issues” with his Jewish roots, expressed largely in a hostility to Israel that made little sense given the overall nature of his views on a wide range of subjects—he was, for instance, an intimate of and close friend to the late Jack Kemp and agreed with Kemp on nearly every particular, but Kemp was a supporter of Israel, and Novak an opponent of it.

In 1989, when I was an editor at the Washington Times, I assigned a reporter a profile on Richard Darman, then George H.W. Bush’s budget director. There had been rumors that Darman had been born a Jew, and, these being pre-Internet days, I asked her to check them out. She uncovered a news story in the Providence, Rhode Island, newspaper about Darman’s bar mitzvah, of all things. And when she asked him about it, Darman was deeply unsettled, asked her not to publish anything about it, said he would be her best source, said it would devastate his wife and children. She came back and reported this to me, and I said we would be sure to make it the lead of the piece. That weekend, on his CNN show, Bob Novak denounced the piece as the “Shame of the Week,” an act of injustice against Darman and his privacy and the sanctity of his family.

That did not prevent him from maintaining cordial relations with me, and I with him. My last communication with him was a cryptic e-mail he sent me after a New York Post column I had written on the injustices being heaped on the head of Scooter Libby, the Cheney chief of staff who got wrapped up in the Valerie Plame dragnet set into motion by Novak’s mention of her name in a column. “Obviously, I cannot comment on the case,” Novak wrote, “but that was a very good column.”

He was a difficult man in many ways, but I always found him interesting, lively, and friendly. And I have to say that, toward the end of his life, he wrote a riveting I-can’t-quite-believe-I’m-reading-this memoir entitled The Prince of Darkness, which may offer, in its unsparing portrait of his own character and how he maneuvered his way through a 50-year career, the most accurate (and most dispiriting) picture of life in Washington and the journalism game published in my lifetime. It was an unexpected achievement, because he surely knew he was leaving his readers with a bad taste in their mouths. But he was determined to get it all down and get it right, and he did.

Don’t You Know Who I Am? The Bollywood Version

Should people who share a last name with someone suspected of terrorism be subjected to extra questioning when entering the United States? That’s a question that might strike close to home if your last name is Khan, a Muslim name that might be shared by some jihadists. But if the worst that results from such a practice is momentary inconvenience to an innocent visitor named Khan, then the vast majority of Americans, whose nation is still the target of a worldwide Islamist terrorist movement, will probably be inclined to tell the offended Khan that they are sorry for his inconvenience, wish him a pleasant stay, and then silently pray that if a real jihadist chooses to land at the same airport, officials will not be deterred by critics from asking him questions that might lead to foiling a terrorist atrocity.

That’s more or less what happened when one Shah Rukh Khan landed at Newark Liberty International Airport last Friday. Khan is a Bollywood star—a very big deal in his own country, but, alas for fans of Indian films, in New Jersey he was just another guy named Khan. Though his ordeal was quickly ended with a phone call to the Indian consulate, Khan is making a meal out of the incident. And indeed, Khan’s trying hour in Newark is being treated as an international incident of the first order, with U.S. ambassador to Delhi Timothy J. Roemer (a former Indiana congressman) genuflecting to the star, calling him a “global icon.”

There may be more to this, however, than just the normal “don’t you know who I am?” routine from a celebrity who may not be as well known as he thinks he is. As it happens, Khan has been a frequent visitor to the United States lately. According to reports, he has been making a film about racial profiling in the United States; the working title of the movie is supposedly My Name Is Khan.

But as much as anyone who has been hassled, humiliated, or just had his time wasted at an airport by clueless security personnel might have some sympathy for Khan, the actor’s subsequent remarks about his experience in an interview with the Indian television channel CNN-IBN (as reported by Agence France-Presse) reveal that he is a man with an agenda. Inflating this minor mishap into a major kerfuffle raises his international profile and perhaps that of his film project.

Not content with raising the blood pressure of his loyal fan base, Khan is raising the stakes in this controversy. Complaining that his name “is common on their checklist,” Khan railed at America’s efforts to protect its citizens from terrorists who might share his name:

America needs to understand one small thing. That there are about 190-195 smaller countries and that makes the whole world. It’s not an isolated, parallel universe existence for this country. There is a whole world which makes all the good and bad that is happening. So if we are scared of violence and terrorism, all of us are responsible for it. It’s not that the world is and America is not.

That’s the sort of nonsense that might play well in third-world countries where resentment of America is the coin of the realm, but in the real universe, it’s important to note that America is not “responsible” for the violence and terrorism unleashed on the world by Islamists. Responsibility for it lies with the jihadists, who have not only targeted the United States but, as we learned last fall, also Mumbai, the home of the Indian film industry. Indians, of all people, ought to know the perils of living in a world with Islamist terrorists, many of whom are based next door in Pakistan.

Khan and others who complain about their treatment at airports may be angry about even the minimal efforts we have made since 9/11 to screen visitors, but if critics of profiling believe that young males with Muslim names or affiliations should not be subjected to more scrutiny than little old ladies (and, in practice, it is far from clear that they are), then they are asking us to believe that the latter are just as likely to commit such crimes as the former.

So long as there is a real terror threat coming from Islamists, Mr. Khan, welcome visitor though he may be, ought to forget about advising Americans about security policies and stick to the Bollywood tradition of rescuing damsels in distress from a fate worse than death.

Announcement

Multiple readers have contacted us with inquiries about Jennifer Rubin’s recent absence from the blog. To all those concerned, please rest assured that Jennifer is merely on a much-deserved vacation, from which she will return soon to resume blogging.

CSM: Why Don’t Those Demented Jews Shut Up Already About the Holocaust

In the United States there’s an unwritten but valid rule that applies to debates about topical issues: the first person to cry “Nazi” is usually the one who loses. In discussions about Israel, the same thing ought to apply but doesn’t always, as leftist critics routinely throw the epithet around to describe any action of Israel’s they don’t like, such as routine measures of self-defense against active terror organizations. The point is, anyone who can’t speak about the Jewish state without invoking a comparison between it and the Nazis is an anti-Semite whose real goal is to delegitimize the Jews by smearing them with the tag of their greatest foe.

A variant of this ploy is sometimes heard not from anti-Semites but from Jews who are similarly put off by Israeli actions, if not by the political culture of the country itself. For them, it’s not that the Jews are Nazis but that they are so obsessed with what the Nazis did to them that they do bad things to others as well as to themselves. Seen from this point of view, Holocaust remembrance and the invocation of the spectacle of Jewish powerlessness in the face of malevolent evil have become evils in and of themselves, in that they feed Jewish paranoia.

This is the view of certain leftist authors, such as Tom Segev, who made it famous in his lamentable book The Seventh Million. Though his work has been widely discredited as both history and sociology, writers seeking to discredit Israeli policy trot out Segev’s ill-conceived pop-psychology theory every once in a while. It’s a technique that attempts to combine condescension with a touch of sympathy for the poor grief-deluded Jews.

The latest example comes from one Bill Glucroft, a self-styled digital journalist who wrote in the Christian Science Monitor today that the Holocaust is casting a shadow over Israel’s choices and “undermining its security” because paranoia about threats prevents Jews from simply ending the conflict with the Palestinians by “giving them a home.” This happens, he says, because “invoking the Holocaust is the way Israeli policymakers evade the difficult decision making needed to shift the status quo; nothing else matters, and anything is justified, when everything is about surviving annihilation—a rationale that serves especially well in delaying the creation of a Palestinian state.”

The main problem with this analysis is that Israel has been trying to hand the Palestinians a state on a silver platter for nearly 16 years, ever since the failed Oslo process began. Ehud Barak offered one to Yasser Arafat at Camp David in July 2000. Ehud Olmert tried to give it to Mahmoud Abbas last year. Though most Israelis have justified concerns about their security, given that only a few years ago the Palestinians launched a terror offensive designed to break them with random suicide bombings in restaurants, malls, and streets, the vast majority, including the prime minister of Israel, have long since agreed to the notion of a Palestinian state.

It’s the Palestinians who view any deal that recognizes the legitimacy of a Jewish state as anathema and keep turning down peace whenever the opportunity arises. Far from Israel “holding a key to a home for the Palestinians,” it’s the Palestinians themselves who need to rethink the notion that their national identity is linked to the destruction of Israel rather than to the building of a homeland.

Contrary to amateur psychologists like Glucroft, the vast majority of Israelis understand that they are not living in the Warsaw Ghetto. But they also understand that they are locked in a conflict with an adversary that views Israeli concessions as invitations for more terrorism (such as the withdrawal from Gaza in 2005). Most rightfully understand that the only condition under which Jews can live with such dangerous neighbors is a position of strength, but that doesn’t make them Holocaust head cases. It just means that, unlike Glucroft and others whose ideology blinds them to Palestinian realities, Israelis understand that they are living in 2009 and not in a mythical future where hostility to Zionism has ended.

Telling the Jews to shut up already about the Holocaust in order to falsify the present situation is a unique form of dishonesty. Such writers would do better to try convincing their Palestinian clients that their own paranoia and Jew-hatred ought to be junked if peace is to have a chance.
