Commentary Magazine


Topic: disease

History Made Visible

Humans are visual animals. Just as a dog, suddenly aware of the unexpected, turns his nose toward it and starts sniffing, we humans turn our eyes toward things to try to figure them out. Of course, many things are invisible because of distance, size, lack of light, or obstruction. Much of the history of technology has been about overcoming these problems, with telescopes, microscopes, infrared sensors, radar, television, etc.  Once we can see something clearly, we can usually figure it out.

But some things are just inherently not visible. Epidemics, for instance. So we humans, clever creatures that we are, have devised ways to make even them visible. When cholera broke out in London in 1854, no one had a clue as to why or how the disease was spreading. A physician named John Snow stuck a pin in a map of London to indicate the residence of everyone who had developed cholera. It was quickly evident that the cases were clustered tightly around a particular public well from which people were drawing water for household use. Close the well, said Snow, and the epidemic will end. Snow, widely considered the father of epidemiology, was right, and the epidemic quickly abated when authorities closed the well.

Statistics, too, are a way of making the inherently invisible visible because they can be converted into graphs and charts.  Add the power of computers and you can produce charts that border on the magical. Consider this one published in Business Insider. It charts the fertility rate (the number of babies born per woman) against life expectancy over the past 50 years for a large number of countries. Each country is represented by a circle, its size a function of that country’s population.
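
For readers curious how such a chart is put together, here is a minimal sketch in Python using matplotlib. This is not how the original interactive chart was produced; the country names and numbers are invented placeholders rather than the Business Insider data, and a real version would redraw one frame per year to produce the animated migration of the circles described below.

    import matplotlib.pyplot as plt

    # Hypothetical snapshot for a single year (placeholder values, not real data).
    countries = ["Country A", "Country B", "Country C"]
    life_expectancy = [52.0, 68.5, 79.1]   # years (x-axis)
    fertility_rate = [6.1, 3.2, 1.8]       # births per woman (y-axis)
    population = [12e6, 140e6, 60e6]       # determines circle size

    # Scale population down to a sensible marker area.
    sizes = [p / 1e5 for p in population]

    fig, ax = plt.subplots()
    ax.scatter(life_expectancy, fertility_rate, s=sizes, alpha=0.5)
    for name, x, y in zip(countries, life_expectancy, fertility_rate):
        ax.annotate(name, (x, y))          # the "click a bubble" label

    ax.set_xlabel("Life expectancy (years)")
    ax.set_ylabel("Fertility rate (births per woman)")
    ax.set_title("One frame of a Gapminder-style bubble chart")
    plt.show()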

The first thing you notice is that the fertility rate has been dropping sharply in most countries, while life expectancy has been rising equally sharply. The circles migrate toward the lower right of the chart over time to show this. But one of the large circles suddenly drops off a cliff beginning about 1970 as its fertility rate drops precipitously. How come? Click on the bubble and its name comes up: China and its one-child-per-family policy. And some countries suddenly reverse course, and their life expectancy collapses, moving their circles rapidly back toward the left of the chart. What is causing that? Click on the circles and the names of the countries come up: Rwanda, Cambodia, etc.

In other words, this chart makes history itself visible. Is that cool, or what?

The Left’s Ornery Adolescents

Why are those Americans who are most distrustful of the U.S. government, and so eager to undermine it, the same ones who are most desperate to give it control over their own lives? Michael Moore has made a big P.R. show of his pledge to pay Julian Assange’s bail. “WikiLeaks, God bless them, will save lives as a result of their actions,” he writes, and puts the U.S. government on notice: “You simply can’t be trusted.” Moore offers advice to those of us who see something wrong with Assange. “[A]ll I ask is that you not be naive about how the government works when it decides to go after its prey.” Right. Instead, you should be naïve about how government works when it decides to take control of your health care, regulate your business, and spend your earnings. Moore, you may have forgotten, calls for the U.S. government to provide “free, universal health care for life” for “every resident of the United States” and demands that “pharmaceutical companies … be strictly regulated like a public utility.” That’s the old anti–Big Brother spirit.

When men like Michael Moore are not calling for the government to be undermined and defied, they’re petitioning for it to chauffeur them to the movies, cook their meals, and tuck them into bed. One news cycle finds HBO’s Bill Maher telling America not to allow the government to inject “a disease into your arm” in the form of a vaccine and that “I don’t trust the government, especially with my health.” The next, he’s calling for “Medicare for all” and lamenting the absence of a fully government-run health-care system that would operate like the U.S. postal service.

At the Nation, progressive totem Tom Hayden penned an article titled “WikiLeaks vs. The Empire,” defending Assange on the grounds that “the closed doors of power need to be open to public review” and noting that “the American people might revolt if we knew the secrets being kept from us.” Oh, and about that secretive and untrustworthy “Empire”? Hayden wants to put it in charge of the health care of all Americans, naturally.

This paradoxical political posturing resembles nothing so much as middle-class adolescent rebellion. Troubled kids protest their parents’ dangerous values, their authoritarianism, their materialism, and the moral hypocrisy that keeps the whole farcical delusion afloat. But most of all, they protest the piddling allowance on which no self-respecting 13-year-old can be expected to keep himself in the latest combat-based video games, faddish clothes, and instantly gratifying gadgetry.

The troubled kids of the left distrust the extraordinary powers wielded by their leaders in the name of safety and well-being — but it’s also a real bummer that the government won’t assert more power to keep us safe and well.

The President Shouldn’t Listen to Counsels of Despair on Afghanistan

The best indication that things are going pretty well in Afghanistan: a sundry collection of “experts” is ready to raise the white flag. A motley crew of academics and journalists (including my frequent sparring partner Nir Rosen) has just released an open letter to President Obama claiming that “the situation on the ground is much worse than a year ago” and that “operations in the south of Afghanistan, in Kandahar and Helmand provinces, are not going well.” Yet a few paragraphs later, the authors switch gears, writing, “The military campaign is suppressing, locally and temporarily, the symptoms of the disease, but fails to offer a cure.” So which is it: is the campaign failing, or is it suppressing the Taliban?

Having just visited Afghanistan, where I had the opportunity to visit Helmand and Kandahar provinces, among others, I return convinced that the campaign is, in fact, going well and that it is suppressing the Taliban. Obviously, the Taliban have not given up the struggle, as witness, for example, the terrible suicide bombing of a new Afghan-American patrol base in Kandahar’s Zhare district — an attack that killed six American servicemen. But keep in mind that Zhare has for years been one of the Taliban’s strongholds. The fact that the Taliban are still able to carry out an occasional suicide bombing there is much less significant than the fact that the area is now dotted with American patrol bases. U.S. troops have in fact had considerable, though still incomplete, success in pushing the Taliban out of many of their southern redoubts.

Once progress on the ground is further along, it is likely that elements of the Taliban will be eager to stop fighting. At that point, peace talks may produce some results. But that time is not now. Which is why the authors of the open letter are so far off base when they call on President Obama “to sanction and support a direct dialogue and negotiation with the Afghan Taliban leadership residing in Pakistan.” The open letter claims that the “Taliban’s leadership has indicated its willingness to negotiate,” which is true in the sense that the Taliban are happy to negotiate the exit of international forces in the expectation that they will then take over the whole country and reimpose their fundamentalist dictatorship. But they have so far shown little willingness to negotiate on terms acceptable to most Afghans or to the international community, which has sponsored Afghanistan’s post-2001 experiment in democracy. In fact, when some elements of the Taliban leadership showed a willingness to make concessions, they were promptly locked up by the Pakistanis, who don’t want to see their proxies surrendering without their say-so.

The notion “that mediation can help achieve a settlement which brings peace to Afghanistan” is simply delusional — at least for now. For talks to have any success, General Petraeus needs more time to pursue his comprehensive counterinsurgency strategy. Thankfully, it appears as if he will get that time from the Obama administration — which is far too sensible to listen to the counsels of despair from this open letter.

A Response to John Derbyshire

In his post responding to George W. Bush’s op-ed on combating AIDS in Africa, John Derbyshire writes this:

The subsidizing of expensive medications (the biggest part of our AIDS-relief effort, though not all of it) in fact has long-term consequences more likely to be negative than positive. The high incidence of AIDS in sub-Saharan Africa is caused by customary practices there. What is needed is for people to change those customary practices. Instead, at a cost of billions to the U.S. taxpayer, we have made it possible for Africans to continue in their unhealthy, disease-spreading habits.

Perhaps the future of sub-Saharan Africa would be brighter if the people of that place changed some of their customs; but now, thanks to us, they don’t have to.

Here are a few facts that undermine Derbyshire’s case: (a) Africans have fewer sex partners on average over a lifetime than do Americans; (b) 22 countries in Africa have had a greater than 25 percent decline in infections in the past 10 years (for South African and Namibian youth, the figure is 50 percent in five years); and (c) America’s efforts are helping to create a remarkable shift in how boys in Africa view girls — reflected in a decline of more than 50 percent in sexual partners among boys.

So Derbyshire’s argument that our AIDS efforts are “more likely to be negative than positive” because they will continue to subsidize and encourage “unhealthy, disease-spreading habits” is not only wrong but the opposite of reality.

There is more. Derbyshire’s view might best be expressed as “the Africans had an AIDS death sentence coming to them.” But in Africa, gender violence and abuse are involved in the first sexual encounter up to 85 percent of the time. And where President Bush’s PEPFAR initiative has been particularly effective is in slowing the transmission of the disease from mothers to children. Perhaps Derbyshire can explain to us how exactly infants are complicit in their AIDS affliction. Or maybe he doesn’t much care if they are.

Let’s now turn to Derbyshire’s characterization that America is becoming the “welfare provider of last resort to all the world’s several billion people”: he is more than a decade behind in his understanding of overseas-development policy.

President Bush’s policies were animated by the belief that the way to save lives was to rely on the principle of accountability. That is what was transformational about Bush’s development effort. He rejected handing out money with no strings attached in favor of tying expenditures to reform and results. And it has had huge radiating effects. When PEPFAR was started, America was criticized by others for setting goals. Now the mantra around the world is “results-based development.” Yet Derbyshire seems to know nothing about any of this. That isn’t necessarily a problem — unless, of course, he decides to write on the topic.

Beyond that, though, the notion that AIDS relief in Africa is AFDC on a global scale is silly. We are not talking about providing food stamps to able-bodied adults or subsidizing illegitimacy; we’re talking about saving the lives of millions of innocent people and taking steps to keep human societies from collapsing. Private charity clearly wasn’t enough.

On the matter of Derbyshire’s claim that AIDS relief in Africa is unconnected to our national interest: al-Qaeda is actively trying to establish a greater presence in nations like Tanzania, Kenya, and Nigeria, which have become major ideological battlegrounds. And mass disease and death, poverty and hopelessness, make the rise of radicalism more, not less, likely. (Because of AIDS, in some countries nearly a half-century of public-health gains have been wiped away.)

Many things allow militant Islam to take root and grow; eliminating AIDS would certainly not eliminate jihadism. Still, a pandemic, in addition to being a human tragedy, makes governments unstable and regions ungovernable. And as one report put it, “Unstable and ungoverned regions of the world … pose dangers for neighbors and can become the setting for broader problems of terrorism … The impoverished regions of the world can be unstable, volatile, and dangerous and can represent great threats to America, Europe, and the world. We must work with the people of these regions to promote sustainable economic growth, better health, good governance and greater human security. …”

One might think that this observation very nearly qualifies as banal — but for Derbyshire, it qualifies as a revelation.

For the sake of the argument, though, let’s assume that the American government acts not out of a narrow interpretation of the national interest but instead out of benevolence — like, say, America’s response to the 2004 tsunami that hit Indonesia and other nations in the Indian Ocean. Why is that something we should oppose, or find alarming, or deem un-conservative? The impulse to act is, in fact, not only deeply humane but also deeply American.

In a speech in Lewiston, Illinois, in 1858, Abraham Lincoln, in quoting from the Declaration (“all men are created equal … endowed by their Creator with certain unalienable rights”), said:

This was their majestic interpretation of the economy of the Universe. This was their lofty, and wise, and noble understanding of the justice of the Creator to His creatures. Yes, gentlemen, to all His creatures, to the whole great family of man. In their enlightened belief, nothing stamped with the Divine image and likeness was sent into the world to be trodden on, and degraded, and imbruted by its fellows.

This belief about inherent human dignity does not mean that America can solve every problem in the world or that we shouldn’t focus most of our energy and treasure on America itself. But if the United States is able, at a reasonable cost ($25 billion over five years), to help prevent widespread death, that is something we should be proud of. (A recent Stanford study found that PEPFAR was responsible for saving the lives of more than a million Africans in just its first three years.)

Derbyshire seems to take an almost childish delight in advertising his indifference to the suffering of others, at least when the others live on a different continent and come from a different culture. Back in February 2006, when more than 1,000 people were believed to have died when an Egyptian ferry sank in the Red Sea, Derbyshire wrote:

In between our last two posts I went to Drudge to see what was happening in the world. The lead story was about a ship disaster in the Red Sea. From the headline picture, it looked like a cruise ship. I therefore assumed that some people very much like the Americans I went cruising with last year were the victims. I went to the news story. A couple of sentences in, I learned that the ship was in fact a ferry, the victims all Egyptians. I lost interest at once, and stopped reading. I don’t care about Egyptians.

Cultivating what Adam Smith (in The Theory of Moral Sentiments) called “sympathy” and “fellow feeling” is a complicated matter. Suffice it to say that very few of us care about the suffering and fate of others as much as we should. Yet most of us aren’t proud of this fact; we are, rather, slightly embarrassed by it. Not John Derbyshire. He seems eager to celebrate his callousness, as if it were a sign of manliness and tough-mindedness. I haven’t a clue whether this is a pose, done for shock value or some such thing, or real. All we can do is judge Derbyshire by his public words. And they are not only unpersuasive; they are at times downright ugly.

No Pop for the Poor

New York City’s mayor wants the federal government to say food stamps can’t be used to buy soda – a story that is less about the technicalities of welfare and more about political paternalism.

Now, there’s a strong argument to be made that if the government is setting the table and preparing the dinner, it should be able to choose the menu. But that argument is not being made; on the contrary, those who want soda omitted from the items obtainable by food stamps are making the link between health and public spending.

The New York Times ran an op-ed today by city and state health commissioners. In it, they point out that “some 57 percent of adults in New York City and 40 percent of children in New York City public schools are overweight or obese” and that “one in eight adult city residents now has diabetes, and the disease is nearly twice as common among poorer New Yorkers as it is among wealthier ones.”

Pay close attention to their following conclusion: “Obesity-related illnesses cost New York State residents nearly $8 billion a year in medical costs, or $770 per household. All of us pay the price through higher taxes.”

This story could be seen as some microscopic foreshadowing of what’s to come for everybody, not just for the surprisingly high number of food-stamp recipients — 1.7 million in New York City alone, or 35 percent of the city’s residents (who, by the way, will still be able to buy that soda on their own buck).

Granted, in New York City, two-thirds of the population does not rely on government to fill the pantry. But once everyone’s health care is a public-spending issue, it is logical to assume that, at least to some extent, private behaviors will be up for public scrutiny; they have become a public cost issue.

Never mind Tocqueville’s warning about the democratic danger of preferring comfort to freedom. For those willing to sacrifice some degree of liberty for a government that ensures their well-being, here’s a little reminder that paternalism isn’t always so comfortable. In addition to saying yes, it also says no sometimes.

A Mess of His Own Making

The non-peace talks are on hiatus while Mahmoud Abbas goes running to the Arab League for instructions. Elliott Abrams explains why we shouldn’t much care:

The sky is not falling. Israeli-Palestinian peace negotiations were suspended on Sunday, perhaps briefly and perhaps for months, after Israel’s 10-month moratorium on settlement construction expired. Palestinian officials said they would refuse to talk if construction restarted, and so they did. Yet war hasn’t broken out, nor will it. …

Also last week, Palestinian President Mahmoud Abbas reminded his people that “we tried the intifada and it caused us a lot of damage.” Hamas, the terrorist group that rules the Gaza Strip, can commit acts of terror at any time. But with Israeli and Palestinian officials working together to keep the peace, Hamas can’t create a general uprising.

Peace negotiations between Israel and the Palestine Liberation Organization (PLO) have been an on-again, off-again affair since they began with the Oslo Accords in 1993. During the Arafat years talks alternated with terrorism, for Arafat viewed both as useful and legitimate tactics. After the so-called second intifada of 2000-2001 and the 9/11 attacks, Israel’s then-Prime Minister Ariel Sharon ran out of patience with that game, as did President George W. Bush. From then on they worked to push Arafat aside.

As Abrams points out, the Bush team oversaw negotiations for five years, under the “in and up” but not “out” understanding on settlements:

The Obama administration junked that deal, and its continuing obsession with a settlement freeze—Mr. Obama mentioned it again at the U.N. last week—has cornered Mr. Abbas. The Americans are effectively urging him back to the table while making it impossible for him to get there. This diplomatic problem is what medical science calls “iatrogenic”: a disease caused by the physicians themselves.

Whether or not the parties return to the table, Abrams explains, it is important to keep our eyes on the real world. On the West Bank, economic progress continues. Security has improved. (“Most of this good news came, of course, during 18 months when there were no peace negotiations at all.”) So long as the Obami manage not to get in the way of all that, there is hope that one day there will be a Palestinian society that supports a peace deal. But not now. So let the diplomats shuttle. Or not.

The greatest danger right now is not to “peace” but to Obama’s prestige and credibility. And frankly, that’s an iatrogenic phenomenon, too. Or in common parlance, Obama has made his bed, and unless the Arab League and Bibi rescue him, he will be forced to lie in it.

LIVE BLOG: Sen. Tom Coburn for Tort Reform

Sen. Tom Coburn makes a strong case for tort reform, which he says could cut costs by 15 percent. That, he explains, will in turn increase access. He urges policies to promote disease prevention, argues that Medicaid has much higher rates of fraud than private plans, and makes the case for tackling lawsuit abuse, which spurs defensive medicine. He says we need to go where the money is. Obama says he’s adopted “all the good ideas” on fraud and abuse. But he hasn’t. He doesn’t back serious, real tort reform.

Flotsam and Jetsam

If you thought Obama was talking “We are the World” gibberish again to the “Muslim World,” you were right. He sort of seemed to be saying (if you get the plain English translation): “We’ll pull out of Iraq, soon and responsibly (is there any other way?); also, we’ll close our eyes and click our heels together three times and wish upon a star over and over again until Israelis and Palestinians reach Peace; in return you, in Afghanistan and beyond, will become modern, woman-respecting democrats because of our forged partnerships (and a few troops? Oh, never mind them!).” Read the whole thing, as they say.

Mickey Kaus reads the typically aggressive and hyper-partisan Obami’s invitation to Republicans to the health-care summit and finds: “Unsubtle subtext: We like our bill and the purpose of this meeting is to set things up so it can pass. … But what if, as a Republican, you don’t think we are ‘the closest … to resolving this issue in … nearly 100 years’? Maybe you don’t think the bill will resolve the issue at all! (I disagree, but I’m not a Republican.) … Even if Obama’s only trying to appear bipartisan, his aides are doing a mighty poor job of conveying that impression.”

Even Dana Milbank can figure out that the Washington blizzards were “an inconvenient meteorological phenomenon for Al Gore.” He writes: “In Washington’s blizzards, the greens were hoisted by their own petard. For years, climate-change activists have argued by anecdote to make their case. Gore, in his famous slide shows, ties human-caused global warming to increasing hurricanes, tornadoes, floods, drought, and the spread of mosquitoes, pine beetles, and disease.” He even concedes, “The scientific case has been further undermined by high-profile screw-ups. First there were the hacked e-mails of a British research center that suggested the scientists were stacking the deck to overstate the threat. Now comes word of numerous errors in a 2007 report by the UN’s Intergovernmental Panel on Climate Change, including the bogus claim that the Himalayan glaciers would disappear in 25 years.” Maybe Al Gore should give back the Oscar.

I suppose it’s not news when Harry Reid screws up a potential bipartisan deal and blindsides the White House. But, on his sinking of the bipartisan Senate bill, even the New York Times acknowledges that “it was a telling glimpse into the state of mind of rattled Senate Democrats.” And another reason why Reid’s defeat might be a very welcome development for his party.

There is an alternative to civilian trials for terrorists. And it’s legal and everything: “Sen. Lindsey Graham (R-S.C.) repeated his call Saturday for the Obama administration to try suspected terrorists in military tribunals. A former military lawyer himself, Graham said the tribunal system was well-equipped to handle delicate terrorism cases. . . . Graham was a main author of the Military Commission Act of 2009, which modified the tribunal system to align with a Supreme Court ruling.” Funny how none of the Obama spinners defending their handling of terrorists even mention the 2009 statute.

Politico asks “Why Cheney attacks?” The insiderish Beltway outlet can’t really be that dense, right? For starters, Cheney has been right and is in sync with the American people. And then the former VP does manage to get under the skin of the Obami and send them scrambling. (Politico might want to cut out the Stephen Walt and Keith Olbermann quotes — jeez — as well as the Beagle Blogger psychobabble if it wants to be taken seriously on these sorts of stories.)

Gov. Chris Christie earns plaudits: “As politicians spend America into the fiscal abyss, Republican Gov. Chris Christie has a novel idea: Freeze spending. For such statesmanship, watch him be demonized like no one before. . . New Jersey’s new governor, the successor of so many corrupt chief executives, is taking action that will make him, like Reagan, the focus of pure hate from those who think what taxpayers earn is Monopoly money to be treated according to the whims and desires of politicians, bureaucrats, union bosses and other power players.”

Not everyone (anyone?) is buying the itsy-bitsy-sanctions approach. (“Sanctions on the accounts of Iran’s Revolutionary Guard in WESTERN banks?”) Amitai Etzioni writes: “You can fool some people some of the time, but the Obama Administration credibility is melting faster than the snow in Washington.”

Iran: Arsonist and Firefighter

J.E. Dyer highlighted Iran’s new boldness all across the Middle East — and if I may weigh in, the Yemen situation looks like classic Iran: play the arsonist, then volunteer to be the fireman — for a small reward, naturally!

The spookiest bit of this latest twist of affairs is Washington’s response, as Jennifer notes. And when an official statement reads like this: “It’s our view that there can be no long-term military solution to the conflict between the Yemeni government and the Houthi rebels,” it almost looks like it came out of the EU.  Never shall there be a military solution to a conflict! A bit like saying, “There shall be no medical solution to a disease” — let the microbes and the antibodies negotiate their way to a compromise through the good offices of the United Nations. Let them receive an envoy from the EU! But no conflict. Nope.

I can picture the fear running down the spines of Iran’s Revolutionary Guards as they hear Washington’s tone quickly aligning itself with the discourse of those pugnacious Eurocrats.

You Want Domestic Policy?

John McCain is trying his best to shift from his single-minded focus on foreign policy to a broader agenda that will appeal to key independent voters. His topic this week is health care, about which he offered a detailed speech and a new ad.

His approach borrows from George W. Bush’s ill-fated healthcare plan (and from Rudy Giuliani’s as well). The basic idea is to shift from employer-based plans (in which the consumer/patient is not responsible for costs) to individually-purchased healthcare plans (where consumers will be in charge). By ending the employer benefit tax exemption and providing a tax credit instead, allowing interstate insurance purchases, and throwing in some tort reform, this proposal aims to decrease cost and increase availability.

But McCain has a ways to go if he’s going to sell it. One troubling aspect of the proposal is his GAP plan, which is in bad shape. It’s aimed at a vaguely defined pool of hard-to-insure and needy healthcare consumers, and it sounds like little more than an adjunct to Medicare and Medicaid. McCain says:

I will work with Congress, the governors, and industry to make sure that it is funded adequately and has the right incentives to reduce costs such as disease management, individual case management, and health and wellness programs. These programs reach out to people who are at risk for different diseases and chronic conditions and provide them with nurse care managers to make sure they receive the proper care and avoid unnecessary treatments and emergency room visits. The details of a Guaranteed Access Plan will be worked out with the collaboration and consent of the states.

Although he disclaims any intention to create a new entitlement program, he says that the GAP plan would put “reasonable limits on premiums, and assistance would be available for Americans below a certain income level.” That said, McCain’s plan is as market-based an approach as a politician who doesn’t want to risk running on a platform of “Buy your own darn insurance!” is going to offer.

George McGovern, Free Marketeer?

Who would have thought that George McGovern would write an eminently sane column on the dangers of government paternalism? It has gems like this:

Buying health insurance on the Internet and across state lines, where less expensive plans may be available, is prohibited by many state insurance commissions. Despite being able to buy car or home insurance with a mouse click, some state governments require their approved plans for purchase or none at all. It’s as if states dictated that you had to buy a Mercedes or no car at all.

He even looks at the unintended consequences of bans on “payday lending,” long a bogey-man of liberals, and finds the cure is worse than the disease. He concludes:

Since leaving office I’ve written about public policy from a new perspective: outside looking in. I’ve come to realize that protecting freedom of choice in our everyday lives is essential to maintaining a healthy civil society. Why do we think we are helping adult consumers by taking away their options? We don’t take away cars because we don’t like some people speeding. We allow state lotteries despite knowing some people are betting their grocery money. Everyone is exposed to economic risks of some kind. But we don’t operate mindlessly in trying to smooth out every theoretical wrinkle in life. The nature of freedom of choice is that some people will misuse their responsibility and hurt themselves in the process. We should do our best to educate them, but without diminishing choice for everyone else.

No, honest: George McGovern wrote that. Perhaps he could talk to Hillary Clinton and Barack Obama about their notions for solving the home mortgage crisis. I am sure he would advise Clinton of the unintended consequences of freezing rates on sub-prime loans, or warn Obama about the costs to taxpayers and consumers at large of a bail-out fund for affected homeowners.

All this raises a question: has McGovern become wiser as the years have passed, or has his party become dimmer? Perhaps both . . .

Caveat Emptor

Are there people out there who take Wikipedia seriously as a source of objective information? There shouldn’t be, but unfortunately there are. In fact, lots of students use it as a source of first resort. It’s so popular that whenever you type almost any subject into Google, the first hit is usually for a Wikipedia entry.

Yet disinformation abounds, often motivated by animus or prejudice. There is, for instance, the by-now famous story of a former assistant to Robert F. Kennedy who was brazenly—and completely without foundation—accused on Wikipedia of complicity in the assassinations of both JFK and RFK. (For this sorry tale, see his article.)

A friend has now called my attention to another bizarre distortion, this one an attempt to besmirch the character not of one man but of an entire country. If you look up the Philippine War (1899-1902), you get this entry. And in the very first paragraph you get this statement: “The U.S. conquest of the Philippines has been described as a genocide, and resulted in the death of 1.4 million Filipinos (out of a total population of seven million).”

I was pretty startled to read this. I have written a whole chapter on the war in my book, The Savage Wars of Peace, and I have never once heard that the U.S. was guilty of genocide. How could it have entirely escaped my attention?

There is, needless to say, not a scintilla of evidence that Presidents McKinley and Roosevelt made any attempt to wipe out the population of the Philippines. There is no doubt that a lot of Filipinos died in the course of the war, but most of those deaths were the result of disease, not American bullets. In my book, I cite the generally accepted casualty totals: 4,234 American dead and, on the other side, 16,000 Filipinos killed in battle and another 200,000 civilians killed mainly by disease and famine. My sources for these estimates are books written by William Thaddeus Sexton, an historian writing in the 1930’s, and two more recent accounts written by Stanley Karnow and Walter LaFeber. Neither Karnow nor LaFeber is exactly an American imperialist; in fact, both are well-known liberals. Yet their casualty counts are seven times lower than those claimed by Wikipedia, and they make no mention of any genocide.

Where does the Wikipedia figure come from? The footnote refers to an online essay, “U.S. Genocide in the Philippines” by E. San Juan Jr., posted on an obscure website. The author is described as follows: “E. San Juan, Jr. was recently Fulbright Professor of American Studies at the Katholieke Universiteit Leuven, Belgium, and visiting professor of literature and cultural studies at National Tsing Hua University in Taiwan, Republic of China.” Not exactly a pedigree that instantly screams out that he has any special expertise on the Philippine War.

In his short essay (1,046 words), E. San Juan Jr. concedes that his claims of genocide and of 1.4 million dead do not come from any mainstream sources. He writes: “Among historians, only Howard Zinn and Gabriel Kolko have dwelt on the ‘genocidal’ character of the catastrophe.” But even these ultra-left-wing “revisionist” historians (who also have no expertise in the Philippine War) have, in his telling, cited no more than 600,000 dead Filipinos.

So whence the figure of 1.4 million? According to Mr. San Juan, “The first Filipino scholar to make a thorough documentation of the carnage is the late Luzviminda Francisco in her contribution to The Philippines: The End of An Illusion (London, 1973).” I confess to never having heard of Ms. Francisco (whose works are cataloged online by neither the Library of Congress nor the New York Public Library), but Amazon does contain a link for one of her books. It’s called Conspiracy for Empire: Big business, corruption, and the politics of imperialism in America, 1876-1907 and it was published in 1985 by something called the Foundation for Nationalist Studies, which doesn’t have a web page (or at least none that I could discover).

I am, to put it mildly, underwhelmed by the historical evidence gathered here to accuse the U.S. of having killed 1.4 million people in an attempted genocide. This is not the kind of finding that would be accepted for a second by any reputable scholar, regardless of political orientation. But it is the kind of pseudo-fact that is all too common on the world’s most schlocky wannabe “encyclopedia.” Caveat emptor.

Learning to Love Anthrax

Who was behind the anthrax attacks of 2001? The FBI has still not solved the case and, at this rate, it probably never will. But even if we never solve the mystery, we were still taught a terrifying lesson in the perils of biological terrorism. We really do need to worry about biowarfare (BW) agents like anthrax and botulinum falling into the hands of groups like al Qaeda.

Or do we? Perhaps not as much as we think. Last year, Christian Enemark, a national-security expert at the Australian National University in Sydney, prepared a comprehensive evaluation, “Biological Attacks and the Non-State actor: A Threat Assessment.” Focusing on the use of salmonella bacteria by the Rajneesh cult in Oregon in 1984, the Aum Shinrikyo attacks in Japan in the early 1990’s, and the U.S. anthrax attacks, it offers a complete balance sheet of the pros and cons of using BW agents for terrorist purposes.

On the pro side from the terrorist’s point of view, one of the attractive features of using biological weapons is the effect on the “worried well.” Even small attacks, like the 2001 anthrax episode, which sickened seventeen people and killed five, play upon “the visceral human fear of infection” so that even a modest attack generates a huge impact:

People are acutely sensitive to the prospect of infection, as distinct from other health risks in everyday life such as smoking and high fat consumption. In addition, we tend to distinguish between types of infection based on the historical reputation of a disease, whether it is characterized by grotesque symptoms and/or high fatality rates, and how common or familiar it is. Cities function normally in the midst of a community-wide epidemic of regular influenza. But an outbreak of plague, responsible for the 14th-century Black Death in Europe, would be likely to cause widespread panic. And the highly lethal Ebola virus, although not easily transmissible between humans, inspires particular dread because it causes massive hemorrhaging in its victims. New or unfamiliar diseases can also engender a level of fear out of proportion to the threat they pose, in morbidity and mortality terms, relative to other diseases. This is demonstrated by the panic reaction to SARS in 2003. In China alone, over 100,000 people die each year from tuberculosis, and there are projected to be 10 million Chinese with HIV/AIDS by 2010. Yet SARS, which ultimately resulted in fewer than 400 Chinese deaths, generated dread to an extent vastly disproportionate to the disease’s ability to kill.

On the other side of the ledger–and I have drawn only small snippets from Enemark’s comprehensive accounting–if biological weapons can easily cause terror, they cannot readily be employed by non-state actors to kill on a mass scale.

Enemark cites research showing that cult-like organizations “may be the least suited to meet the complex requirements for a BW program.” The Aum Shinrikyo case, in particular, “illustrates that a paranoid, fantasy-prone, and sometimes violent atmosphere [inside a cult] is not conducive to the sound scientific judgments needed to produce and weaponize biological agents.”

Aum’s leaders reinforced the cult’s doctrines among members through the use of physical isolation, beatings, physical torture, and the administration of hallucinogenic drugs such as LSD. As an organization, Aum was also fickle by nature and inclined to embark on numerous expensive, and sometimes bizarre, ventures rather than concentrate on perfecting a particular weapon. Its activities in pursuit of producing mass casualties included an expedition to acquire the Ebola virus during an outbreak in Zaire in October 1992. Aum also attempted to build a high-power laser weapon and sought a device for generating earthquakes.

Like Aum, al Qaeda is arguably “paranoid” and “fantasy-prone,” and its scientists–if it currently has any in its ranks–would definitely have to operate inside a violent environment. Even though al Qaeda has been able to plan some terrorism spectaculars, as on 9/11, its skills inside a bio-warfare lab might well lag.

Of course, states (like Saddam Hussein’s Iraq) face far fewer obstacles in developing biological weapons, and Enemark examines the possibility that a Saddam-like regime would provide a toxic agent to a group like al Qaeda. But he concludes that “any state anxious for its own survival would be most unlikely to entrust a BW capability to ruthless outsiders.”

Is Enemark right? Are we worrying too much about BW when we should be worrying about other things? However one answers that question, this is a paper deserving close attention.

Debating Cancer

Several of the Democratic presidential candidates gathered yesterday in Iowa for a debate and forum on cancer sponsored by the Lance Armstrong Foundation. Armstrong, a cancer survivor and advocate for research, began the event by telling the crowd that “the next occupant of the Oval Office must discuss this critical issue with voters.”

But as the forum went on, it became increasingly difficult to see what exactly there is to discuss. Of course everyone agrees that cancer treatment and research are critically important. Cancer in its various forms kills more Americans than any other disease (having surpassed heart disease for that top spot in 2005). And cancer research receives about $5.5 billion a year in funding from the National Institutes of Health, far more than is allocated to any other disease and about 25 percent more than was spent on cancer research in 2001.

So what is there for candidates to say? “Even more money” is the easy refrain: both John Edwards and Hillary Clinton promised to double the budget of the NIH (which currently stands at $28 billion). But throwing money at medical research can have unintended consequences: the doubling of the NIH budget between 1998 and 2003 offered many cautionary lessons about the hidden dangers of rapid growth in research funding. Beyond funding, the candidates proposed ever more extreme ways to limit smoking (Edwards, while acknowledging he wasn’t sure it would be constitutional, endorsed a national ban on smoking in public places).

The bulk of the debate, however, was taken up with arguments about a federally financed single-payer health care system. Not arguments pro and contra, mind you, but arguments about just how far to go and how to get there—with John Edwards again taking the cake by promising he wouldn’t even let the insurance and drug companies be involved in the development of his system. A serious moderator (i.e., not Chris Matthews) might have brought up the comparatively dismal figures for cancer treatment in several developed nations that have universal health care systems. But Matthews just let the aimless arguments continue.

Meanwhile, in the real world of cancer research, serious progress is being made, in no small part because of the work of those evil pharmaceutical companies. Cancer death rates have declined by about 1.5 percent per year for the last fifteen years, and cancer is slowly becoming more like a chronic disease than a swift killer. Much work remains, but, as yesterday’s forum made clear, it’s not the work of politicians.

EEOC Meets CIA

John Lehman, secretary of the Navy in the Reagan administration, served on the 9/11 commission. He makes an important point about it in yesterday’s Washington Post, noting that its recommendation that we establish a secure system of identification cards—a vital measure of self-defense in view of how the 9/11 terrorists infiltrated our society—was enacted but remains unfulfilled.

Lehman goes on to say that the commission’s other 40 “nonpolitical and non-ideological” recommendations—some enacted, some not—“continue to stand the test of time.” All of them, he stresses, were and are “achievable in the real world.”

Forgive me, but I have serious doubts. Yes, all of the recommendations were achievable in the real world. That is precisely one of the problems. One new measure in particular—making the CIA director subordinate to a Director of National Intelligence (DNI), as Congress has in fact done—has only served to graft a new layer of bureaucracy over agencies, like the CIA itself and the FBI, that were dysfunctional in the first place and in need of fundamental reform if not outright reconception.

John Negroponte, our country’s first DNI and an immensely versatile and talented public servant, gave up the position to become Condoleezza Rice’s deputy, a significant step down. One can only wonder why. Since this past February, John McConnell has been wrestling with the job. One can only wish him well.

The 9/11 commission was certainly correct that the old order was profoundly flawed. Indeed a long line of CIA directors recognized the contradictory limitations of their position, which seemed to grant them control over the entire U.S. intelligence effort but actually did not. James Schlesinger, who ran the agency for a spell, noted way back in 1971, in a top-secret memo to Richard Nixon, that a series of Presidents had exhorted directors of the CIA “to play the role of [intelligence] community leader and coordinator, but [their] authority over the community has remained minimal.”

But the 9/11 commission’s remedy may well prove to be worse than the disease it was meant to cure. The staff of 1,500 or so employees who now report to the DNI are no doubt among the best and the brightest in the intelligence community. But is this a virtue or a vice? Top talent has been drawn away from the task of actually collecting and interpreting intelligence and into the job of bureaucratic coordination.

What is more, the office of the Director of National Intelligence—the ODNI—is inexorably taking on many of the dysfunctional characteristics of the agencies beneath it, including a seemingly ineradicable preoccupation with affirmative action. As DNI, Negroponte certainly had his hands full with this issue. Even while working tirelessly to avert a second 9/11, he also felt compelled to toil hand in glove with an interagency body called the Diversity Senior Advisory Panel for the Intelligence Community (DSAPIC) to understand “the causes of the under-representation of women, minorities, and persons with disabilities” in the intelligence community. This body came up with a plan, Diversity: A National Security Imperative for the Intelligence Community, that made Negroponte confident that the intelligence community would reach its “goal of a work force that looks like America.”

Never mind that a far more urgent “national-security imperative” would be to have an intelligence community that looks not like America but like our key intelligence targets, including Iran or North Korea or Lebanon, where we are flailing around in the dark. Under John McConnell the mindless commitment to diversity evidently persists. It could not be an accident that on the ODNI’s organizational chart the “Equal Employment Opportunity and Diversity Officer” occupies one of the most prominent spots, positioned on the same line as the director himself.

One would have hoped that the top of the chart would have been occupied by a benignly titled slot like “director of special projects,” whose real job would be to think through how to identify and apprehend home-grown jihadists. A most important fact to bear in mind is that it was not the ODNI or the CIA or the FBI that broke the plot now brought to an end in New Jersey, but a sharp-eyed video-rental clerk.

To apply for a position at the ODNI, click here.

To fight terrorism, click here.

The “Emergencies” of the Stem-Cell Debate

The Washington Post’s Dana Milbank offers a portrait of the debate on stem-cell research that took place in the Senate over the last two days. He notes the extent to which Senators, especially those working to remove the boundaries governing federal funding of embryo research, focused on sad, often quite touching stories of illness and suffering in their own lives and those of their families and friends.

This makes sense, of course, since the debate was about medical research. But on the other hand, it does raise the question of exactly what case those stories were intended to make. The stem-cell debate is not about whether our country should support medical research—there is an absolute consensus on that point. The federal government spends about $30 billion on such research through the National Institutes of Health each year. The debate is not even about whether to support stem-cell research. The federal government has spent about $3 billion on various forms of stem-cell research since 2001, including more than $130 million on embryonic stem-cell research.

But that money has been spent under the constraints of President Bush’s embryonic stem-cell funding policy, which says only research that uses lines of cells created before the policy was enacted can be funded. Those created from embryos destroyed after the policy came into effect are not eligible. The idea is to prevent taxpayer dollars from offering an incentive to destroy human embryos–which is precisely what the bill the Senate passed last night (by a vote of 63 to 34, falling short of the margin needed to override a veto) would do.

What’s wrong with such research? The President believes (as do I) that human embryos—human beings in their earliest developmental stages—should not be treated as raw materials for scientific experimentation. America’s commitment to equality requires us to treat all human beings with at least that minimal regard. (I laid out this case on the New York Times op-ed page a few months ago.) Others believe that human embryos are not worthy of that degree of regard or protection, because they’re not developed enough, or large enough, or possessed of the capacity for cognition or pain.

These are legitimate arguments about the nature of the human embryo and the appropriate attitude toward it. But what bearing does a Senator’s story about his neighbor’s diabetes have on the argument? The point of these stories in the stem-cell debate is to argue not that one approach or another is ethical, but that the fact that so many of us and our loved ones suffer from serious ailments should cause us to put ethics aside. It is a case for approaching medical research—and indeed the very fact of illness, if not of death itself—with what might best be called a crisis mentality. Our illnesses, our deaths, are an emergency, and until the emergency is over we can’t bother with abstractions about equality and dignity.

This, too, is not a senseless attitude. But it is deeply misguided and dangerous. The “emergency” will never be over. Disease and death will haunt us always. We are right to struggle against them, and modern medicine offers us some formidable tools in that effort, which we ought to use and develop further. But we must do so with a sense that medicine is a science of postponement and an art of delay, not a crusade for final victory over death. That means the mission of medicine is permanent, not temporary. And that in turn means modern medicine must find its place in everyday life, rather than insist that we treat everyday life as an emergency that requires the suspension of moral and ethical rules. If we can have no recourse to ethics until we’re done fighting disease, then we can have no ethics at all.

The Aging Society

An analysis of government demographic data published this week reveals an astonishing spike in the number of Americans stricken with Alzheimer’s disease—an increase of 10 percent in just five years. The chief reason is one we could hardly regret: the great success of modern medicine.

The incidence of Alzheimer’s increases sharply with age, and many more Americans are living into their seventies, eighties, and nineties than ever have before. Among those fortunate enough to make it past age eighty-five, a whopping 50 percent are afflicted with Alzheimer’s or a similar form of dementia. The baby boomers are the first American generation to have lived their entire lives truly in the age of modern medicine (which we might roughly define as beginning with the introduction of penicillin into general use). That has made them the healthiest generation to date, and will surely make them the longest lived. It will also mean that for an enormous number of American families, the coming decades will be shaped by the contours of the slow mental (and eventually physical) decline brought on by dementia.

The advance of the disease can be emotionally excruciating for the patient’s loved ones, as the afflicted person is slowly lost to them but is still very much with them. The disease can also carry enormous economic costs for the families involved, since in middle and later stages patients often need constant and intense care.

We are thoroughly unprepared for the scale of the demographic shift to come, as our society (on average) grays in the coming years. Families will learn to cope, and the nation will find ways to afford the added burdens, but it will take time and it won’t be pleasant at first, as a whole set of complex emotional, ethical, and economic challenges rush at us.

Some have begun the work of thinking through these challenges. The President’s Council on Bioethics, for instance, released a report on the issue in 2005. Leon Kass (the Council’s former chairman) and Eric Cohen also wrote a powerful article on some related questions in the January 2006 issue of COMMENTARY. Many others are at work as well. But it’s fair to say that policymakers and the bulk of the public still have no idea what’s coming, and how the life of every American family will be affected.

Here’s a hint for anyone hoping to run for President ten years from now: learn everything you can about long-term care.

From COMMENTARY: Health Care in Three Acts

As President Bush prepares to address the issue of health care in his State of the Union address, COMMENTARY is fortunate to have a trenchant analysis of the wider problem, “Health Care in Three Acts,” by Eric Cohen and Yuval Levin, coming out in the February issue. Here is an advance look.

Americans say they are very worried about health care: on generic lists of voter concerns, health issues regularly rank just behind terrorism and the Iraq war. And politicians are eager to do something about it. To empower consumers, the White House has advanced the idea of Health Savings Accounts; to help the uninsured, it has explored using Medicaid more creatively. Senator Edward Kennedy of Massachusetts, the Democrats’ leader on this issue, has backed “Medicare for all.” The American Medical Association has called for tax credits to put private coverage within reach of more Americans. A number of recent books have proposed solutions to our health-care problems ranging from socialized medicine on the Left to laissez-faire schemes of cost containment on the Right. In Washington and in the state capitals, pressure is building for serious reforms.

But what exactly are Americans worried about? Untangling that question is harder than it looks. In a 2006 poll, the Kaiser Family Foundation found that while a majority proclaimed themselves dissatisfied with both the quality and the cost of health care in general, fully 89 percent said they were satisfied with the quality of care they themselves receive. Eighty-eight percent of those with health insurance rated their coverage good or excellent—the highest approval rating since the survey began 15 years ago. A modest majority, 57 percent, were satisfied even with its cost.

Evidently, though, this widespread contentment with one’s own lot coexists with concern on two other fronts. Thus, in the very same Kaiser poll, nearly 90 percent considered the number of Americans without health insurance to be a serious or critical national problem. Similarly, a majority of those with insurance of their own fear that they will lose their coverage if they change jobs, or that, “in the next few years,” they will no longer be able to afford the coverage they have. At least as troubling is what the public does not seem terribly bothered about—namely, the dilemmas of end-of-life care in a rapidly aging society and the exploding costs of Medicare as the baby-boom generation hits age sixty-five.

All of this makes it difficult to speak of health care as a single coherent challenge, let alone to propose a single workable solution. In fact, America faces three fairly distinct predicaments, affecting three fairly distinct portions of the population—the poor, the middle class, and the elderly—and each of them calls for a distinct approach.

For the poor, the problem is affording coverage. Forty-six million Americans were uninsured in 2005, according to the Census Bureau. This is about 15.9 percent of the population, which has been the general range now for more than a decade, peaking at 16.3 percent in 1998.

But that stark figure fails to convey the shifting face and varied make-up of the uninsured. On average, a family that loses its coverage will become insured again in about five months, and only one-sixth of the uninsured lack coverage for two years or more. In addition, about a fifth of the uninsured are not American citizens, and therefore could not readily benefit from most proposed reforms. Roughly a third of the uninsured are eligible for public-assistance programs (especially Medicaid) but have not signed up, while another fifth (many of them young adults, under thirty-five) earn more than $50,000 a year but choose not to buy coverage.

It is also crucial to distinguish between a lack of insurance coverage and a lack of health care. American hospitals cannot refuse patients in need who are without insurance; roughly $100 billion is spent annually on care for such patients, above and beyond state and federal spending on Medicaid. The trouble is that most of this is emergency care, which includes both acute situations that might have been prevented and minor problems that could have been treated in a doctor’s office for considerably less money. The real problem of the uninsured poor, then, is not that they are going without care, but that their lack of regular and reliable coverage works greatly to the detriment of their family stability and physical well-being, and is also costly to government.

For the middle class, the problem is different: the uncertainty caused in part by the rigid link between insurance and employment and in part by the vicissitudes of health itself. America’s employment-based insurance system is unique in the world, a product of historical circumstances and incremental reforms that have made health care an element of compensation for work rather than either a simple marketplace commodity or a government entitlement. This system now covers roughly 180 million Americans. It works well for the vast majority of them, but the link it creates between one’s job and one’s health coverage, and the peculiar economic inefficiencies it yields, result in ever-mounting costs for employers and, in an age of high job mobility, leave many families anxious about future coverage even in good times.

The old, finally, face yet another set of problems: the steep cost of increasingly advanced care (which threatens to paralyze the government) and the painful decisions that come at the limits of medicine and the end of life. Every American over sixty-five is eligible for at least some coverage by the federal Medicare program, which pays much of the cost of most hospital stays, physician visits, laboratory services, diagnostic tests, outpatient services, and, as of 2006, prescription drugs. Established in 1965, Medicare is funded in part by a flat payroll tax of 2.9 percent on nearly every American worker and, beyond that, by general federal revenue. Most recipients pay only a monthly premium that now stands at $88.50, plus co-payments on many procedures and hospital stays.

But precisely because Medicare is largely funded by a payroll tax, it suffers acutely from the problems of an aging society. In 1950, just over 8 percent of Americans were over sixty-five. Today that figure stands at nearly 15 percent, and by 2030 it is expected to reach over 20 percent, or 71 million Americans. Moreover, the oldest of the old, those above the age of eighty-five, who require the most intense and costly care, are now the fastest growing segment of the population; their number is expected to quadruple in the next half-century.

For Medicare, therefore, just as for Social Security, the number of recipients is increasing while the number of younger workers to pay the bills is declining. But Medicare faces a greater danger still. Its costs are a function not only of the number of eligible recipients but of the price of the services they use. Over the past few years, health-care spending in America has increased by about 8 percent each year, most steeply for older Americans who have the most serious health problems. As these costs continue to rise much faster than the wages on which Medicare’s funding is based, the program’s fiscal decline will be drastic, with commensurately drastic consequences for the federal budget.
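To get a rough sense of how quickly that gap compounds, consider a back-of-the-envelope illustration. The 8 percent cost-growth figure is the one cited above; the 4 percent wage-growth figure is assumed here purely for the sake of the arithmetic:

$$(1.08)^{25} \approx 6.8, \qquad (1.04)^{25} \approx 2.7$$

On those assumptions, after twenty-five years the program’s costs would have grown nearly sevenfold while its wage base had not quite tripled, leaving outlays roughly two and a half times larger, relative to the payroll that funds them, than they are today.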

Three different “crises,” then, each of a different weight and character. The crisis of the uninsured, while surely a serious challenge, has often been overstated, especially on the Left, in an effort to promote more radical reforms than are necessary. The crisis of insured middle-class families has been misdiagnosed both by the Right, which sees it purely as a function of economic inefficiency, and by the Left, which sees it as an indictment of free-market medicine. And the crisis of Medicare has been vastly understated by everyone, in an effort to avoid taking the painful measures necessary to prevent catastrophe. In each case, a clearer understanding may help point the way to more reasonable reforms.

In the case of the uninsured, the best place to begin is with the solution most frequently proposed to their plight: a government-run system of health care for all Americans.

Under such a system—which exists in some form in most other industrialized democracies—the government pays everyone’s medical bills, and in many cases even owns and runs the health-care system itself. The appeal of this idea lies in its basic fairness and simplicity: everyone gets the same care, from the same source, in the same way, based purely on need. In one form or another—actual proposals have varied widely, with Hillary Clinton’s labyrinthine scheme of 1993 merely the best known of many—this “single-payer” model remains the preferred health-care solution of the American Left. But it is ill-suited to the actual problems of America’s uninsured, and adopting it would greatly exacerbate other problems as well.

Everywhere it has been tried, the single-payer model has yielded inefficient service and lower-quality care. In Britain today, more than 700,000 patients are waiting for hospital treatment. In Canada, it takes, on average, seventeen weeks to see a specialist after a referral. In Germany and France, roughly half of the men diagnosed with prostate cancer will die from the disease, while in the United States only one in five will. According to one study, 40 percent of British cancer patients in the mid-1990’s never got to see an oncologist at all.

Such dire statistics have in fact caused many Western democracies with single-payer systems to turn toward market mechanisms for relief. The Swedes have begun to privatize home care and laboratory services. Australia now offers generous tax incentives to citizens who eschew the public system for private care. To send a message to the government, the Canadian Medical Association recently elected as its president a physician who runs a private hospital in Vancouver, a practice that is actually illegal in Canada. “This is a country in which dogs can get a hip replacement in under a week,” the new president told a newspaper interviewer, “while humans can wait two or three years.”

Defenders of the single-payer concept often point out that, despite patient complaints about the quality of care, overall measures of health in countries with such systems are roughly equivalent to those in America. That may be so, but the chief reason lies in social and cultural factors—crime rates, diet, and so forth—that make life in many other Western nations safer and healthier than life in America, and that would not be altered by a single-payer health system. Besides, citizens in those other nations benefit enormously from medical innovations produced and made possible by America’s dynamic private market; if that market were hobbled by a European-style bureaucracy, their quality of care would suffer along with ours.

And quality of care, it is important to remember, is one thing that most Americans are happy with. Any reform that promises to replace immediate access to specialists with long waiting lines, or the freedom to choose one’s own doctor with restrictive government mandates, is certain to evoke deep hostility, and thereby to cut into public support for efforts to help the uninsured.

On this score, proponents of socialized medicine would do well to consult the cautionary example of the health-maintenance organization (HMO). HMO’s are insurers who contract directly with providers, often for a flat fee, reviewing physician referrals and medical decisions in order to prevent unnecessary procedures or expenses. By the mid-1990’s, this capacity for cost-containment had made HMO’s very attractive to policy-makers and families alike. And they delivered on their cost-cutting promise. In those years, as David Gratzer notes in his recent book The Cure (Encounter, 325 pp., $25.95), private health-care spending per capita grew by just 2 percent annually (today the figure is nearly 10 percent, though the reasons for this, as we shall see below, go beyond just the decline of HMO’s).

But the public soon chafed under the authoritarian character of a system in which case managers were entrusted with decisions that often seemed arbitrary, while doctors resented having their medical judgment questioned by bureaucrats. Participation soon declined, and HMO’s themselves began to take on the characteristics of traditional insurance plans. By the middle of this decade, they had joined the bipartisan list of stock American villains: in the 2004 presidential campaign, President Bush accused Senator John Kerry of getting “millions from executives at HMO’s,” while Kerry pledged to “free our government from the dominance of the lobbyists, the drug industry, big oil, and HMO’s—so that we can give America back its future and its soul.”

In a single-payer government system, everything Americans dislike about HMO’s would be worse: rationing, top-down control, perverse incentives, and, for patients, very little say. As has happened in Europe, a single-payer approach would also turn health-care costs entirely into government costs, grossly distorting public spending and threatening to crowd out other important government functions. The result would be a political, fiscal, and social disaster.

There is a better way to assist the uninsured: not universal government health care but universal private insurance coverage. Such an effort could begin by identifying the populations in need. Those who are uninsured by their own choice could be offered incentives to purchase at least some minimal coverage, or be penalized for failing to do so. Those who cannot afford insurance could be given subsidies to purchase private coverage based on their level of income, and then pooled into a common group to give them some purchasing power and options. Their coverage would still not equal that available to people in the most generous employer-based plans, but it would offer reliable access to care without destroying the quality and flexibility of the American system.

Although such a plan might not be cheap, it would not be nearly so expensive or complex as a single-payer system. The money for it could be taken, in part, from Medicaid funds now used to pay doctors and hospitals for care already provided to the uninsured, with such “uncompensated-care” programs gradually transformed into a voucher system for purchasing private coverage. But though it might rely on some federal dollars, the reform itself would best be undertaken and managed at the state level. After all, health insurance is regulated by the states, Medicaid is largely managed by the states, and different states face different challenges and possess different resources.

In Massachusetts and Florida, ideas like these are already being tested, although it is too early to judge the results. The federal government can help other states try this more practical approach by clearing away regulatory obstacles and by providing incentives for experiments in creative reforms.

This brings us to the health-care anxieties of middle-class Americans. Although these concerns are in most respects much less pressing than those of the poor, they are real enough. Middle-class families are, besides, the heart and soul of America’s culture and economy, as well as the essential political force for any sober assessment and improvement of America’s health-care system.

Generally speaking, the worries expressed by these Americans stem from the peculiarities of our employer-based insurance market. It is, indeed, a very odd thing that more than 180 million Americans should be covered by insurance purchased for them by their employers. The companies we work for do not buy our food and clothing, or our car and home insurance. They pay us for our labor, and we use that money to buy what we want.

No less odd is the character of what we call health insurance. Insurance usually means coverage for extreme emergencies or losses. We expect auto insurance to kick in when our car is badly damaged in an accident, not when we need a routine oil change; homeowner’s insurance covers us after a fire, flood, or break-in, not when we need to repair the deck or unclog the gutters. But when it comes to health, we expect some element of virtually every expense to be covered, including routine doctor checkups and regular care.

America’s insurance system is largely a historical accident. During World War II, the federal government imposed wage controls on American employers. No longer able to raise salaries to compete for employees, companies turned instead to offering the lure of fringe benefits, and the era of employer-based health care was born. Thanks to a 1943 IRS ruling allowing an exemption for money spent by employers on health insurance, an enormous tax incentive was created as well. Rather than giving a portion of every dollar to the government, employees could get a full dollar’s worth of insurance through their company.

Of course, wage controls are long gone, but the system they inadvertently created, including the tax exemption, remains in place. Although this system has served most Americans very well, it has two significant drawbacks. First, by forging a tight link between one’s job and one’s health insurance, it makes losing a job, or changing jobs, a scary proposition, especially for parents. Second, it lacks any serious check on costs. Because insurance often pays the bulk of every single bill (instead of kicking in only for emergencies or extreme expenses), most American families do not know, or attend to, the actual cost of their health care.

Any car owner can tell you the price of a gallon of gas or an oil change. But what is the price of knee surgery? Or even a regular doctor’s visit? Does one hospital or doctor charge more than another? Most patients pay only a deductible that, while often not cheap, bears almost no relation to the price of the service they receive. As a result, they do not behave like consumers, shopping for the best price and thereby forcing providers to compete for their dollar.

Inured to such issues, families worry most about the lack of portability of their insurance, leaving it to economists to worry about the distorting effects of price inefficiencies. To gain the support of middle-class parents, any reform to the system would therefore need to address the former issue first.

Policy-makers on the Left have tended to understand this, but have over-read the anxiety of families, seeing it as a broad indictment of America’s free-market health care. They have thus offered the same bad solution to the problems of the insured as they do to the problems of the uninsured: a government-run system that will replace our present one. As for conservative policy-makers, they sometimes tend to overlook the concerns of middle-class families altogether, focusing on inefficiency before portability.

The conservative health-care solution of the moment is the health savings account, or HSA. It has two components: a savings account to which individuals and employers can make tax-free contributions to be drawn on exclusively for routine health-care costs, and a high-deductible insurance plan to help pay for catastrophic expenses.

Since individuals can take their HSA’s with them when they change jobs (provided the new employer allows it), this option can indeed help promote insurance portability. But, generally speaking, that is neither its foremost aim nor its effect. Instead, it is seen by its proponents as helping to level the playing field by giving to individuals the same tax breaks that employers get in purchasing coverage, and as helping to train people to think like consumers, since in spending their own money they will have an incentive to spend as little of it as possible. In short, proponents of the HSA want to use market mechanisms to achieve lower costs and improved quality.

This is certainly a worthy goal—but does it meet the concerns of most Americans? David Gratzer, an advocate of the HSA, tells the story of a woman who used such an account in exactly the desired way. Needing foot surgery, and impelled to spend her own money wisely, she

took charge of the situation and thought about what she really needed. When a simple day-surgery was suggested, she looked around and decided on a local surgery center. She asked about clinic fees and offered to pay upfront—thereby getting a 50-percent discount. When she found out that an anesthetist would come in specifically to do the foot block, she asked her surgeon just to do it. She also negotiated the surgeon’s compensation down from $1,260 to $630. Finally, she got a prescription from her doctor for both antibiotics and painkillers, but only filled the former. “In the past, my attitude would have been, ‘just have all the prescriptions filled because insurance was paying for it, whether or not I need them.’”

Although Gratzer offers this as an ideal example, it will surely strike many people as a nightmare. Haggling with doctors, ignoring prescriptions, bypassing a specialist to save money—is this the solution to middle-class health-care worries? Who among us feels confident taking so much responsibility for judgments over his own health, let alone over the care of his children or his elderly parents?

If the HSA is to have wide appeal, it must be sold first and foremost as a means not of efficiency but of portability—and as part of a broader effort to expand the portability of health insurance generally. Nor should such an effort be aimed, at least at first, at undoing our employer-based system. Perhaps, given a blank slate, no sensible person would ever have designed the current system. But we do not have a blank slate. We have a system providing care that the vast majority of insured Americans are quite happy with—and that has also helped America resist the pressure for government-run health care of the kind for which every other developed nation is now paying a heavy price.

We have, in other words, a system that works but is in need of repairs, most notably in the realm of improved portability. Making this happen will require better cooperation between state and federal policy-makers. An exclusively national solution would require federalizing the regulation of health insurance, which is both undesirable and politically unachievable. Instead, states should be encouraged to develop insurance marketplaces like the one now taking shape in Massachusetts. Mediating between providers and purchasers, these would allow employers, voluntary groups, and individuals to select from a common set of private options. Whether working full-time, part-time, or not at all, individuals and families could choose from the same menu of plans and thus maintain constant coverage even as their job situations or life circumstances change. For those who cannot afford insurance and do not receive it from an employer, Medicaid dollars could be used to subsidize the purchase of a private plan.

The federal government, meanwhile, could ensure that Medicaid dollars allotted to states can be used to support such a structure of subsidies. It could also pursue other, smaller measures, like extending or eliminating the time limit on the COBRA program, which allows individuals leaving a job to keep their employment-based plan by paying the full premium. As states begin implementing marketplace reforms, the federal government could also find ways to encourage regional and eventually national marketplaces, which would enable the purchase of insurance across state lines.

In any such scheme, Health Savings Accounts would surely have a place. So would other measures of cost containment like greater price transparency. But the key to any large reform must be its promise to address the real worries of insured American families by preserving what is good about the current system while facing up to its limits and confronting its looming difficulties.

Unfortunately, when it comes to paying for the health care of older Americans, there are few attractive options. Costs have risen steeply in recent years, while the economic footing of the Medicare program has been steadily eroding. Nor are demographic realities likely to change for at least a generation; to the contrary, they may only worsen. So the solution must involve some form of cost containment.

This will not be easy. As Arnold Kling points out in Crisis of Abundance (Cato Institute, 120 pp., $16.95), costs are rising not because of increasing prices for existing medical services but because of a profound transformation in the way medicine is practiced in America. Between 1975 and 2002, the U.S. population increased by 35 percent, but the number of physicians in the country grew by over 100 percent. The bulk of these were specialists, whose services cost a great deal more than those of general practitioners. New technologies of diagnosis (like MRI exams) have also become routine, and not just for the old, and the number and variety of treatments, including surgeries, have likewise increased. We spend more because more can be done for us.

All of this spells heavier demands on the Medicare budget, to the point where the program’s fiscal prospects have become very bleak. Already accounting for roughly 15 percent of federal spending, Medicare will be at 25 percent by 2030 and growing. In David Gratzer’s words, “Medicare threatens to be the program that ate the budget.”

Worse yet, one of the most expensive and complicated burdens of an aging society is not even covered by Medicare. This is long-term care, involving daily medical and personal assistance to people incapable of looking after themselves. The Congressional Budget Office estimates that Americans spent roughly $137 billion on long-term care in 2000, and that by 2020 the figure will reach $207 billion. Longer lives, and the high incidence of dementia among the oldest of the old, are bound to impose an extraordinary new financial strain on middle-income families, whose consequent demand for government help will only worsen our already looming fiscal crisis.

Medicaid, which covers health care for the poor, does pay for some long-term care in most states. To qualify for this, and to avoid burdening their children, a growing number of the elderly have opted to spend down their assets when the need arises. But this ends up burdening their children anyway, if less directly. States already spend more on Medicaid than on primary and secondary education combined; if Medicaid comes to shoulder the bulk of long-term costs in the coming decades, it will bankrupt state coffers and place enormous strains on the federal budget.

Of course, the challenges of an aging society reach well beyond economics. As more and more Americans face an extended decline in their final years, elderly patients and their families will confront painful choices about how much care is worthwhile, who should assume the burdens of care-giving, and when to forgo additional life-sustaining treatment. Compared to this profound human challenge, fiscal dilemmas can seem relatively paltry. But they too necessitate hard and unavoidable choices.

One way or another, the Medicare program will have to be adjusted to a society with radically different demographics from the one it was designed to serve. If “seventy is the new fifty,” as a popular bumper sticker tells us, then the age of Medicare eligibility must begin to move up as well. That will inevitably impose a hardship on those who are already not vigorous in their sixties, as well as on those whose jobs are too physically demanding for even a healthy sixty-five-year-old. So hand in hand with raising the age of eligibility will need to go programs encouraging (or requiring) health-care savings earlier in life. At the same time, Medicare benefits will gradually have to become means-tested, so that help goes where it is most needed and benefits are most generous to those with the lowest incomes and fewest assets.

More fundamentally, the structure of the Medicare program will have to change. Its benefits now increase in an open-ended way that both reflects and drives the upward movement of health costs; if Medicare is to remain sustainable, constraints will gradually have to be put in place, so that benefits grow by a set percentage each year. The program will also need its own distinct and reasonably reliable funding source, which will require an adjustment in the design of the payroll tax.

Any such reforms will be politically explosive, to put it mildly. No politician in his right mind would run on a platform of limiting Medicare eligibility and capping its benefits. And yet, a decade from now, caring for aging parents will have become a burning issue for a great swath of America’s families as parents find themselves squeezed between the needs of their own parents and the needs of their children. Every politician will be expected to offer a solution, and will be subject to dangerous temptations: promising limitless care at the very moment when fiscal responsibility requires setting limits, or promising to “solve” our fiscal problems by abandoning the elderly. The least that responsible policy-makers can do now is to familiarize Americans with the realities of our aging society, so that when the time comes for difficult choices, we will not be blind-sided.

Understanding America’s three distinct health-care challenges, and the deficiencies of conventional responses to them, is the first step toward reform. Any approach we take will assuredly cost the taxpayers money. Already, nearly a third of the federal budget is spent on health care, and that portion is certain to grow. The choice, however, is between paying the necessary price to ameliorate our genuine problems or paying far more to satisfy ideological whims or avoid politically painful decisions.

Neither socialized medicine nor a pure market approach is suited to America’s three health-care challenges, while the bipartisan conspiracy to ignore the looming crisis of Medicare in particular will return to haunt our children. Coming to grips with the true nature of our challenges suggests, instead, a set of pragmatic answers designed to address the real problems of the uninsured, of middle-class families, and of the elderly while protecting America’s private health-insurance system and looking out for the long-term fiscal health of the nation.

Even as we pursue practical options for reform, however, it behooves us to remember that health itself will always remain out of our ultimate control. Medicine works at the boundaries of life, and its limits remind us of our own. While our health-care system can be improved, our unease about health can never truly be quieted. And while reform will require hard decisions, solutions that would balance the books by treating the disabled and debilitated as unworthy of care are no solutions at all. In no small measure, America’s future vitality and character will depend upon our ability to rise to this challenge with the right mix of creativity and sobriety.
