Commentary Magazine


Topic: Bob Dylan

The Truth and Barack Obama

Who knew that Barack Obama’s real ambition is to be Howard Kurtz?

In his commencement address at Hampton University, the president once again decided to act as if he were America’s Media-Critic-in-Chief. In Obama’s words:

You’re coming of age in a 24/7 media environment that bombards us with all kinds of content and exposes us to all kinds of arguments, some of which don’t always rank that high on the truth meter. And with iPods and iPads; and Xboxes and PlayStations — none of which I know how to work — (laughter) — information becomes a distraction, a diversion, a form of entertainment, rather than a tool of empowerment, rather than the means of emancipation. So all of this is not only putting pressure on you; it’s putting new pressure on our country and on our democracy.

Later in the speech, Obama added this:

So, allowing you to compete in the global economy is the first way your education can prepare you. But it can also prepare you as citizens. With so many voices clamoring for attention on blogs, and on cable, on talk radio, it can be difficult, at times, to sift through it all; to know what to believe; to figure out who’s telling the truth and who’s not. Let’s face it, even some of the craziest claims can quickly gain traction. I’ve had some experience in that regard.

There are several things one can take away from the president’s remarks.

The first is that there’s a certain irony in being instructed by Obama to avoid arguments that “don’t always rank that high on the truth meter.” This instruction, after all, comes from a man who, throughout the health-care debate, repeatedly made false and misleading arguments about the effects of ObamaCare on bending the cost curve, on the deficit and debt, on whether people would be forced to leave their employer-based policies, on whether his plan advocated Medicare cuts, on whether it would subsidize abortions, and much else.

Mr. Obama is also the person who, when he was running for the presidency, promised that all health-care negotiations would be broadcast on C-SPAN (They weren’t.), that he would accept public financing for his campaign (He didn’t.), that he would put an end to “phony accounting” (He hasn’t.), that lobbyists would not work in his White House (They do.), that he would slash earmarks by more than half (He has not.), that he opposed giving Miranda rights to terrorists (He favors them.), that he was against an individual health-care mandate (He supported it.), and that he would resist the temptation “to fall back on the same partisanship and pettiness and immaturity that has poisoned our politics for so long” (He succumbed to the temptation.).

Where, I wonder, does Mr. Obama rank these statements on his cherished Truth Meter?

And what are we to make of the fact that the very paragraph from Obama’s speech where he laments the lack of truth in public statements includes — you guessed it — a false statement by Obama?

In his commencement address, Obama insists he doesn’t know how to work an iPod. But here’s an item that appeared on the Huffington Post on June 25, 2008:

WASHINGTON — Bob Dylan. Yo-Yo Ma. Sheryl Crow. Jay-Z. These aren’t musical acts in a summer concert series: They’re artists featured on Barack Obama’s iPod.

“I have pretty eclectic tastes,” the Democratic presidential contender said in an interview to be published in Friday’s issue of Rolling Stone.

Is that distant sound we hear the Truth Meter going off again?

By now Obama has spoken out against the New Media often enough for us to know that he both despises it and is obsessed with it. For all of his talk about his eagerness to listen to others, “especially when we disagree,” as he put it on the night of his election, Obama clearly resents being challenged. He gets especially exasperated and condescending when his challenger has made the better argument. That is, in fact, a trait of Team Obama; we see that attitude on display almost every day in the person of Robert Gibbs, the snidest and least likable press secretary in our lifetime.

The president and his aides are clearly used to being cosseted. They seem to believe the American public should treat them as reverentially as staff members of the New Yorker do.

It may seem odd for a man who presents himself as a public intellectual who cherishes open-mindedness and vigorous debate to be so relentlessly critical of the diversity of voices and viewpoints now in the public square. But remember this: Barack Obama is a man whose attitudes and sensibilities have been shaped by the academy, an institution that is the least (classically) liberal and open-minded in American life today. A stifling conformity and an unwillingness to engage arguments on the merits, combined with a reflexive tendency to attack the motives of those who hold opposing views, are hallmarks of the modern university. They are also, alas, hallmarks of America’s 44th president. But Mr. Obama is learning the hard way that America is not one big Ivy League campus. Here, differing opinions are heard, whether they are welcomed by those in power or not. The public will not bow down before any man or any office. And politicians who treat dissenting voices as if they are a Tower of Babble, to be mocked and ridiculed into silence, eventually receive their comeuppance. So shall Obama.

You Don’t Need to Be a Weatherman but It May Help

The supposedly rock-solid consensus among all thinking human beings about the impending catastrophe of global warming has taken another hit from an unlikely villain: your friendly local TV weather forecaster. According to a front-page feature in Monday’s New York Times, some of the biggest global-warming skeptics are precisely those people whom many Americans look to for insight about the weather. The Times reports that a study released this week by George Mason University and the University of Texas reveals that “only about half of the 571 television weathercasters surveyed believed that global warming was occurring and fewer than a third believed that climate change was caused mostly by human activities.” This is very bad news for environmental extremists, since the public seems to trust the weather guys more than Al Gore.

Apparently there is a real split developing in the world of weather between climatologists and meteorologists, with the latter showing a remarkable disinclination to accept the claims of the former that the planet is melting. But the frame of reference of this piece, like so much of the mainstream media’s coverage of those who raise questions about the alarmist theories of global warming, is not to examine the views and the reasons of the skeptics. Instead, the point of the article is to view it as yet another unfortunate problem to be overcome on the road to eradicating heretical dissent from the global-warming orthodoxy of our time. And since the average American is more likely to hear about the weather from a TV weather forecaster than to be lectured by a climatologist, this is especially dangerous for a field that has been rocked by a series of scandals that have undermined confidence in the honesty and accuracy of global-warming advocates.

For the Times, the problem is primarily one of academic achievement. The climatologists who are promoting fear of global warming—and profiting handsomely from it—are generally affiliated with universities and tend to have advanced degrees, whereas many meteorologists do not. For Heidi Cullen, a climatologist who works to promote global-warming hysteria at something called Climate Central, the problem is that the weathermen are just not smart enough to understand her field. Indeed, she says the claim that it will be hotter 50 years from now is as open-and-shut a case as asserting that August will be warmer than January. But if you think about it, it makes sense that those who work on a day-to-day basis with weather forecasts would have their doubts about computer models predicting the weather we will get 50 years from now. They know all too well how variable the climate can be and that efforts to project forecasts with certainty, especially those promising apocalyptic disasters, should be taken with a shovelful of salt.

The response from climatologists is, of course, not to listen to the skeptics or take them seriously, even if the skeptics in question know a thing or two about the weather. Instead, as the Times pompously relates, what the global-warming crowd wants is more “education” and “outreach” designed to squelch doubts about their theories before the debate about the issue—and the dangerous “cap and trade” schemes to handicap our economy to supposedly avert a global-warming disaster—gets out of hand.

As Bob Dylan famously wrote, “You don’t need to be a weatherman to know which way the wind blows.” But when it comes to bringing some common sense to the “climate change” debate, it apparently helps to be one.

Streisand in Jerusalem

Israeli President Shimon Peres has announced the impressive list of luminaries who will attend the upcoming conference celebrating Israel’s 60th birthday. They include George W. Bush, Tony Blair, Mikhail Gorbachev, Henry Kissinger, Rupert Murdoch, Vaclav Havel, Alan Dershowitz, Google co-founder Sergey Brin, Facebook founder Mark Zuckerberg, and former Indonesian President Abdurrahman Wahid.

While these VIPs will highlight Israel’s many successes in a variety of sectors, the conference will also pay respect to the challenges that Israel has yet to overcome. At least this is how I’m interpreting the invitation of Barbra Streisand, whose rendition of Avinu Malkeinu promises to be a low point in Israel’s cultural history.

So, here’s to a more hopeful Israeli future–which, in my book, means inviting an 82-year-old Bob Dylan to play Hava Nagila at the 75th celebration. (Frankly, even Bill Clinton returning for a repeat performance of “Imagine” might be an improvement.)

Oscar Predictions

Here’s the thing: I’m generally lousy at Oscar predictions, because I always overthink them. So I suppose since everybody and his brother are saying No Country for Old Men is going to win, it probably is — and since everybody is saying there’s a backlash against Juno, I guess there is.

But here’s the rub: I talk to a lot of people who actually just go to the movies rather than write about them. And most of these people didn’t really like No Country. They thought there were wonderful scenes but found the last 15 minutes baffling in a particularly off-putting way. I haven’t met a single person who doesn’t love Juno. People who write about movies twist themselves into knots thinking about these matters so much they decide Juno is meretricious and the end of No Country doesn’t matter.

What I’m saying is, evidently it would be an upset if Juno won. But why? Why would a universally liked, enormously popular, and very affecting film be considered an underdog against a brilliantly made but bloody and unsatisfying existential thriller? There are only 6,000 Academy Awards voters. None of them is a critic.

Last year, when The Departed beat Little Miss Sunshine, it did so in large measure because people really loved The Departed – and it was a slightly bigger hit. Nobody really loves No Country for Old Men. Juno is the movie this year that knocked people for a loop, a happy loop.

So it just seems to me the smart money being on No Country is a result of overthink. Based on what we know about the Oscars, the only obvious choice is Juno, except for the fact that it’s a comedy. Which is a big except. All of this only goes to show that if Atonement — epic, romantic, with English accents — had been better, it would have walked away with the award.

Daniel Day-Lewis will win for Best Actor. Nobody knows who will win Best Actress — although if Ellen Page takes it for Juno, that will be a serious indication that the movie is going to win the big one. And while everybody says Javier Bardem is a lock for supporting actor in No Country, supporting is where the surprises always happen. Nobody knows about supporting actress either, though it strikes me as weird that the Academy might give Cate Blanchett a second Oscar for impersonating a famous person (the first was for playing Katharine Hepburn in The Aviator; this would be for playing Bob Dylan in I’m Not There).

But I’ve never won an Oscar pool.

I’m Not There

In I’m Not There, director Todd Haynes takes the stale, conventional music biopic and runs it through a blender. The film claims to be inspired by the life and music of Bob Dylan, and features six different performers, including Richard Gere, Christian Bale, Heath Ledger, and Cate Blanchett, as Dylanesque figures (none is actually named Bob Dylan). But one needn’t be a Dylanologist, or even more than a casual fan, to wonder at the fantastic concoction he’s whipped up.

Gone are the genre’s usual forms. The familiar arcs of talent, love, addiction, stardom, and redemption that played out in Great Balls of Fire, Ray, and Walk the Line are nowhere to be found, and holiday audiences looking for those familiar patterns will almost certainly be confused and disappointed. Haynes doesn’t just dismiss the clichés; he seems unaware of them, as if he’s inventing everything in the film for the first time.

That’s not to say one can’t spot his influences. Haynes pulls from the fragmented narratives of Buñuel, the feverish and foggy visions of Fellini, and the cinematic playfulness of Godard. Haynes is a fussy formalist, mimicking a dozen or more distinct and easily identifiable styles throughout the film, but his grand scheme embraces a dreamlike expressionism. This isn’t a film about the life of Bob Dylan so much as a rock film fantasia, like Alice in Wonderland as reimagined by Hunter S. Thompson.

Much of the movie’s buzz has centered on the casting, especially Blanchett’s. And indeed, she’s remarkable in her role as Jude, a Dylanish ’60s rock hero given to rash behavior and elliptical pronouncements. She provides one point on the ever-spinning Dylan pinwheel, a dazzling array of characters. Between the manic stylistic riffing and the hall-of-mirrors approach to the central figure, Haynes seems to be gesturing toward the fluidity of identity in the media age, where the idea of the self has become fragmented and illusory, polluted by cross-talk and competing personae.

If this sounds a little murky, that’s because it is, but it’s also often exhilarating, and to ask for too much clarity would probably be a mistake. Any movie seeking to capture the essence of Bob Dylan that’s easy and simple to understand is almost certainly doomed to fail.

I’m Not There—Until They Hand Out Oscars

Midway through I’m Not There, director Todd Haynes’s soon-to-open film fantasia in which Bob Dylan is played by six different actors to signify different phases in the life of the Bard of Hibbing, the Australian actress Cate Blanchett pops up—and as was the case with her appearance as Kate Hepburn in The Aviator, Blanchett makes it immediately clear that this is an Oscar™ role.

Though Blanchett is strenuously coiffed and made up to look like Dylan, with a frizzy wig and Ray-Bans and loose-fitting shirts, never for a moment do you forget that this is Cate Blanchett Acting The Hell Out Of This Role. The clatter of Blanchett’s acting drowns out everything around her.

Within the cubist style of the movie, it isn’t particularly surprising to see a woman play Dylan—he’s also played here by a black kid calling himself “Woody Guthrie.” To have a black kid portray the larval Dylan makes a kind of sense, since, as a troubadour in training, young Robert Zimmerman cooked up a Guthrie-like legend for himself to hide his shame over his white middle-classness while singing about Blind Willie McTell. But there is nothing feminine about Dylan in this movie.

And yet, Blanchett’s wisp of a figure and porcelain cheekbones make it impossible to forget this is a drag performance. In a scene in which her Dylan chases an Edie Sedgwick-like object of obsession around a park, she doesn’t seem remotely masculine. She gives off no sexual hunger, no sense of need. In the end, all Blanchett ever needs in any film is our rapt attention.

Midway through I’m Not There, director Todd Haynes’s soon-to-open film fantasia in which Bob Dylan is played by six different actors to signify different phases in the life of the Bard of Hibbing, the Australian actress Cate Blanchett pops up—and as was the case with her appearance as Kate Hepburn in The Aviator, Blanchett makes it immediately clear that this is an Oscar™ role.

Though Blanchett is strenuously coiffed and made up to look like Dylan, with a frizzy wig and Ray-Bans and loose-fitting shirts, never for a moment do you forget that this is Cate Blanchett Acting The Hell Out Of This Role. The clatter of Blanchett’s acting drowns out everything around her.

Within the cubist style of the movie, it isn’t particularly surprising to see a woman play Dylan—he’s also played here by a black kid calling himself “Woody Guthrie.” To have a black kid portray the larval Dylan makes a kind of sense, since, as a troubadour in training, young Robert Zimmerman cooked up a Guthrie-like legend for himself to hide his shame over his white middle-classness while singing about Blind Willie McTell. But there is nothing feminine about Dylan in this movie.

And yet, Blanchett’s wisp of a figure and porcelain cheekbones make it impossible to forget this is a drag performance. In a scene in which her Dylan chases an Edie Sedgwick-like object of obsession around a park, she doesn’t seem remotely masculine. She gives off no sexual hunger, no sense of need. In the end, all Blanchett ever needs in any film is our rapt attention.

Bookshelf

• What did Leonard Bernstein, Victor Borge, Dave Brubeck, the Budapest String Quartet, Johnny Cash, Noël Coward, Miles Davis, Doris Day, Bob Dylan, Vladimir Horowitz, John Gielgud, Glenn Gould, Michael Jackson, Marshall McLuhan, Albert Schweitzer, Frank Sinatra, Bruce Springsteen, Igor Stravinsky, and the original casts of Waiting for Godot and West Side Story have in common? They all recorded for Columbia. Gary Marmorstein’s The Label: The Story of Columbia Records is a breezily written primary-source history of the company whose artistically serious, technically innovative approach to the making of records—it was Columbia’s engineers who invented the long-playing record album in 1948—left a permanent mark on the history of American music.

Although Columbia was founded in 1889, it wasn’t until a half-century later, when it was bought by CBS, that it began its rise to cultural power. To an insufficiently appreciated extent, the label was soon reinvented in the image of one man, an aspiring classical composer turned record-company executive named Goddard Lieberson, whose wit, elegance, and unshakable self-assurance set the tone for Columbia’s postwar activities. Lieberson is more than deserving of a full-length biography of his own, but The Label offers the most detailed portrait to date of this spectacularly improbable character. A polymath who wrote a string quartet and a comic novel, Lieberson stole one of George Balanchine’s wives and used the profits raked in by such Mitch Miller-produced exercises in sugar-frosted pop banality as Rosemary Clooney’s “Come On-A My House” (as well as the Lieberson-produced original-cast albums of such Broadway musicals as South Pacific and My Fair Lady) to underwrite the recordings of the complete works of Stravinsky, Bernstein, Aaron Copland, and Anton Webern.


In addition to writing about Lieberson, Miller, and John Hammond—the producer-talent scout who spent much of his celebrated career recording jazz and pop for Columbia—Marmorstein depicts a cast of lesser-known backstage characters equally worthy of recognition. George Avakian, who brought Louis Armstrong, Dave Brubeck, and Miles Davis to Columbia and recorded some of their best-remembered albums, is given his due, as is Deborah Ishlon, the master publicist who first spread the word about Glenn Gould, and talked Stravinsky into writing his “conversation books.”

To write consistently well about a company that recorded everyone and everything from Liberace to Don Juan in Hell demands a degree of cultural competence not possessed by the average human being. While Marmorstein has done his homework—to the point of having read Lieberson’s forgotten novel 3 for Bedroom C and Ishlon’s equally obscure roman à clef Girl Singer: A Two Part Invention—he does not exhibit a complete knowledge of classical music. (Somebody at Thunder’s Mouth Press should have told him that the Brahms First Symphony isn’t a piano concerto.) But the small errors that disfigure The Label do not diminish its effectiveness as journalism, and Marmorstein’s breathless summary of Columbia’s significance is in no way overstated:

In the overlapping epochs of the 78-rpm platter, the 33-rpm vinyl disk, the cassette tape, and the compact disk, Columbia Records seemed to be everywhere. That ubiquitousness was true for no other record label. . . . Decade by decade, Columbia launched the careers of our most seminal recording artists and deposited their sound prints onto the permanent record.

All that came to an untimely end when Columbia was bought by Sony in 1987, a transaction that led in short order to the dumbing-down of the classical and jazz divisions that had been Columbia’s pride. Now that the entire recording industry has been devastated by the rise of Web-based new media, younger music lovers are largely unaware of the role that Columbia Records played in the shaping of postwar American culture. Kudos to Gary Marmorstein for telling them what they missed.
