Barrett’s decision marked the 59th judicial setback for a college or university since 2013 in a due-process lawsuit brought by a student accused of sexual assault. (In four additional cases, the school settled a lawsuit before any judicial decision occurred.) This body of law serves as a towering rebuke to the Obama administration’s reinterpretation of Title IX, the 1972 law barring sex discrimination in schools that receive federal funding.
Beginning in 2011, the Education Department’s Office for Civil Rights (OCR) issued a series of “guidance” documents pressuring colleges and universities to change how they adjudicated sexual-assault cases in ways that increased the likelihood of guilty findings. Amid pressure from student and faculty activists, virtually all elite colleges and universities have gone far beyond federal mandates and have even further weakened the rights of students accused of sexual assault.
Like all extreme victims’-rights approaches, the new policies fell hardest on the wrongly accused. A 2016 study by UCLA public-policy professor John Villasenor used just one of the changes—schools employing the lowest standard of proof, a preponderance of the evidence—to predict that campus Title IX tribunals would return guilty findings against innocent students as often as 33 percent of the time. Villasenor’s study could not measure the impact of other Obama-era policy demands—such as allowing accusers to appeal not-guilty findings, discouraging cross-examination of accusers, and urging schools to adjudicate claims even when a criminal inquiry found no wrongdoing.
In a September 7 address at George Mason University, Education Secretary Betsy DeVos stated that “no student should be forced to sue their way to due process.” But once enmeshed in the campus Title IX process, a wrongfully accused student’s best chance for justice may well be a lawsuit filed after his college has wrongly found him guilty. (According to data from United Educators, a higher-education insurance firm, 99 percent of students accused of campus sexual assault are male.) The Foundation for Individual Rights in Education has identified more than 180 such lawsuits filed since the 2011 policy changes. That figure, obviously, excludes students with equally strong claims whose families cannot afford to go to court. These students face life-altering consequences. As Judge T.S. Ellis III noted in a 2016 decision, it is “so clear as to be almost a truism” that a student will lose future educational and employment opportunities if his college wrongly brands him a rapist.
“It is not the role of the federal courts to set aside decisions of school administrators which the court may view as lacking in wisdom or compassion.” So wrote the Supreme Court in a 1975 case, Wood v. Strickland. While the Supreme Court has made clear that colleges must provide accused students with some rights, especially when dealing with nonacademic disciplinary questions, courts generally have not been eager to intervene in such matters.
This is what makes the developments of the last four years all the more remarkable. The process began in May 2013, with a ruling against St. Joseph’s University, and has lately accelerated (15 rulings in 2016 and 21 thus far in 2017). Of the 40 setbacks for colleges in federal court, 14 came from judges nominated by Barack Obama, 11 from Clinton nominees, and nine from George W. Bush appointees. Brown University has been on the losing side of three decisions; Duke, Cornell, and Penn State, two each.
Court decisions since the expansion of Title IX activism have not all gone in one direction. In 36 of the due-process lawsuits, courts have permitted the university to maintain its guilty finding. (In four other cases, the university settled despite prevailing at a preliminary stage.) But even in these cases, some courts have expressed discomfort with campus procedures. One federal judge was “greatly troubled” that Georgia Tech veered “very far from an ideal representation of due process” when its investigator “did not pursue any line of investigation that may have cast doubt on [the accuser’s] account of the incident.” Another went out of his way to say that he considered it plausible that a former Case Western Reserve University student was actually “innocent of the charges levied against him.” And one state appellate judge opened oral argument by bluntly informing the University of California’s lawyer, “When I . . . finished reading all the briefs in this case, my comment was, ‘Where’s the kangaroo?’”
Judges have, obviously, raised more questions in cases where the college has found itself on the losing side. Those lawsuits have featured three common areas of concern: bias in the investigation, resulting in a college decision based on incomplete evidence; procedures that prevented the accused student from challenging his accuser’s credibility, chiefly through cross-examination; and schools utilizing a process that seemed designed to produce a predetermined result, in response to real or perceived pressure from the federal government.

Colleges and universities have proven remarkably willing to act on incomplete information when adjudicating sexual-assault cases. In December 2013, for example, Amherst College expelled a student for sexual assault despite text messages (which the college investigator failed to discover) indicating that the accuser had consented to sexual contact. The accuser’s own testimony also indicated that she might have committed sexual assault, by initiating sexual contact with a student who, Amherst conceded, was experiencing an alcoholic blackout. When the accused student sued Amherst, the college said its failure to uncover the text messages had been irrelevant because its investigator had sought only texts that portrayed the incident as nonconsensual. In February, Judge Mark Mastroianni allowed the accused student’s lawsuit to proceed, commenting that the texts could raise “additional questions about the credibility of the version of events [the accuser] gave during the disciplinary proceeding.” The two sides settled in late July.
Amherst was hardly alone in its eagerness to avoid evidence that might undermine the accuser’s version of events; the same happened at Penn State, St. Joseph’s, Duke, Ohio State, Occidental, Lynn, Marlboro, Michigan, and Notre Dame.
Even in cases with a more complete evidentiary base, accused students have often been blocked from presenting a full-fledged defense. As part of its reinterpretation of Title IX, the Obama administration sought to shield campus accusers from cross-examination. OCR’s 2011 guidance “strongly” discouraged direct cross-examination of accusers by the accused student—a critical restriction, since most university procedures require the accused student, rather than his lawyer, to defend himself in the hearing. OCR’s 2014 guidance suggested that this type of cross-examination in and of itself could create a hostile environment. The Obama administration even spoke favorably about the growing trend among schools to abolish hearings altogether and allow a single official to serve as investigator, prosecutor, judge, and jury in sexual-assault cases.
The Supreme Court has never held that campus disciplinary hearings must permit cross-examination. Nonetheless, the recent attack on the practice has left schools struggling to explain why they would not want to utilize what the Court has described as the “greatest legal engine ever invented for the discovery of truth.” In June 2016, the University of Cincinnati found a student guilty of sexual assault after a hearing at which neither his accuser nor the university’s Title IX investigator appeared. In an unintentionally comical line, the hearing chair noted the absent witnesses before asking the accused student if he had “any questions of the Title IX report.” The student, befuddled, replied, “Well, since she’s not here, I can’t really ask anything of the report.” (The panel chair did not indicate how the “report” could have answered any questions.) Cincinnati found the student guilty anyway.1
Limitations on full cross-examination also played a role in judicial setbacks for Middlebury, George Mason, James Madison, Ohio State, Occidental, Penn State, Brandeis, Amherst, Notre Dame, and Skidmore.
Finally, since 2011, more than 300 students have filed Title IX complaints with the Office for Civil Rights alleging that their college mishandled their sexual-assault allegation. OCR’s leadership seemed to welcome the complaints, which allowed Obama officials to inspect not only the individual case but all sexual-assault claims at the school in question over a three-year period. Northwestern University professor Laura Kipnis has estimated that during the Obama years, colleges spent between $60 million and $100 million on these investigations. And because a finding of a Title IX violation can cost a school its federal funding, universities have, as Harvard Law professors Jeannie Suk Gersen, Janet Halley, Elizabeth Bartholet, and Nancy Gertner observed in a white paper submitted to OCR, “strong incentives to ensure the school stays in OCR’s good graces.”
One of the earliest lawsuits after the Obama administration’s policy shift, involving former Xavier University basketball player Dez Wells, demonstrated how an OCR investigation can affect the fairness of a university inquiry. The accuser’s complaint had been referred both to Xavier’s Title IX office and the Cincinnati police. The police concluded that the allegation was meritless; Hamilton County Prosecuting Attorney Joseph Deters later said he considered charging the accuser with filing a false police report.
Deters asked Xavier to delay its proceedings until his office completed its investigation. School officials refused. Instead, three weeks after the initial allegation, the university expelled Wells. He sued and speculated that Xavier’s haste came not from a quest for justice but instead from a desire to avoid difficulties in finalizing an agreement with OCR to resolve an unrelated complaint filed by two female Xavier students. (In recent years, OCR has entered into dozens of similar resolution agreements, which bind universities to policy changes in exchange for removing the threat of losing federal funds.) In a July 2014 ruling, Judge Arthur Spiegel observed that Xavier’s disciplinary tribunal, however “well-equipped to adjudicate questions of cheating, may have been in over its head with relation to an alleged false accusation of sexual assault.” Soon thereafter, the two sides settled; Wells transferred to the University of Maryland.
Ohio State, Occidental, Cornell, Middlebury, Appalachian State, USC, and Columbia have all found themselves on the losing side of court decisions arising from cases that originated while OCR was investigating or threatening to investigate the school. (In the Ohio State case, one university staffer testified that she didn’t know whether she had an obligation to correct a false statement by an accuser to a disciplinary panel.) Pressure from OCR can be indirect, as well. The Obama administration interpreted federal law as requiring all universities to have at least one Title IX coordinator; larger universities now employ dozens of Title IX personnel who, as the Harvard Law professors explained, “have reason to fear for their jobs if they hold a student not responsible or if they assign a rehabilitative or restorative rather than a harshly punitive sanction.”

Amid the wave of judicial setbacks for universities, two decisions in particular stand out. Easily the most powerful opinion in a campus due-process case came in March 2016 from Judge F. Dennis Saylor. While the stereotypical campus sexual-assault allegation results from an alcohol-filled, one-night encounter between a male and a female student, a case at Brandeis University involved a long-term monogamous relationship between two male students. A bad breakup led to the accusing student’s filing the following complaint, against which his former boyfriend was expected to provide a defense: “Starting in the month of September, 2011, the Alleged violator of Policy had numerous inappropriate, nonconsensual sexual interactions with me. These interactions continued to occur until around May 2013.”
To adjudicate, Brandeis hired a former OCR staffer, who interviewed the two students and a few of their friends. Since the university did not hold a hearing, the investigator decided guilt or innocence on her own. She treated each incident as if the two men were strangers to each other, which allowed her to determine that sexual “violence” had occurred in the relationship. The accused student, she found, sometimes looked at his boyfriend in the nude without permission and sometimes awakened his boyfriend with kisses when the boyfriend wanted to stay asleep. The university’s procedures prevented the student from seeing the investigator’s report, with its absurdly broad definition of sexual misconduct, in preparing his appeal. “In the context of American legal culture,” Boston Globe columnist Dante Ramos later argued, denying this type of information “is crazy.” “Standard rules of evidence and other protections for the accused keep things like false accusations or mistakes by authorities from hurting innocent people.” When the university appeal was denied, the student sued.
At an October 2015 hearing to consider the university’s motion to dismiss, Saylor seemed flabbergasted at the unfairness of the school’s approach. “I don’t understand,” he observed, “how a university, much less one named after Louis Brandeis, could possibly think that that was a fair procedure to not allow the accused to see the accusation.” Brandeis’s lawyer cited pressure to conform to OCR guidance, but the judge deemed the university’s procedures “closer to Salem 1692 than Boston, 2015.”
The following March, Saylor issued an 89-page opinion that has been cited in virtually every lawsuit subsequently filed by an accused student. “Whether someone is a ‘victim’ is a conclusion to be reached at the end of a fair process, not an assumption to be made at the beginning,” Saylor wrote. “If a college student is to be marked for life as a sexual predator, it is reasonable to require that he be provided a fair opportunity to defend himself and an impartial arbiter to make that decision.” Saylor concluded that Brandeis forced the accused student “to defend himself in what was essentially an inquisitorial proceeding that plausibly failed to provide him with a fair and reasonable opportunity to be informed of the charges and to present an adequate defense.”
The student, vindicated by the ruling’s sweeping nature, then withdrew his lawsuit. He currently is pursuing a Title IX complaint against Brandeis with OCR.
Four months later, a three-judge panel of the Second Circuit Court of Appeals produced an opinion that lacked Saylor’s rhetorical flourish or his understanding of the basic unfairness of the campus Title IX process. But by creating a more relaxed standard for accused students to make federal Title IX claims, the Second Circuit’s decision in Doe v. Columbia carried considerable weight.
Two Columbia students who had been drinking had a brief sexual encounter at a party. More than four months later, the accuser claimed she was too intoxicated to have consented. Her allegation came in an atmosphere of campus outrage about the university’s allegedly insufficient toughness on sexual assault. In this setting, the accused student found Columbia’s Title IX investigator uninterested in hearing his side of the story. He cited witnesses who would corroborate his belief that the accuser wasn’t intoxicated; the investigator declined to speak with them. The student was found guilty, although for reasons differing from the initial claim; the Columbia panel ruled that he had “directed unreasonable pressure for sexual activity toward the [accuser] over a period of weeks,” leaving her unable to consent on the night in question. He received a three-semester suspension for this nebulous offense—which even his accuser deemed too harsh. He sued, and the case was assigned to Judge Jesse Furman.
Furman’s opinion provided a ringing victory for Columbia and the Obama-backed policies it used. As Title IX litigator Patricia Hamill later observed, Furman’s “almost impossible standard” required accused students to have inside information about the institution’s handling of other sexual-assault claims—information they could plausibly obtain only through the legal process known as discovery, which happens at a later stage of litigation—in order to survive a university’s initial motion to dismiss. Furman suggested that, to prevail, an accused student would need to show that his school treated a female student accused of sexual assault more favorably, or at least provide details about how cases against other accused students showed a pattern of bias. But federal privacy law keeps campus disciplinary hearings private, leaving most accused students with little opportunity to uncover the information before their case is dismissed.
At the same time, the opinion excused virtually any degree of unfairness by the institution. Furman reasoned that taking “allegations of rape on campus seriously and . . . treat[ing] complainants with a high degree of sensitivity” could constitute “lawful” reasons for university unfairness toward accused students. Samantha Harris of the Foundation for Individual Rights in Education detected the decision’s “immediate and nationwide impact” in several rulings against accused students. It also played the same role in university briefs that Saylor’s Brandeis opinion did in filings by accused students.
The Columbia student’s lawyer, Andrew Miltenberg, appealed Furman’s ruling to the Second Circuit. The stakes were high, since a ruling affirming the lower court’s reasoning would have all but foreclosed Title IX lawsuits by accused students in New York, Connecticut, and Vermont. But a panel of three judges, all nominated by Democratic presidents, overturned Furman’s decision. In the opinion’s crucial passage, Judge Pierre Leval held that a university “is not excused from liability for discrimination because the discriminatory motivation does not result from a discriminatory heart, but rather from a desire to avoid practical disadvantages that might result from unbiased action. A covered university that adopts, even temporarily, a policy of bias favoring one sex over the other in a disciplinary dispute, doing so in order to avoid liability or bad publicity, has practiced sex discrimination, notwithstanding that the motive for the discrimination did not come from ingrained or permanent bias against that particular sex.” Before the Columbia decision, courts almost always had rebuffed Title IX pleadings from accused students. More recently, judges have allowed Title IX claims to proceed against Amherst, Cornell, California–Santa Barbara, Drake, and Rollins.
After the Second Circuit’s decision, Columbia settled with the accused student, sparing its Title IX decision-makers from having to testify at a trial. James Madison was one of the few universities to take a different course, with disastrous results. A lawsuit from an accused student survived a motion to dismiss, but the university refused to settle, allowing the student’s lawyer to depose the three school employees who had decided his client’s fate. One unintentionally revealed that he had misapplied the university’s own definition of consent. Another cited the accuser’s slurred words on a voicemail as proof of her extreme intoxication on the night of the alleged assault; it was left to the accused student’s lawyer, at a deposition months after the decision had been made, to note that the voicemail in question actually was received on a different night. In December 2016, Judge Elizabeth Dillon, an Obama nominee, granted summary judgment to the accused student, concluding that “significant anomalies in the appeal process” violated his due-process rights under the Constitution.

Universities were on the losing side of 36 due-process rulings while Obama appointee Catherine Lhamon presided over the Office for Civil Rights between 2013 and 2016; no record exists of her publicly acknowledging any of them. In June 2017, however, Lhamon suddenly rejoiced that “yet another federal court” had found that students disciplined for sexual misconduct “were not denied due process.” That Fifth Circuit decision, involving two former students at the University of Houston, was an odd case for her to celebrate. The majority cabined its findings to the “unique facts” of the case—that the accused students likely would have been found guilty even under the fairest possible process. And the dissent, from Judge Edith Jones, denounced the procedures championed by Lhamon and other Obama officials as “heavily weighted in favor of finding guilt,” predicting “worse to come if appellate courts do not step in to protect students’ procedural due process right where allegations of quasi-criminal sexual misconduct arise.”
At this stage, Lhamon, who now chairs the U.S. Commission on Civil Rights, cannot be taken seriously when it comes to questions of campus due process. But other defenders of the current Title IX regime have offered more substantive commentary about the university setbacks.
Legal scholar Michelle Anderson was one of the few to even discuss the due-process decisions. “Colleges and universities do not always adjudicate allegations of sexual assault well,” she noted in a 2016 law review article defending the Obama-era policies. Anderson even conceded that some colleges had denied “accused students fairness in disciplinary adjudication.” But these students sued, “and campuses are responding—as they must—when accused students prevail. So campuses face powerful legal incentives on both sides to address campus sexual assault, and to do so fairly and impartially.”
This may be true, but Anderson does not explain why wrongly accused students should bear the financial and emotional burden of inducing their colleges to implement fair procedures. More important, scant evidence exists that colleges have responded to the court victories of wrongly accused students by creating fairer procedures. Some have even made it more difficult for wrongly accused students to sue. After losing a lawsuit in December 2014, Brown eliminated the right of students accused of sexual assault to have “every opportunity” to present evidence. That same year, an accused student showed how Swarthmore had deviated from its own procedures in his case. The college quickly settled the lawsuit—and then added a clause to its procedures immunizing it from similar claims in the future. Swarthmore currently informs accused students that “rules of evidence ordinarily found in legal proceedings shall not be applied, nor shall any deviations from any of these prescribed procedures alone invalidate a decision.”
Many lawsuits are still working their way through the judicial system; three cases are pending at federal appellate courts. In the two that address substantive matters, oral arguments suggested judicial skepticism of the universities’ positions. On July 26, a three-judge panel of the First Circuit considered a case at Boston College, where the accused student plausibly argued that someone else had committed the sexual assault (which occurred on a poorly lit dance floor). Judges Bruce Selya and William Kayatta seemed troubled that a Boston College dean had improperly intruded on the hearing board’s deliberations. At the Sixth Circuit a few days later, Judges Richard Griffin and Amul Thapar both expressed concerns about the University of Cincinnati’s downplaying of the importance of cross-examination in campus-sex adjudications. Judge Eric Clay was quieter, but he wondered about the tension between the university’s Title IX and truth-seeking obligations.
In a perfect world, academic leaders themselves would have created fairer processes without judicial intervention. But in the current campus environment, such an approach is impossible. So, at least for the short term, the courts remain the best, albeit imperfect, option for students wrongly accused of sexual assault. Meanwhile, every year, young men entrust themselves and their family’s money to institutions of higher learning that are indifferent to their rights and unconcerned with the injustices to which these students might be subjected.
1 After a district court placed that finding on hold, the university appealed to the Sixth Circuit.
The brilliant and problematic work of a Jewish writer who didn’t want to be one
The first fan letter I ever wrote was to Philip Roth in 1959 after reading Goodbye, Columbus. I was not in the habit of complimenting writers, not Saul Bellow for The Adventures of Augie March, or Herman Wouk for Marjorie Morningstar, or even Leon Uris for Exodus. I was then between college and graduate school, aspiring to be Virginia Woolf’s ideal reader, and I was teaching myself to distinguish between good writing (Bellow’s) and what my favorite professor called “push-button” prose (Uris’s). But reading Roth’s stories, I was beyond caring whether this was a “critical” or merely “commercial” success. I felt these stories were written, if not for me alone, then close enough—for someone about my age with the same disdain for the bourgeois limitations of Jewish life and the organized Jewish community. At the time, I shared some of those attitudes and thus identified almost completely with the Roth stand-in in most of the stories. And though I have since then learned to love much of what I once distrusted, I remain thankful for the freedom to identify with the male narrator, since no one had yet told me I was expected as a female to identify only with other females in literature—with Miriam rather than Moses! Happily, I came of age before Women’s Lib tried to pen me in.
Roth’s title story transcribed in credible dialogue the summer romance of clever Neil Klugman (klug is Yiddish for clever) with Brenda Patimkin, whose family had already moved from Newark, where Neil still lives with his aunt, to more prosperous Short Hills. This was the familiar adventure of a boy attracted erotically and economically to the girl who would satisfy both sets of his ambitions but who is upended by her bourgeois scruples. The erotic part of the plot centers on his demand that she facilitate their sex by getting a diaphragm from the Margaret Sanger Clinic, and the economic part, on preparations for the wedding of Brenda’s older brother Ron in the kind of merger-marriage the family expects. Rather than pursue his real ambition of becoming a gym instructor, Ron is headed for the family business—Patimkin Kitchen and Bathroom Sinks, located “in the heart of the Negro section of Newark.” I would have paid greater attention than I did to the sociology of the novella had I realized that this would remain Roth territory over his lifetime.
The mature Philip Roth was not proud of this debut collection, and I am likewise a little embarrassed to admit the almost unreserved admiration I felt for all six of its stories, and for the title story in particular. I laughed at the preliminary exchanges between the sparring couple (“What do you look like?” “I’m…dark.” “Are you a Negro?”), and at the portrait of the Hadassah-member mother who asks about Martin Buber, “Is he orthodox or conservative?” I thought brilliantly funny the scene in which Ron plays his record of “Goodbye, Columbus,” which turns out to be a transcript of the final game of his football career at Ohio State. Columbus—get it? I especially fancied Neil’s discovery, in the basement of Brenda’s wealthy home, of the family’s old Newark refrigerator, which had once stocked butter, eggs, and herring in cream sauce but was now heaped with
fruit, shelves swelled with it, every color, every texture, and hidden within, every kind of pit. There were greengage plums, black plums, red plums, apricots, nectarines, peaches, long horns of grapes, black, yellow, red, and cherries, cherries flowing out of boxes and staining everything scarlet. And there were melons—cantaloupes and honeydews—and on the top shelf, half of a huge watermelon, a thin sheet of wax paper clinging to its bare red face like a wet lip. Oh Patimkin! Fruit grew in their refrigerator and sporting goods dropped from their trees!
Because the three Patimkin children are competitively proficient in every trendy sport, the yard is similarly overstocked with their equipment. This was the most energetically rendered put-down of the Jewish upper middle class I had ever seen. And it was such fun! I sent my fan letter to Roth, c/o Houghton Mifflin Company, complimenting him for blasting “the Battleship Patimkin.” Get it? I felt I was almost in his class of wit.
But already back then I had one reservation about the story. A subplot involves a little Negro boy who comes to the library, where Neil has a summer job, looking for books on “heart”—by which he means “art.” Neil alone among the staff encourages and shields the little boy whom others mistake for a potential thief.
“Who took these pictures?” he asked me.
“Gauguin. He didn’t take them, he painted them. Paul Gauguin. He was a Frenchman.”
“Is he a white man or a colored man?”
“Man,” the boy smiled, chuckled almost, “I knew that. He don’t take pictures like no colored men would. He’s a good picture taker.…Look, look, look here at this one. Ain’t that the fuckin life?”
What I distrusted about this sequence, in addition to the self-serving portrait of the racially sensitive narrator and the condescending portrait of his protégé, was the contrast the story set up between the alleged boorishness of prosperous Jews and the “spontaneous” appreciation of art by the indigent black child. This was only a little less heavy-handed than the stuff of Jewish Communist or Socialist propaganda. It was one thing to play off the more genuine or honest Jew against phonies, as Roth does in several of the other stories, but it was itself part of the phoniness to make an invidious comparison between crass Jews and the allegedly more genuine and honest (because less privileged and more discriminated-against) non-Jews.
The corrupted Jew/untainted non-Jew dichotomy seemed to me not only dumb, but trite. That same year, 1959, in Montreal where I lived, there appeared Mordecai Richler’s The Apprenticeship of Duddy Kravitz. It was a novel uncommonly similar in its cultural assumptions, though whereas Neil Klugman is the sympathetic alternative to the smug Jews of New Jersey, Duddy Kravitz is himself the Jew who aspires to acquire—in his case, land. Roth’s satire of the Patimkin wedding has its comic parallel in Richler’s parody of a crass bar mitzvah, and both works assume that Jews sacrifice their souls in their climb from immigrant poverty into what passes for security. The only characters capable of true affection and loyalty in Richler’s plot are a French-Canadian young woman and a Gentile epileptic, both of whom Duddy betrays. Duddy Kravitz was a knock-off of Budd Schulberg’s Sammy Glick in What Makes Sammy Run? (1941), who scrambles over people in his climb from New York’s Lower East Side to Hollywood. That was preceded, in turn, by Abraham Cahan’s The Rise of David Levinsky (1917)…and along the way there was plenty of fiction of varying artistic quality featuring similarly avaricious members of the tribe. I appreciated the wonderfully rendered cliché of the Jewish nouveaux-riches Patimkins but less so the redemptive Gentile as “heart” instructor of the uglier Jew.
Philip Roth was in no permanent danger of yielding to that cliché. Rather than follow up Goodbye, Columbus with books in the same vein, he moved away from Jews and tried his hand at more conventional American subjects and literary approaches. Maybe because I read his next novels, Letting Go and When She Was Good, mostly out of duty, I felt that he had written them dutifully to prove himself master of American fiction and not just its Jewish precincts. But for that I didn’t need Roth and could have gone straight to Henry James. Then something happened. On an overnight trip to New York in 1967, I stayed with Montreal expatriates who suggested we invite another friend to join us for dinner. Our friend agreed to come on condition that we let him bring a new story he had just discovered. He insisted on reading us—aloud and in company!—“The Jewish Blues” from the first issue of a paperback magazine called New American Review. We laughed harder than we ever had (maybe ever would again) at this shpritz of stand-up comedy delivered from a horizontal position. “The Jewish Blues” became the third chapter of Portnoy’s Complaint.
Written as a series of monologues that form six psychoanalytic sessions, Portnoy’s Complaint was built entirely on clichés—the Jewish son with an Oedipal complex, the vociferous mother and constipated father, Freudian analysis with a Viennese refugee, the Jew’s sexual attraction to the Gentile shiksa and corresponding fear of the assertive Jewish woman. But because joking depends on a shared cultural vocabulary, Roth’s recourse to the clichés of American Jewish culture was in this case justified and, indeed, indispensable to the comedy’s success.
Freud had explained it all in his study of Jokes and Their Relation to the Unconscious, leaving comic writers to combine as they saw fit the features of joking that he identified, such as condensation, double entendre, displacement, faulty reasoning, etc., for purposes ranging from pleasure to aggression. Freud poignantly explains the need for this irreverence: “What these jokes whisper may be said aloud: that the wishes and desires of men have a right to make themselves acceptable alongside of exacting and ruthless morality.” Civilized adults may be forgiven for using comedy to bring release from taboos they must continue to observe. When Alex Portnoy says, “I am the son in the Jewish joke—Only it ain’t no joke!” the comedy exposes the distress that laughter only momentarily relieves.
Once the laughter subsided, a number of questions arose: Did the joking of insiders suit a general public? And did Portnoy’s Complaint really break taboos, or did it exploit a cultural shift that had already set in? On the sex front, Roth was barely keeping up with the times. Hugh Hefner founded Playboy magazine in 1953 and opened the first Playboy Club in 1960. Portnoy coincided with 1967’s Summer of Love when a group called the Hombres recorded “Let It All Hang Out.” Students were burning American flags, storming political conventions, and trashing universities. Roth’s obscenity had nothing on Lenny Bruce. It was only because Alex Portnoy was represented as “Assistant Commissioner for The City of New York Commission on Human Opportunity” that his sexual and lexical breakout felt almost as sacrilegious as Hester Prynne’s adultery. The impression of repression made for the comic release.
As I saw it, the real risks Roth took were not orgiastic or onanistic—but lay elsewhere, mainly in his satire of Christians. Alex’s father is speaking:
“They worship a Jew, do you know that, Alex? Their whole big-deal religion is based on worshiping someone who was an established Jew at that time. Now how do you like that for stupidity? How do you like that for pulling the wool over the eyes of the public? Jesus Christ, who they go around telling everybody was a God, was actually a Jew! And this fact, that absolutely kills me when I have to think about it, nobody else pays any attention to. That he was a Jew, like you and me, and that they took a Jew and turned him into some kind of God after he is already dead, and then—and this is what can make you absolutely crazy—then the dirty bastards turn around afterwards, and who is the first one on their list to persecute? Who haven’t they left their hands off of to murder and to hate for two thousand years? The Jews!”
This eruption is accounted for by the parents’ years of kowtowing to bigoted employers, but Alex is even more offensive than his father when he notices a picture of Jesus floating up to Heaven “in a pink nightgown” in the home of a girl he is trying to seduce:
The Jews I despise for their narrow-mindedness, their self-righteousness, the incredibly bizarre sense that these cave men who are my parents and relatives have somehow gotten of their superiority—but when it comes to tawdriness and cheapness, to beliefs that would shame even a gorilla, you simply cannot top the goyim. What kind of base and brainless schmucks are these people to worship somebody who, number one, never existed, and number two, if he did, looking as he does in that picture, was without a doubt The Pansy of Palestine….
Rereading this book (as I have done more than once), I wondered whether the narrator’s assaults on Jews and on himself were not the excuse for attacks on Gentiles and on Christians specifically. In the past, Jews who lived as a minority among Gentiles—and at their mercy—reasonably refrained from aggressing against their hosts. In hostile or potentially hostile societies, Jewish boys were discouraged from fighting back lest it bring on collective retribution. For the same reasons, Jews held back as well from verbal insult, and this prohibition burrowed deep into the culture. Roth violated this taboo, feeling sufficiently at home in America not to have such concerns about offending the goyim and probably realizing that, as with sex, what was once forbidden was now becoming all the rage.
As he anticipated, those truly offended by Portnoy were not Christians but Jews. Criticism came from some of the distinguished Jewish elders of the day, like Marie Syrkin in New York and Gershom Scholem in Jerusalem—intellectuals who had borne the full weight of anti-Semitism a mere two decades earlier and who now feared the consequence of Roth’s Jewish impropriety. Syrkin saw the leering Nazi-style anti-Jewish stereotype behind Roth’s Jewish joking. A little like the chief rabbi of Moscow who is reported to have warned in 1919, “Trotsky makes the revolutions, and the Bronsteins pay the bills,” Scholem thought that by trotting out every negative stereotype of the Jew, this self-styled “American writer” was actually stoking a new anti-Semitism. Trotsky had quit the Jews by changing his name from Bronstein, but just as the Moscow rabbi warned that Jews would be charged for his deeds, so Scholem wondered “what price the world Jewish community is going to pay for this book.” A second tier of criticism from American rabbis and Jewish organizational leaders protested Roth’s negative portrayal of the Jewish way of life, and from reviewers there were objections to the book’s alleged lack of artistic merit.
Against all these charges, I sided with Roth. In the late 1960s, Jews had reason to believe that there was little danger of triggering anti-Semitism in America: Jews were then at the height of their popularity. Liberal sympathy for Holocaust victims was unadulterated by fear of having to absorb Jewish refugees, now that Israel was there to take them in. Paul Newman had stridden the screen like a colossus as Ari Ben Canaan in Otto Preminger’s film Exodus, based on the Leon Uris bestseller, projecting Israel’s new image of masculine competence. Moreover, Judaism was by then enshrined as one of America’s three religions—Protestant, Catholic, Jew—sharing their fate, for better and worse, including as targets of satire. Roth’s debut coincided with the Jewish moment in American culture, and he proved it by eventually surpassing all other Jewish American novelists in popularity. By raising the specter of anti-Semitism, Roth’s anachronistic critics made Roth seem all the more up-to-date.
I was in no greater sympathy with those who expected Roth to be “fair” to the Jewish community. We were by then a small army of college-graduated Jews who had been trained to differentiate advertising from literature, and to reject the notion of any writerly loyalty other than to writing itself. When accused of misrepresenting the Jews, Roth responded in this magazine with an imagined list of similar complaints that might have been leveled at other authors, e.g., to Fyodor Dostoevsky for the portrait of Raskolnikov: “‘All the students in our school, and most of the teachers, feel that you have been unfair to us….’ ‘Dear Mark Twain—None of the slaves on our plantation has ever run away. But what will our owner think when he reads of Nigger Jim?’ ‘Dear Vladimir Nabokov—The girls in our class…’” When it came to defending artistic independence, Roth was clearly able to hold his own.
The more vexing question of Portnoy’s literary merit was raised most cogently by Irving Howe—in this magazine in 1972. As the literary critic who defined the New York intellectuals (also in this magazine), Howe seemed to be speaking for his intellectual cohort when he quotably wrote, “The cruelest thing anyone can do with Portnoy’s Complaint is to read it twice.” He then cruelly tried to substantiate his claim. Nonetheless, Howe managed to inflate the book’s impact while depreciating its value by calling the novel a “cultural document of some importance,” claiming that younger Jews took it as a signal for abandoning their Jewishness while some Gentile readers took it as a sign that Jews were no better than anyone else:
[They] could almost be heard breathing a sigh of relief, for it signaled an end to philo-Semitism in American culture, one no longer had to listen to all that talk about Jewish morality, Jewish endurance, Jewish wisdom, Jewish families. Here was Philip Roth himself, a writer who even seemed to know Yiddish, confirming what had always been suspected about those immigrant Jews but had recently not been tactful to say.
Was it not praising with faint damn to credit Roth with having changed the direction of American culture? And why should Howe be more distressed than the rabbis? This panning could only help further stoke the image of Roth as a bold, renegade Jewish writer.
Roth later got his own back in a recognizable caricature of his critic (as Milton Appel in The Anatomy Lesson), but this was more than a personal feud. The book and the controversy it stirred marked a shift in American Jewish culture—a generational one. Howe, like Roth, had once rebelled against Jewish observance and like him, too, had married “outside the faith,” but by the time he wrote this review essay, he had created anthologies of Yiddish literature and had retrieved his heritage in World of Our Fathers, a cultural history of the Jewish immigrant experience.
Howe’s generation was saturated with old-world Jewishness. Delmore Schwartz could evoke the Jewish intonations of a mother’s speech. Isaac Rosenfeld wrote some of his stories in Yiddish. Joseph Dorman’s film Arguing the World takes Irving Kristol, Daniel Bell, and Nathan Glazer back to their immigrant neighborhoods and probes their attachments to their Jewish upbringing. While Saul Bellow and Bernard Malamud are often linked with Roth in a triumvirate of Jewish writers, there is actually a world of difference between the older writers, who drew from a reservoir of Jewishness, and Philip Roth, whose mother made jello, not challah, and whose dad played baseball rather than read the Forverts. Howe addressed this difference when he charged Roth with running on empty:
Portnoy’s Complaint is not, as enraged critics have charged, an anti-Semitic book, though it contains plenty of contempt for Jewish life. Nor does Roth write out of traditional Jewish self-hatred, for the true agent of such self-hatred is always indissolubly linked with Jewish past and present, quite as closely as those who find in Jewishness moral or transcendent sanctions. What the book speaks for is a yearning to undo the fate of birth; there is no wish to do the Jews any harm (a little nastiness is something else), nor any desire to engage with them as a fevered antagonist; Portnoy is simply crying out to be left alone, to be released from the claims of distinctiveness and the burdens of the past, so that, out of his own nothingness, he may create himself as a “human being.” Who, born a Jew in the 20th century, has been so lofty in spirit never to have shared this fantasy? But who, born a Jew in the 20th century, has been so foolish in mind as to dally with it for more than a moment?
It was impossible for Roth to recover what he never had, but Howe accused him of embracing the hollowness of what American Jewish life had become rather than trying to fill it.
This cultural shift also had a political undercurrent. Some of the New York intellectuals had undergone a political transformation from left-tending liberalism to neoconservatism. Having started out on the left, they understood its dangerous attractions and the corresponding need to protect American freedoms. Once opposed or indifferent to Zionism for its national backsliding from the international ideal, they discovered Israel and accepted responsibility for its defense. They were not all Cold Warriors to the same degree, but they wanted to bring down the Soviet Union. They were shocked by the radical assault on elite universities where some of them were now privileged to teach. Their disquiet intensified as protest against the war in Vietnam morphed into an attack on Western civilization. Though Howe continued to call himself a socialist, he was, like the others, culturally conservative, and he associated Roth with the radical impulse. He decried Roth’s vulgarity, by which he meant not the scatology or descriptions of masturbation but “the impulse to submit the rich substance of human experience, sentiment, value, and aspiration to a radically reductive leveling or simplification.” In Howe’s judgment, Portnoy’s Complaint violated the standards of civilizing refinement that the older Jewish intellectuals were trying to uphold.
My political sympathies were generally with the New York intellectuals—but the book made me laugh. I was learning to trust my own response when it contradicted that of my literary betters, and my artless reaction to Roth’s novel made me ready to defend him from Howe’s critique. I thought Howe had missed the whole point of the comedy: Laughter would explode the clichés of American Jewish culture, including the image of the arrested adolescent who was passing himself off as the typical Jewish male. Laughter was a therapeutic purge, part indictment, part confession, with curative potential. Portnoy’s mock-analysis culminates in the punch line: “So [said the doctor] Now vee may perhaps to begin. Yes?” This was both part of the comedy and its resolution. Alex was about to rise from the couch a somewhat steadied Jewish American male capable of love and happiness, as donor and recipient. I saw this work as a signpost on the road to the cultural and political maturity that the neoconservatives had already reached, and I expected Portnoy’s creator, the original klug man, to move on.
Was I right?
Irving Howe was proved spectacularly wrong in his assessment of Roth’s literary powers. Endlessly inventive, Roth may have bombed with the works that came in the immediate wake of Portnoy, such as Our Gang and The Breast, but the creation of Nathan Zuckerman in the late 1970s as a Roth stand-in served him for eight full novels, ranging in style from postmodern to traditional and in quality from passable to great. Roth proved fully capable of probing the human soul in tight novellas and epic sagas. And in a one-man literary Marshall Plan, he also generously sponsored the work of European authors—Tadeusz Borowski, Bruno Schulz, Danilo Kiš, Milan Kundera—and featured other writers in his fiction, reviving Anne Frank in one of his novels and including (then) living Israeli Aharon Appelfeld in another. We now know that serious heart problems curtailed the range but not necessarily the intensity of his writing. From book to book one never knew what to expect, so I acquired and read almost all of them.
It is harder to confront Roth’s effect on American Jewry. As noted, no other American writer was ever so closely associated with Jewish subjects and a Jewish readership, nor can one imagine Roth succeeding without them. Yet the attachment had not been his idea. When Roth’s designated biographer, Blake Bailey, said recently, “The Jewish thing was really what informed Philip as a writer,” he noted that the credit really went to George Starbuck, Roth’s first editor, who had been given a longer manuscript and discarded all but the stories with Jewish themes. Starbuck made the shrewd decision that Goodbye, Columbus would be about Jewish life in America at the time when Jews were all the rage. Roth said, “In many ways, George formed my career, because I didn’t know that I was a Jewish writer.” It was a shotgun wedding, not unlike Roth’s unhappy first marriage to Margaret Martinson, from which he was released by her death. He could not quit the Jewish union, however, without giving up the dowry of fame it had brought him, so he stayed to the end in the cheerless marriage.
Roth’s denial of meaningful Jewish attachment remained an essential feature of his writing, complicated by the lack of alternative, for unlike Russian Jewish writers like Boris Pasternak who turned to Christianity, he disliked Christianity even more than being a Jew. In a 1961 Commentary symposium on “Jewishness and the Younger Intellectuals,” the year after he had won the National Book Award for Goodbye, Columbus, Roth wrote that he could not distinguish a Jewish style of life different from the American urban and suburban middle classes, or any values separating Jews from others.
There does not seem to me a complex of values or aspirations or beliefs that continue to connect one Jew to another in our country, but rather an ancient and powerful disbelief, which, if it is not fashionable or wise to assert in public, is no less powerful for being underground: that is, the rejection of the myth of Jesus as Christ….And wherein my fellow Jews reject Jesus as the supernatural envoy of God, I feel a kinship with them.
Needless to say, this form of kinship is not a basis for any true affection. He then goes on to deny any other form of religious or cultural cohesion so that “we are bound together, I to my fellow Jews, my fellow Jews to me, in a relationship that is peculiarly enervating and unviable. Our rejection, our abhorrence finally, of the Christian fantasy leads us to proclaim to the world that we are Jews still—alone, however, what have we to proclaim to one another?”
It is one thing to nurse such a paltry idea of the Jewish people but much more troubling to use it as the basis of a literary career. Roth’s rejection of faith is the kind that many Jews admit to at the start of their cognitive and emotional development. Daniel Bell fondly recalled telling his rabbi that he could not have a bar mitzvah because he did not believe in God and having the rabbi answer, “Do you really think He cares?” But Roth’s starting point remained his endpoint: American Jews were Jewish only by negative definition. The influence of this idea is everywhere manifest among those liberal Jews who, while finding no inspiration in their own religious tradition, reflexively distrust true Christians, especially evangelicals even when (or especially when) they are Israel’s strong supporters. Their rejection of Christians supersedes and displaces their affection for fellow Jews. That this insults Christian honesty and undermines Jewish security is not as troubling as the mean defensiveness of those who actually hold such views. Roth could fall back on the privilege of the satirist. His cultural adherents have no such pretext.
Roth was just like the earlier generation of Jewish writers and intellectuals in remaining attached to his childhood, but its imagined inauthenticity left him stuck in a time warp. The work that shows off this emptiness to greatest disadvantage is the 2004 novel The Plot Against America. It reimagines what might have happened to Philip Roth’s actual family—father Herman, mother Bess, and brother Sandy—had Nazi sympathizer Charles A. Lindbergh become the Republican candidate for the presidency and defeated Franklin Delano Roosevelt in the 1940 election. The idea for such a dystopian fiction must have occurred to Roth because by the turn of the century anti-Semitism was once again on the rise in America, but he re-created an obsolete scenario instead of the real one. As had already been obvious for decades, the new aggression against the Jews originated in the Arab war against the Jewish state and had been couched since the 1960s in the slogans of Soviet anti-Zionism. The Zionism-racism accusation, pushed through by the Soviet-Arab axis at the United Nations, penetrated the United States from the left just as German-Nazi propaganda had once done from the right. The aggression had flipped political sides. Casting Palestinians as victims of Israeli imperialism and appropriating for them the role of refugee victim, a coalition of grievance and blame made common cause against Israel and against American Jews who supported their homeland. Rather than deal with this new threat, Roth retreated to his childhood politically, to take on the familiar Nazi bogeyman and refight the war that American troops had already won. He misidentified the target.
Fortunately, there were also times when Roth was able to fashion aspects of his “peculiarly enervating and unviable” relation to the Jews into masterworks. He did this by returning as Nathan Zuckerman to the familiar Newark of his childhood to treat as tragedy the spiritual hollow he had once subjected to satire. American Pastoral (1997) looks at Seymour “Swede” Levov, a fleshed-out version of Ron Patimkin, who innocently pursues and apparently achieves his idea of American success. The handsome Jewish Sports Hero marries the Gentile Beauty Queen, wins his reluctant father’s approval for the union, and settles down with his wife in the suburban paradise of Rimrock. A century earlier, Fyodor Dostoevsky wrote The Possessed to probe the emergence of Russia’s intellectual mercenaries, and Roth uses this unlikely setting to do the same for the American radicals of the late 1960s.
Meredith Levov…the “Rimrock Bomber” was Seymour Levov’s daughter. The high school kid who blew up the post office and killed the doctor. The kid who stopped the war in Vietnam by blowing up somebody out mailing a letter at five a.m. A doctor on his way to the hospital…
The Swede’s younger brother updates Zuckerman, his high-school classmate, who then searches out and brings us the full story: How could a good man like Seymour Levov, living out his version of paradise, breed a monster? But he does. Of course this embrace of violence in the name of salvation was not strictly a Jewish issue, but Roth showed privileged insight into how the escape from Jewishness formed part of it.
Roth attempted something on the same scale three years later in The Human Stain. The main setting is a New England college where Zuckerman has befriended one of the deans, the Jewish professor Coleman Silk, who is spuriously accused of insulting African-American students by using the term “spooks” to describe their ghostly disappearance from his class. In the ensuing purge, Silk is revealed to be a light-skinned African-American who, when he decided to pass, did so as a Jew, until then—at least outwardly—successfully. Roth manages to break out of his constraints as a Jewish writer through the story of an African American who is breaking out of his constraints as a black man, and in the process inevitably damages his family and himself in ways that Seymour Levov unwittingly does in Rimrock. Roth avoided the charge of political incorrectness that he would have incurred as a writer had he written about a Jewish professor by casting accusers and offender as black-on-black rather than black-on-Jew. Roth was careful never to offend the liberal hand that fed him even as he took on hot topics. He was shrewd as well as smart.
Through this entire career studded with prizes and fame, Roth never graciously accepted his designation as a Jewish writer, much less any implicit responsibility or affinity for the Jews or Israel. Whom was he denying? A sad feature of his life as a writer is that in never pretending to feel anything for the Jewish God, the Jewish homeland, or the Jewish people, Roth could not luxuriate in the affection and gratitude that many readers accorded him. At the heart of his fiction, hence of his standing as a writer, is distrust of Jewishness and secondarily of America as home to that Jewishness. Cold kasha. Adverse relation to one’s habitual subjects is not the best recipe for great art, and Roth did as well with it as anyone could, but I wish that after Portnoy if not before, he could have reached the threshold of love.
With the sadness that attended Roth’s retirement from writing in 2012 and his death in 2018 came the realization that his work was never joyful. Funny and witty certainly, vital and intelligent always, and highly entertaining, but never plainly happy in the way a well-matched bride and groom enchant family and guests at their wedding. I was startled to find in the essay quoted above that Irving Howe calls him “an exceedingly joyless writer, even when being very funny.” He saw this before I did.
Here is the Russian Jewish short-story master Isaac Babel (1894–1940) on Odessa, the “Newark” of his childhood:
If you think about it, [Odessa] is a town in which you can live free and easy. Half the population is made up of Jews, and Jews are a people who have learned a few simple truths along the way. Jews get married so as not to be alone, love so as to live through the centuries, hoard money so they can buy houses and give their wives astrakhan jackets, love children because, let’s face it, it is good and important to love one’s children.
Babel loved the Jews for what they were, the enjoyment of bourgeois pleasures being the best of their qualities. Babel loved being who he was despite the heavy price it exacted. Although he was first silenced and then tortured and killed at Stalin’s command, his work breathes happiness and joy. (With due respect for the difference, one thinks back to the legends of Rabbi Akiva that wrest laughter and joy from the great Destruction.) How is it that the modern Jewish writer who functioned under the most aversive moral and physical conditions should have cast himself as the harbinger of sunshine in Russian literature, whereas the novelist who benefited beyond all others from America’s freedom and opportunity should have put so little of its pleasures into his writing?
It might have been because Roth could never bring himself to say, “Damn right, America—I’m your Jewish writer, and thank you for letting me be proud of it!”
No, we’re not stagnated
Democrats find themselves in a state of confusion. Not only is there no clear favorite for the party’s 2020 presidential nomination, but it is also unclear what economic policies the party’s eventual nominee will put forward. Among the ideas currently being debated by progressive activists and wonks are free college tuition for all, expanding Medicare, heavily regulating or breaking up the big-tech platforms, and a universal basic income or jobs guarantee.
Yet wherever Joe Biden, Kamala Harris, Elizabeth Warren, Bernie Sanders, and whoever else climbs the greasy pole come down on these ideas, they will likely agree on at least one thing: While Trump and tax-cutting Trumponomics may be the immediate target of their ire, they will also argue that the U.S. economy has been on the wrong track for decades. Forget Ronald Reagan’s famous question to voters in 1980, “Are you better off than you were four years ago?” As Democrats see things, the American middle class is worse off than it was before Reagan took office. In their eyes, the pro-market tilt in U.S. economic policy since Reagan’s time—lower taxes, lighter regulation, freer trade—has resulted in little more than higher inequality, lower upward mobility, and middle-class income stagnation. The claim is no longer even remotely controversial on the left and is frequently repeated by its politicians as an incontrovertible fact. This 2011 speech by Barack Obama is typical:
There is a certain crowd in Washington who, for the last few decades, have said, let’s respond to this economic challenge with the same old tune. “The market will take care of everything,” they tell us. If we just cut more regulations and cut more taxes—especially for the wealthy—our economy will grow stronger…. But here’s the problem: It doesn’t work. It has never worked…. [Over] the last few decades, the rungs on the ladder of opportunity have grown farther and farther apart, and the middle class has shrunk…. This is about the nation’s welfare. It’s about making choices that benefit not just the people who’ve done fantastically well over the last few decades, but that benefit the middle class, and those fighting to get into the middle class, and the economy as a whole.
Or as Sanders summed it up at the 2016 Democratic National Convention: “This election is about ending the 40-year decline of our middle class.”
Interestingly, President Trump makes pretty much the same argument. As he said in his inaugural address: “For many decades, we’ve enriched foreign industry at the expense of American industry. . . . The wealth of our middle class has been ripped from their homes and then redistributed all across the world.” And again in his 2017 joint address to Congress: “I will not allow the mistakes of recent decades past to define the course of our future. For too long, we’ve watched our middle class shrink as we’ve exported our jobs and wealth to foreign countries.”
Trump has never been a Reagan fan, particularly on trade. As he said in March at a rally for congressional candidate Rick Saccone, “I loved [Reagan’s] style, his attitude. He was a great cheerleader for the country. But not great on the trade.” And in 1991 he testified before Congress against the 1986 Reagan tax reform, calling it an “absolute catastrophe for the country.”
It shouldn’t be surprising that Sanders and Trump agree on America’s supposed 40 years of economic woe. They’re both populists, and populists, whether in Venezuela or the United States, need to make a political case that goes beyond complaining about current circumstances. They must argue that the failure of the nation’s elites has been total, purposeful, and long-standing. The problem isn’t just Obamanomics, but Clintonomics, Bushonomics, and Reaganomics.
Economic facts, properly understood, simply do not support the argument that the broad American middle class has been stuck in neutral for nearly two generations. Now, it is true that Census data show real median incomes rising at an almost imperceptible 0.3 percent a year from the mid-1980s through 2013. At the same time, real per-person economic growth rose at a much quicker rate, nearly 2 percent a year. The difference between those figures reflects widening inequality: The rich got richer while the incomes of everyone else barely budged.
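Compounding makes the gap between those two rates vivid. Here is a minimal back-of-the-envelope sketch in Python; the roughly 28-year span (mid-1980s through 2013) and the rounded rates are assumptions for illustration, not figures computed from the underlying data.

```python
# Illustrative compounding of the two growth rates cited above.
# Assumed span: mid-1980s through 2013, roughly 28 years.
YEARS = 28

def cumulative_growth(annual_rate: float, years: int = YEARS) -> float:
    """Total percentage growth from compounding an annual rate."""
    return ((1 + annual_rate) ** years - 1) * 100

median_income = cumulative_growth(0.003)  # Census real median income, ~0.3%/yr
per_person = cumulative_growth(0.02)      # real per-person growth, ~2%/yr

print(f"Median income, cumulative:     {median_income:.0f}%")  # ~9%
print(f"Per-person output, cumulative: {per_person:.0f}%")     # ~74%
```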
But only partisans think those numbers truly reflect the economic realities of the typical American family.
A University of Chicago poll of top economists found that 70 percent agreed with the proposition that the Census Bureau’s conclusion “substantially understates how much better off people in the median American household are now economically, compared with 35 years ago.” The economist Martin Feldstein, for instance, argues that the agency fails to take into account shrinking household size, the rise in government-benefit transfers, and changes in tax policy. It also measures inflation in a way many experts think overstates the actual rise in living costs. The Census Bureau uses the common consumer price index, but many economists favor something called the personal-consumption-expenditures price index, viewing it as a more reliable and comprehensive measure. And the PCE typically shows a lower inflation rate than the CPI.
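Because the deflator compounds over decades, even a few tenths of a point of difference between CPI and PCE inflation adds up. The sketch below is illustrative only; the nominal wage growth and inflation rates are assumed round numbers, not figures from the Census Bureau or the economists cited.

```python
# How the choice of deflator changes measured real wage growth.
# All rates below are assumptions for illustration.
YEARS = 35
NOMINAL = 0.040  # assumed nominal wage growth per year
CPI = 0.032      # assumed CPI inflation per year
PCE = 0.029      # assumed PCE inflation per year (typically below CPI)

def real_growth(nominal: float, inflation: float, years: int = YEARS) -> float:
    """Cumulative real growth (%) after deflating nominal gains."""
    return (((1 + nominal) / (1 + inflation)) ** years - 1) * 100

print(f"Deflated by CPI: {real_growth(NOMINAL, CPI):.0f}%")  # ~31%
print(f"Deflated by PCE: {real_growth(NOMINAL, PCE):.0f}%")  # ~45%
```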
One organization that does take all of this into account is the Congressional Budget Office. In March, CBO released a study that calculated much stronger gains for the broad middle class—which I’ll define here as the 21st to 80th income percentiles. One way to look at how that group is doing is by calculating “income before transfers and taxes”—roughly, market incomes plus social-insurance benefits such as Social Security and Medicare. Measured in this way, middle-class incomes rose 28 percent from 1980 through 2014. That may not be blazing-fast growth, but it’s nearly five times larger than the number offered by the Census Bureau.
Then the CBO looked at “income after transfers and taxes”—market income plus social-insurance benefits plus means-tested transfers (Medicaid, food stamps) minus federal taxes. This more fully captures all the economic resources the American middle class commands. And it found that middle-class income increased 42 percent since 1980. More impressive still: Incomes for the bottom fifth are up nearly 70 percent. This is not the stagnated America that the populists have been telling us about.
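For concreteness, here is how those two CBO income concepts fit together, sketched with made-up numbers for a single hypothetical household (the dollar figures are assumptions, not CBO data).

```python
# Toy illustration of the CBO's two income concepts for one household.
# All dollar amounts are invented for the example.
household = {
    "market_income": 52_000,          # wages, business, and capital income
    "social_insurance": 6_000,        # e.g., Social Security, Medicare
    "means_tested_transfers": 3_000,  # e.g., Medicaid, food stamps
    "federal_taxes": 7_000,
}

before = household["market_income"] + household["social_insurance"]
after = before + household["means_tested_transfers"] - household["federal_taxes"]

print(f"Income before transfers and taxes: ${before:,}")  # $58,000
print(f"Income after transfers and taxes:  ${after:,}")   # $54,000
```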
And remember, these numbers compare the middle class today with that of decades ago. But these are not the same families and households. A 2016 Urban Institute study by Stephen Rose found that 38 percent of American families in 1979 were middle class (defined as households earning between $50,000 and $100,000 annually, adjusted for inflation) vs. 32 percent in 2014. That sounds terrible. What happened to all those middle-class families?
The study divided households into five income groups: poor, lower middle class, middle class, upper middle class, and rich. Of those groups, the bottom three got smaller over the decades while the top two grew. The ranks of the poor shrank by 4.5 percentage points, the lower middle class by 6.8, the middle class also by 6.8 points. But the upper middle class got a lot bigger, expanding by 16.5 points, while the rich grew by 1.7 points. So what happened to the middle class? It disappeared because it got richer. There has not been a middle-class meltdown. There’s been a melt-up.
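Since every household lands in one of the five groups, the percentage-point shifts must net out to roughly zero. Here is a quick bookkeeping check on Rose’s figures (the residual 0.1 point is presumably rounding):

```python
# Percentage-point changes in group shares, 1979 -> 2014 (Rose, Urban Institute).
shifts = {
    "poor": -4.5,
    "lower middle class": -6.8,
    "middle class": -6.8,
    "upper middle class": +16.5,
    "rich": +1.7,
}

shrank = sum(v for v in shifts.values() if v < 0)
grew = sum(v for v in shifts.values() if v > 0)
print(f"Bottom three groups: {shrank:+.1f} points")  # -18.1
print(f"Top two groups:      {grew:+.1f} points")    # +18.2
```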
Confronted with these statistics about income, stagnationists tend to narrow the focus and say that what really counts is worker wages, good old-fashioned take-home pay. And they will often produce charts showing that the typical American worker makes no more than he did in 1975. But they are choosing the wrong inflation measure, which makes a tremendous difference when evaluating the true purchasing power of workers. A 2017 study by the Dartmouth economist Bruce Sacerdote, for instance, finds that real wages grew by at least 24 percent since the Ford administration, and perhaps much more. “Estimates of slow and steady growth seem more plausible than media headlines, which suggest that median American households face declining living standards,” Sacerdote concludes.
And that steady growth continues to allow most Americans to live the American Dream, if you define the Dream as each generation being wealthier than the one before. You would be forgiven for thinking this is not the case. Last year, the superstar economist Raj Chetty and his team made headlines with a study that compared the incomes of 30-year-olds starting in 1970 with the earnings of their parents at the same age. The researchers found that in 1970, 92 percent of American 30-year-olds earned more than their parents did at a similar age, versus just 51 percent in 2014. “The likelihood that young adults will earn more than their parents has plummeted”—that is how the Associated Press summarized the findings.
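The statistic itself is simple once parent-child income pairs are matched: count the share of children who out-earn their parents at the same age. The sketch below uses invented toy data purely to show the comparison logic; the actual study linked tax and Census records across decades and adjusted for inflation.

```python
# Absolute-mobility logic in miniature: share of children earning more
# than their parents did at the same age. Incomes below are invented.
parent_child_pairs = [
    (45_000, 52_000),
    (60_000, 58_000),
    (30_000, 41_000),
    (80_000, 75_000),
]

beat_parents = sum(child > parent for parent, child in parent_child_pairs)
share = beat_parents / len(parent_child_pairs)
print(f"Share earning more than their parents: {share:.0%}")  # 50%
```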
Yet this is really a worst-case interpretation of the data. Other economists raised issues concerning the study’s assumptions about inflation, the role of taxes and transfers, and whether looking at adult children at age 40 might have been more relevant than age 30 given that more Americans are starting their working life later than they did decades ago. Indeed, a follow-up analysis by researcher Scott Winship finds that “roughly three in four adults—and the overwhelming majority of poor children—live better off than their parents after taking the rising cost of living into account.”
But set the data aside for a moment. The idea that most Americans are worse off than they were in the 1970s seems intuitively nonsensical to those of us who were living back then. As former Obama economic adviser Jason Furman once put it: “ignore the statistics for a second and use your common sense. Remember when even upper-middle-class families worried about staying on a long distance call for too long? When flying was an expensive luxury? When only a minority of the population had central air conditioning, dishwashers, and color televisions?”
Or look at smartphone ownership. Nearly 80 percent of Americans have amazing panes of glass in their pockets, something that didn’t exist in 1980 or 2000. How many of us would choose to live in a pre-smartphone era even with a substantially higher income? A thought experiment by Washington Post reporter Matt O’Brien neatly gets at the issue: “Adjusted for inflation, would you rather make $50,000 in today’s world or $100,000 in 1980’s? In other words, is an extra $50,000 enough to get you to give up the Internet and TV and computer that you have now? This might be the best way to get a sense of how much better technology has made our lives—not to mention the fact that people are living longer—the past 35 years.”
Of course, the median or typical family isn’t every family or every person. Some groups, such as working-class men, may well have seen their living standards stagnate. And then there are the communities hurt by changes in world trade patterns that never really bounced back. But that is a narrower argument than the one the stagnationists are making—it’s not the 1-percent-versus-99-percent indictment that progressives and populists press. Their point is that pro-market policies have failed for most Americans for two generations and are thus discredited. Or as Vice President Mike Pence has put it, “the free market has been sorting it out and America’s been losing.” Bernie Sanders couldn’t have said it better.
The populists of the left and right agree that America’s golden age was in the immediate postwar decades when taxes were high, unions were strong, and economic growth was rapid. It is against that period that populists judge the economy of more recent decades. But policymakers can’t just dial up the Wayback Machine and return to the supposed Baby Boomer paradise of the 1950s and 1960s. The post–World War II decades were affected by a host of unrepeatable factors, the most important of which was that America’s economic competitors were recovering after a global war. A National Bureau of Economic Research study described the situation this way: “At the end of World War II, the United States was the dominant industrial producer in the world,” at one point responsible for nearly 60 percent of the world’s output. “This was obviously a transitory situation.”
Not only have our competitors since recovered and thrived, but globalization has brought billions of new workers into the global labor market and raised their standards of living more rapidly than the world has ever seen. Fixating on the past and drawing the wrong lessons from economic history will only leave American workers ill-prepared to meet those challenges. And if that happens, the stagnationists of the populist left and right may finally be correct.
How Justice Anthony Kennedy’s jurisprudence of dignity came full circle
You might think this quote comes from Justice Kennedy’s majority opinion in Masterpiece Cakeshop v. Colorado, in which seven of the Court’s nine members agreed to strike down Colorado’s punishment of a Christian baker for refusing to create a custom-made cake celebrating a same-sex wedding. But in fact, the quote is from Justice Kennedy’s opinion in the 1996 case Romer v. Evans, a case involving a Colorado law that protected the right of private companies to discriminate against homosexuals.
Kennedy’s attempt to frame his constitutional protection for homosexuals in terms of human dignity for all against what he saw as the unreasonable animosity of Colorado’s popular majority was controversial in 1996. In his Romer dissent, Justice Antonin Scalia wrote that “the Court’s opinion is so long on emotive utterance and so short on relevant legal citation.” And it is no less controversial today. But it became the basis for Kennedy’s generation-long work of expanding constitutional protection for homosexuals. He began with Romer. Then came the Court’s 2003 announcement of a right to engage in homosexual sodomy. That was followed by the Court’s 2013 decision striking down the federal Defense of Marriage Act. All this was capped off by the Court’s 2015 announcement of a constitutional right to same-sex marriage. Kennedy authored all of these opinions for the Court, elaborating what might be called a “jurisprudence of dignity.”
Perhaps it is ironic, then, that Justice Kennedy invoked his jurisprudence of dignity in the other direction in Masterpiece Cakeshop. He wrote for the Court not in favor of the homosexual couple who demanded that Jack Phillips bake a cake celebrating their wedding, but rather in favor of Phillips. But it is unsurprising—except, perhaps, among those who believe that traditional religious beliefs are themselves undignified.
In 2012, Charlie Craig and Dave Mullins visited the Masterpiece Cakeshop in Lakewood, Colorado. They were planning to marry. Though same-sex marriage was not yet legal in Colorado then, it was legal in Massachusetts; so they planned to marry in Massachusetts, then return home to Colorado to celebrate with friends and family. And a cake.
“I’ll make your birthday cakes, shower cakes, sell you cookies and brownies,” Mr. Phillips told them, according to the Supreme Court’s opinion. “I just don’t make cakes for same sex weddings.” The reason for Phillips’s refusal was straightforward. “Jack Phillips is an expert baker who has owned and operated the shop for 24 years,” the Court explained. “Phillips is a devout Christian. He has explained that his ‘main goal in life is to be obedient to’ Jesus Christ and Christ’s ‘teachings in all aspects of his life.’ … And he seeks to ‘honor God through his work at Masterpiece Cakeshop.’” Which, consistent with the Christian Bible’s teachings and tradition, does not include celebrating one man marrying another.
Another couple might have rejected Phillips’s offer and vowed never to buy another cupcake from him ever again. But Craig and Mullins took another approach: They called upon the State of Colorado, through the state’s Civil Rights Commission, to punish Phillips for what he’d done.
Colorado agreed. Rejecting Phillips’s invocation of his First Amendment rights to free speech and free exercise of his religion, the state’s commission ordered Phillips to “cease and desist from discriminating against [Craig and Mullins] and other same-sex couples by refusing to sell them wedding cakes or any product [he] would sell to heterosexual couples.” Which is to say, the state ordered Phillips to prepare the customized wedding cakes on demand. And the Colorado courts agreed, leaving Phillips to petition the U.S. Supreme Court for relief.
The petition asked the Court to consider “whether applying Colorado’s public accommodations law to compel Phillips to create expression that violates his sincerely held religious beliefs about marriage violates the Free Speech or Free Exercise Clauses of the First Amendment.” Had the Court grappled fully with either of those questions, the result could have been a landmark decision. Kennedy and his colleagues could have ruled in favor of Colorado and the couple, thus removing the First Amendment as a shield against the advancement of same-sex-marriage rights and interests through neutrally worded anti-discrimination laws. Or they could have ruled broadly in favor of Phillips, thus blocking the use of those neutral anti-discrimination laws as a sword against traditional religious believers.
But those seemed to be questions on which the Court’s nine justices would probably be sharply divided—likely four on one side, four on the other, with Justice Kennedy in the middle. So the Court resolved the case on narrower grounds for which there was a substantial majority of justices in agreement. Instead of deciding whether the right to free exercise of religion in such expressive commercial contexts always trumps state anti-discrimination laws on the question of same-sex marriage, the Court held merely that this particular administrative proceeding violated Phillips’s constitutional free-exercise right. Why? Because the Commission’s decision seemed motivated, at least in part, by sheer animosity to religion.
“At several points during its meeting,” the Court explained, “commissioners endorsed the view that religious beliefs cannot legitimately be carried into the public sphere or commercial domain, implying that religious beliefs and persons are less than fully welcome in Colorado’s business community.” In one of the hearings on the Phillips case, a commissioner aimed an astonishingly hostile attack at Phillips’s religious motivation that none of her fellow commissioners disclaimed or disputed:
Freedom of religion and religion has been used to justify all kinds of discrimination throughout history, whether it be slavery, whether it be the Holocaust, whether it be—I mean, we—we can list hundreds of situations where freedom of religion has been used to justify discrimination. And to me it is one of the most despicable pieces of rhetoric that people can use to—to use their religion to hurt others.
This was too much for the Justices to abide, and the Court’s decision pivoted on it. “To describe a man’s faith as ‘one of the most despicable pieces of rhetoric that people can use’ is to disparage his religion in at least two distinct ways,” Kennedy’s majority opinion explained: “By describing it as despicable, and also by characterizing it as merely rhetorical.… This sentiment is inappropriate for a Commission charged with the solemn responsibility of fair and neutral enforcement of Colorado’s anti-discrimination law.” The Court was also troubled by the starkly different treatment that the commission afforded bakers who refused to bake cakes with messages critical of same-sex marriage.
Taking the Commission’s statements and disparate conduct together, the Court concluded that Colorado’s action against Phillips was unconstitutionally motivated by hostility toward his religious beliefs. “The Constitution ‘commits government itself to religious tolerance,’” the decision said, “and upon even slight suspicion that proposals for state intervention stem from animosity to religion or distrust of its practices, all officials must pause to remember their own high duty to the Constitution and to the rights it secures.” In this particular case, in light of the Colorado Commission’s actions and stated motivations, “the Commission’s consideration of Phillips’ case was neither tolerant nor respectful of his religious beliefs.”
And so the Court reversed the Colorado court’s decision affirming the state commission, but in so doing, the majority opinion took care to stress that its decision does not necessarily preclude states from applying anti-discrimination laws in a way that burdens the exercise of religion:
The outcome of cases like this in other circumstances must await further elaboration in the courts, all in the context of recognizing that their disputes must be resolved with tolerance, without undue disrespect to sincere religious beliefs, and without subjecting gay persons to indignities when they seek goods and services in an open market.
We can only imagine the number of justice-hours that went into crafting that paragraph to win the approval of seven justices. The triple negative—“without undue disrespect”—is particularly pregnant: Why not just “with due respect” for sincere religious beliefs? Indeed, is the Court implying that in the conflict between same-sex marriage and the exercise of traditional religion, there is such a thing as “due disrespect” for religious belief, and that it would be constitutionally permissible?
While the opinion was written by Kennedy, the narrowness of the ruling and the breadth of the majority joining it are hallmarks of Chief Justice Roberts’s tenure leading the Court. As Roberts explained to a Rice University audience in 2012, “I think the broader agreement you can get on the Court, the better. And the way you get to broader agreement is to have a narrower decision.” In cases ranging from the constitutionality of the Voting Rights Act’s preclearance provisions, to abortion protests, to even the Affordable Care Act’s attempt to force states to expand Medicaid, Chief Justice Roberts’s power to assign majority opinions seems to have facilitated the building of majorities across normal ideological lines—broad agreements on narrow propositions. The Court’s decision in Masterpiece Cakeshop seems to be the latest case in that line.
But in considering analogous lines of precedent for the Court’s approach in Masterpiece Cakeshop, the most pressing and portentous precedents might be the Court’s recent affirmative-action decisions. In those cases, as in Masterpiece Cakeshop, the Court declared a bright-line rule prohibiting unconstitutional actions by the states. But in the affirmative-action cases, the initial bright line became a road map for state officials to avoid detection rather than a deterrent. For example, the Court declared in Grutter v. Bollinger (2003) that “a race-conscious admissions program cannot use a quota system” and “may consider race or ethnicity only as a ‘plus’ in a particular applicant’s file.” But instead of reducing race-centric state-university admissions, Justice Sandra Day O’Connor’s opinion for the Court had the opposite effect, as Richard Sander and Stuart Taylor Jr. highlighted in their empirical study, Mismatch: “Selective schools around the country interpreted Grutter as a green light to use preferences aggressively and mechanistically so long as they did not overtly use ‘quotas’ or ‘points.’”
So it may be with Masterpiece Cakeshop. The most immediate lesson learned by state anti-discrimination officials may not be to avoid hostility toward religion, but rather to avoid boasting about their hostility. The Court’s opinion in Masterpiece Cakeshop focused on hostility per se, not just the bureaucratic admission of hostility. Accordingly, for this legal test to maintain any meaningful viability, it will require federal and state courts to look beyond mere appearances.
When state officials punish businesses and people for conduct rooted in religious belief and reject the defendants’ invocation of the First Amendment as protecting their exercise of religion, the courts will need to look closely to ensure that the state officials are not actually motivated by anti-religious bias. Patterns of disparate treatment like that in Masterpiece Cakeshop, where a religious baker’s punishment stood in stark contrast to the liberty enjoyed by bakers of opposite beliefs, would speak louder than bureaucratic words.
But above all else, Kennedy’s opinion exemplifies the theme at the heart of all of his gay-rights opinions: the importance—indeed, the constitutional importance—of protecting vulnerable minorities from hostile, prejudiced majorities. Except that in this case, the vulnerable minority wasn’t a same-sex couple but a Christian baker.
First came 1996’s Romer. Then, in striking down Texas’s criminal statute against homosexual sodomy as an unconstitutional burden on personal liberty in Lawrence v. Texas (2003), Kennedy’s opinion stressed that the Court’s own prior acceptance of such laws “demeans the lives of homosexual persons.” Kennedy returned to these themes in U.S. v. Windsor (2013). He wrote the opinion for the Court against the constitutionality of the federal Defense of Marriage Act (DOMA), which had defined marriage, for purposes of federal law, exclusively in terms of the marriage of a man and a woman. “DOMA’s principal effect is to identify a subset of state-sanctioned marriages and make them unequal,” he wrote, and this “differentiation demeans the [same-sex] couple.” And so the Court declared DOMA unconstitutional, “for no legitimate purpose overcomes the purpose and effect to disparage and to injure” same-sex couples.
While Kennedy nominally limited his analysis to a federal law on marriage (a subject long governed primarily by the states), his opinion seemed to point clearly to the wrongness of traditional state marriage laws. Prominent law professors encouraged him in this aim by attempting to tie his novel jurisprudence-of-dignity narrative back to well-respected Supreme Court precedents, especially Brown v. Board of Education. Bruce Ackerman’s We the People: The Civil Rights Revolution (2014) reframed the Court’s nullification of state-enforced racial segregation in terms of “the distinctive wrongness of institutionalized humiliation.” Of the Windsor case, Ackerman wrote, “Justice Kennedy’s opinion was simply a restatement of Brown’s anti-humiliation principle.” Ackerman was joined in this work by NYU’s Kenji Yoshino and others who attempted to map the “anti-humiliation principle” onto American constitutional law, building a case against traditional marriage laws.
When the Court took the final step of declaring a constitutional right to same-sex marriage a year later, in Obergefell v. Hodges (2015), Kennedy’s opinion for the Court yet again struck these chords: “There is dignity in the bond between two men or two women who seek to marry and in their autonomy to make such profound choices. … They ask for equal dignity in the eyes of the law. The Constitution grants them that right.”
Scholars declared victory and looked ahead to how this anti-humiliation principle might further be used against oppressive state and federal governments. Harvard’s Laurence Tribe argued that the anti-humiliation principle “signals the beginning of the end for discrimination on the basis of sexual orientation in areas like employment and housing.” Yoshino, in the prestigious “Foreword” essay opening the Harvard Law Review’s 2015–16 volume, declared that Obergefell heralds a “New Birth of Freedom,” perhaps reaching as far as reproductive rights. “Of course,” he added, “what counts as a ‘subordinated group’ will be up for debate.”
Indeed, it would be. Yoshino expressly held open the possibility that people forced to serve same-sex weddings, “such as the florist or restaurateur who does not wish to cater a gay wedding,” might indeed be a “subordinated group” claiming protection against government humiliation.
Too few progressives took this point seriously. It was an ironic oversight. Many of those who declare traditional religious views to be decreasingly popular in America, and who presume that religious believers are destined to become an irrelevant minority in American public life, fail to see that the very same trajectory could make traditional religious Americans the sort of discrete and insular minority that is at risk of oppression and—yes—“humiliation” at the hands of an energized majority. They would thus be precisely the sort of group that would receive heightened protection from the Court. Perhaps Justice Kennedy’s opinion in Masterpiece Cakeshop will compel proponents of same-sex marriage to take the point seriously at last.
But at the same time, those who celebrate Masterpiece Cakeshop should pause and consider the implications of this path. Justice Kennedy worried this year about a state’s “undue disrespect to sincere religious beliefs.” More worrisome is what future judges might deem to be “due disrespect” for them.
The days of the Islamic Republic of Iran may be drawing to a close. What next?
The death of the actor Nasser Malek Motiee triggered the latest explosion in May. Before the 1979 revolution put the kibosh on his career, Motiee had been a fixture of the potboilers, police procedurals, and lusty comedies (Black-Clad Mehdi and the Hot Pants!) known collectively as “Film Farsi.” In the 1969 noir Qeysar, he played a butcher who sets out to avenge his sister’s rape, only to be stabbed to death by her assailants. “Qeysar!” the butcher cries out to his brother, the titular antihero of the film. “Where are you? They’ve killed your brother!”
Thousands flocked to Motiee’s funeral in Tehran, though he had been the subject of a media blackout and, save for a single role in 2014, hadn’t been permitted to appear on the silver screen for four decades. “Our state-run media is our disgrace!” his fans chanted at his funeral. Met with tear gas and the truncheons of security forces, they put a twist on Motiee’s best-known line: “Qeysar! Where are you? They’ve killed the people!”
Meanwhile, Iranian truck drivers have been on a nationwide strike for more than a week as of this writing. The drivers park their trucks on long stretches of highway and block access to gas stations and government buildings in protest against low wages, road tolls, and benefits cuts. They aren’t alone. Teachers, steelworkers, hospital staff, railway employees, and sugar-factory hands are among the other groups that have walked off the job over the regime’s apparent refusal to spread the nuclear-deal “butter” promised by the Obama administration—released Iranian assets that might total as much as $150 billion.
Women are removing their headscarves in defiance of compulsory veiling. Often, security forces hesitate to confront them directly, lest they incur the wrath of the public, though most of the women are identified and arrested after the fact. This gesture of feminine resistance, which first emerged during a mass uprising in December and January, has now become commonplace. And while the New Year’s uprising was suppressed, smaller, more scattered demonstrations continue to break out, forcing the regime to play whack-a-mole with dissidents.
The furies of the present have joined forces with the ghosts of the past. In April, a construction worker excavated a mummified body near the tomb of a Shiite saint in southern Tehran. The mummy appeared to resemble the corpse of Reza Shah Pahlavi, the founder of modern Iran, last seen in sepia-toned newspaper photographs decades earlier. The regime’s sketchy reaction—first confirming and later denying the rumors and eventually confiscating the mummy—only intensified the fervid speculation roiling the streets. Chants of “Long live Reza Shah!” rang out from soccer stadiums. Footage posted to social media showed a lion-and-sun flag, Iran’s traditional monarchic standard, fluttering high above a major thoroughfare in the city of Karaj.
President Trump’s decision to withdraw the U.S. from the nuclear deal will no doubt compound the pressures bearing down on the mullahs. While regime change is not on the American agenda, the Islamic Republic may enter its twilight of its own accord. Make no mistake: The process could take years. The exact shape of events is impossible to foresee. Even so, American policy must prepare for the possibility. The end of Islamist rule in Iran would be a world-historical event and an unalloyed good for the country and its neighbors, marking a return to normalcy four decades after the Ayatollah Khomeini founded his regime.
But what exactly is that normal? Some in the West hope that events in Iran today will revive the spirit of 1989. A liberal flowering in Iran would redeem the Arab Spring, the rise of populists in Central and Eastern Europe, and America’s own Trumpian turn, among other recent disappointments. What better proof that history tends toward liberalism than the land of the scowling ayatollahs going liberal democratic?
Such velvety dreams are unlikely to materialize, however. Policymakers in Washington and other Western capitals would be wise to gird themselves for the more realistic outcomes for an Iran after the mullahs.
For more than two millennia, the unchanging principle of Iranian political life was estebdad, or arbitrary rule, and so it remains today. One defining feature was state ownership of all land. The state could grant plots to various classes as a special privilege but never as a matter of right. Moreover, all economic activity, agricultural or otherwise, involved winning the favor of the state; what the state gave, the state could take away. The implications for Iran’s political development were profound.
“Social classes did not enjoy any rights independent from the state,” the Oxford historian Homa Katouzian has persuasively argued, and “there was no law outside the state, which stood above society, despite a body of rules that were subject to rapid and unpredictable change.” Thus, “unlike in Europe, the state’s legitimacy was not founded in law and the consent of influential social classes.” From the satrap to the peasant, all lived in fear of and at the mercy of the state.
Pre-Islamic Persia had laws, to be sure, and with the Arab conquest came an elaborate religious code governing nearly every aspect of life. Yet neither the pre-Islamic law nor Shariah could order the relationship between state and society. Neither could act as a constitutional or fundamental law, a concept that simply didn’t exist in Iran. As Katouzian notes, “this is what made the arbitrary exercise of power possible, indeed normal.” State agents could punish without license from Shariah—or decline to enforce Shariah precepts when it pleased them.
The arbitrariness of power extended to its source at the throne. Rulers exercised power because they possessed divine grace, and they possessed divine grace because they exercised power. Rebellion was thus a fine way to seize power, so long as you succeeded. If you didn’t, you might have been beheaded if you were lucky, or had boiling oil poured down your throat if you weren’t. With no formal rules of primogeniture, the death of each shah triggered a succession crisis. The heirs-designate blinded or castrated male siblings to secure their own ascent to the throne.
Estebdad has left deep imprints on the Iranian mind. It is estebdad that must be credited for the genius of Persian poetry and literature and wit, so much of which said obliquely and elliptically what couldn’t have been said forthrightly, lest the writer get the boiling-oil treatment. Iranian manners, too, owe much to estebdad: The affected deference, that circular way of dialogue, the maddening refusal to speak directly—the key to all of these things is probably fear of arbitrary power. The downside is that estebdad has foreclosed the possibility of social trust.
Outsiders sometimes observe that while individual Iranians shine in every human endeavor, from art and literature to medicine and engineering, they rarely work well together. On the soccer field, for example, individual stars achieve heroic feats of kicking and dribbling, but Iranian sides founder before more cohesive foreign teams. From an early age, every Iranian boy is told that he is a little shah, and he grows up to encounter legions of other little shahs, all of whom live under the established, inescapable fact of the shah.
The main political consequences of estebdad were disorder and discontinuity. There were good shahs, great ones even. And there were bad ones. The problem was that government was never established on a principle or set of principles. There were no Permanent Things. Adalat, justice, wasn’t something that could be baked into a system. The best one could hope for was a just shah. Everything depended on the character and personality of the man sitting on the Peacock Throne. As political actors, Iranians toggled between high passion and magical idealism, on the one hand, and cynical passivity, civic indolence, and shocking venality, on the other. There was no moderate mean between these two extremes.
So it was that, when Western-style modernity and nationalism arrived, Iranians were caught flat-footed. Two and a half millennia earlier, Persia had been the superpower of its day. But by the late 19th century, the country had reached a nadir. It was a time of illiteracy, malaria, and poverty, and the nation, especially its intellectual elite, was newly awakening to Iran’s dilapidation, material and spiritual. Shame as much as pride thus fueled the nascent Iranian nationalism. A poem of the era summed up the state of affairs:
Our army the laughingstock of the world.
Our princes deserving of the pity of beggars.
Our clerics craving the justice of the unbelievers.
Our towns each a metropolis of dirt.
Thanks to European imperialism and early globalization, Iranians came into closer contact with the West than ever before, and this only heightened their sense of humiliation and inadequacy. Diplomats, Orientalists, concessionaires, and missionaries brought with them the seeds of modernity along with their own commercial, scholarly, and imperial ambitions. These developments triggered an unprecedented legitimacy crisis in Iran at the turn of the 20th century.
Western-educated elites clamored for mashrutiat, government that was “conditional” on the consent of the people. Similar ideas percolated among some of the ulama, the high priests of Shiite Islam. Drawing on pan-Islamist ideas then gaining currency across the Middle East, leading ulama called for lawful government in which “the people—be they shah or beggar—would be equal,” as one influential cleric put it.
In 1906, the Majlis, or parliament, was established. But Iran’s brief experiment with constitutionalism was a disaster. The great powers, Moscow especially, were hostile to constitutionalism. The forces of estebdad wouldn’t relinquish power so easily. And the constitutionalists were bitterly divided among themselves. The two decades that followed were marked by foreign invasion, tribal rebellions, and license instead of ordered liberty. Soon self-government came to be associated with terror, famine, and chaos.
In the early 1920s, an ambitious officer named Reza Khan stabilized the country’s borders, put down various rebellions, and forged a new nation-state from the shabby remains of the Persian Empire. The Majlis declared him shah in 1925, and he was crowned the following year. He dragged Iran, kicking and screaming, out of the depths of backwardness. The oil era had already dawned (in 1901), and the flow of black gold quickened his various projects. Roads were built, universities founded, a modern civil service born, even a new calendar adopted. Civil law and secular lawyers eclipsed Shariah and the clergy. Women were liberated, according to Reza Shah’s lights, whether they liked it or not. Estebdad remained the supreme principle, though it gradually softened, particularly under his son, Mohammad Reza Shah, the last monarch, who ascended the Peacock Throne in 1941 following his father’s abdication.
Reza Shah’s project would end six decades later in the Islamic Revolution. But how did Khomeini pull it off? Under the Pahlavis, Iranians had achieved an unprecedented degree of prosperity and social mobility. Toward the end, in the 1960s and ’70s, they grew accustomed to double-digit growth, vacations abroad, children educated at universities in Europe and America, international prestige. Life was good. Yet millions of Iranians managed to convince themselves that they would be better off with Khomeini at the helm. This was political ingratitude on an incomprehensible scale.
Khomeini’s powers of deception can’t be overstated. Few of those who supported him, particularly among the middle classes, appreciated that they were about to replace a benign autocracy with an Islamist state. Yet deception on a mass scale is impossible without a strong appetite for it on the part of the deceived.
Recall that estebdad had yielded centuries of disorder and discontinuity. Dynasties and shahs came and went, but there was nothing solid to hold on to. The pace of disruption and discontinuity accelerated under the Pahlavis. The prosperity and stability of the era were real enough. But modernity handed down from on high was dizzying. Mohammad Reza Shah, especially, lost sight of how conservative his people really were. Perhaps Iran wasn’t ready for Black-Clad Mehdi and the Hot Pants! and social-insurance schemes for Tehran prostitutes. Perhaps it wasn’t wise for the shah to be known to cavort with Madame Claude’s girls.
In 1971, the shah attempted to paint something like a vision of continuity with his celebrations of 2,500 years of Persian monarchy. He had the right idea anyway, though in execution it entailed little more than a decadent party in the desert. Khomeini’s vision of Islamic justice, melded with vague leftish talk about the triumph of the dispossessed, was more enticing. Amid the “confusion of a people of high medieval culture awakening to oil and money,” as V.S. Naipaul described Iran’s revolutionary generation, Khomeini promised community, enchantment, and, above all, continuity with a wholesome Islamic past.
Yet the Islamic Republic proved even more destabilizing and discontinuous with Iranian history than had the dynasty it replaced. Resurrecting the rule of the warrior-imams of the seventh century and fashioning a sort of neo-Islamic Man called for a police-and-surveillance state that was utterly alien to Iranians. Islamic continuity, moreover, came at the expense of national pride and memory. Khomeini and his followers had no love for the pre-Islamic elements of Iranian identity, and like all totalitarians, they set out to erase whatever was incongruous with their ideology.
A state that exercised arbitrary power was one thing; a state that sought to reshape the soul quite another. The people lost the individual and social liberties they had enjoyed under the shah but gained none of the justice and stability they pined for. The new regime made life a misery in the name of ideology while retaining all of the venality and corruption of a classical Persian court. Forty years later, Iranians have had more than their fill of the Islamic Republic.
The key to Iran’s political future lies in the tension between the ineluctability of estebdad and the longing for continuity. If the Islamic Republic is to give way to a decent order, sooner rather than later, Iranians must resolve the dilemmas that have brought them to this point. This requires honesty and a willingness to read Iranian history as it really is.
First, Iranian political culture demands a living source of authority to embody the will of the nation and stand above a fractious and ethnically heterogeneous society. Put another way, Iranians need a “shah” of some sort. They have never lived collectively without one, and their political imagination has always been directed toward a throne. The constitutionalist experiment of the early 20th century coexisted (badly) with monarchic authority, and the current Islamic Republic has a supreme leader—which is to say, a shah by another name. It is the height of utopianism to imagine that a 2,500-year-old tradition can be wiped away.
The presence of a shah needn’t mean the absence of rule of law, deliberative politics, or any of the other elements of ordered liberty that the West cherishes in its own systems. As the late Bernard Lewis insisted when speaking of the Arab world, it is possible to have freedom and deliberation and checks and balances within nonrepresentative, nondemocratic institutions. Iran has had a Majlis for more than a century, at various points during which the body operated as a genuine legislative chamber. In a post–Islamic Republic Iran, the Majlis can be revived as a true legislative body. But a revitalized Majlis wouldn’t obviate the need for a living authority, an ultimate guarantor of the state and of Iranian freedom.
A shah, moreover, can galvanize opposition to the current regime. The failed 2009 Green uprising and the more recent New Year’s revolt showed that while leaderless mass movements can lay bare the regime’s legitimacy deficit, they can’t finally overthrow the Islamic Republic. Labor strikes and hijab campaigns and occasional skirmishes with the security forces are useful. But they can’t answer the question: “Who do you propose should rule us?”
Perhaps the opposition forces will conjure a leader at the right moment and in organic fashion. Or maybe an ambitious would-be shah will emerge from within the security apparatus. Yet the most plausible current candidate is probably Reza Pahlavi, Reza Shah’s exiled grandson, whose prestige and popularity have spiked in recent years, as Iranians born after the revolution reckon with what they lost to their parents’ collective folly. Among the revolutionary slogans in circulation today, the one with the greatest political meaning and potential is “Long live Reza Shah!” The slogan is pregnant with nostalgia, yes, but also with political imagination.
Second, Iranian political culture demands a source of continuity with Persian history. The anxieties associated with modernity and centuries of historical discontinuity drove Iranians into the arms of Khomeini and his bearded minions, who promised a connection to Shiite tradition. Khomeinism turned out to be a bloody failure, but there is scant reason to imagine the thirst for continuity has been quenched. To weather the storms of modernity, Iranians need a point of orientation—perhaps a mast to tie themselves to. Islamism wasn’t it. Iranian nationalism, however, could be the answer, and, judging by the nationalist tone of the current upheaval, it is the one the people have already hit upon.
When protestors chant “We Will Die to Get Iran Back,” “Not Gaza, Not Lebanon, My Life Only for Iran,” and “Let Syria Be, Do Something for Me,” they are expressing a positive vision of Iranian nationhood: No longer do they wish to pay the price for the regime’s Shiite hegemonic ambitions. Iranian blood should be spilled for Iran, not Gaza, which for most Iranians is little more than a geographic abstraction. It is precisely its nationalist dimension that makes the current revolt the most potent the mullahs have yet faced. Nationalism, after all, is a much stronger force than liberal-democratic aspiration, and the longing for historical continuity runs much deeper in Iran. Westerners who wish to see a replay of Central and Eastern Europe in 1989 in today’s Iran will find the lessons of Iranian history hard and distasteful, but Iranians and their friends who wish to see past the Islamic Republic must pay heed.