That parents are the premier influence in a child’s life is one of those pieces of conventional wisdom that seem to have behind them both the incontrovertible weight of experience and the accumulated evidence of a vast literature. Everybody knows that decent, conscientious parents who are attentive and loving to their children will almost always have children who are decent and conscientious, attentive and loving in their turn. Conversely, troubled—or, as we now say, “dysfunctional”—parents tend to have kids who are troubled themselves.
Of course these are human rules, which means there are always exceptions; but the exceptions are hardly such as to shake anyone’s faith in the validity of the rules. Besides, whole libraries are stocked with books, both popular and scholarly, dwelling on the importance of what transpires in children’s early years and attesting, on the basis of mountains of research, to the link between parental practices and children’s performance—cognitive, psychological, social, and more.
Given all this, it is easy to understand why Judith Rich Harris’s The Nurture Assumption: Why Children Turn Out the Way They Do1 should have become the center of a storm of controversy since its publication last August. For here is a book that takes on this conventional wisdom and challenges it root and branch. Hailed by some, and already pressed into the service of dubious causes, it has also been roundly condemned and denounced by many others. I am going to argue for an alternative judgment—that The Nurture Assumption is, for reasons that have not been recognized, a stunning if also crucially limited achievement. But let me start by taking note of the reception the book has garnered so far.
There is, first, the question of the author’s credentials. Judith Rich Harris is not only a nonacademic but a failed academic, having been dismissed from Harvard’s graduate program 30 years ago on the grounds that she was unlikely to produce original research. In the decades after leaving Harvard, she had, as she herself puts it, “no mentors and no students,” was bound to her home by a lingering illness, and maintained her link to the professional world as a writer of textbooks in developmental psychology. “She is what journalists call a ‘rewrite man,’ ” John Leo wrote in a highly critical piece on her book in U.S. News & World Report, “an unknown with no credentials.”2
The response from a number of leading psychologists has been exceptionally harsh. According to Newsweek, which ran a cover story on the book, Harvard’s Jerome Kagan is “embarrassed for psychology” by The Nurture Assumption, and T. Berry Brazelton finds its thesis “absurd.” “My first reaction,” said Urie Bronfenbrenner, professor emeritus at Cornell, “is amazement that this book should be taken seriously.” Quite a few psychologists have damned Harris not only for her lack of credentials but for being derivative to boot. In the New York Review of Books, Howard Gardner, the theorist of multiple intelligences, called her book “overstated, misleading, and potentially harmful.”
True, not all professionals have stacked up against Harris. Some, including the evolutionary psychologist Steven Pinker, who wrote the book’s foreword, have sounded strong notes of support, and psychologists in the area of behavioral genetics have also rallied around. Still others have backed her on the broad grounds that at least she offers a corrective to prevailing orthodoxies.
Support for Harris has also been forthcoming from those enamored for reasons of their own with the idea that parents may be less important than we think. Positive comments have rolled in from Betty Holcomb, the author of Not Guilty! The Good News about Working Mothers, and from Carol Tavris, who in a review of the book in the New York Times called it a necessary counter to “politicians elected on the claims that day care, divorce, and working mothers are bad for children.” Newsweek, in its cover piece, took note of how welcome The Nurture Assumption will be to some for questioning the link between divorce and “problem behavior like drug use and drinking.”
The task of sorting out the noise about The Nurture Assumption from what it actually says is made somewhat harder by Harris’s own penchant for overstatement. To read her book is to recognize immediately the discrepancy between what Robert Wright, in Time, called her “too-breezy formulations of her thesis” and the “key qualifications” she enters in the course of her discussion. Nevertheless, when the various obstacles are cleared away—the hype, the professional furor, the errors of omission and commission by author and critics alike—what remains is an extraordinarily ambitious attempt to reexamine, from the ground up, an entire century’s worth of findings on the forces that mold the child of today into the adult of tomorrow. For the purpose of assessing that effort, it is, or ought to be, a matter of indifference whether Harris is derivative or original, credentialed or a rank amateur.
The Nurture Assumption, writes Harris at the outset, has two purposes:
[F]irst, to dissuade you of the notion that a child’s personality—what used to be called “character”—is shaped or modified by the child’s parents; and second, to give you an alternative view of how the child’s personality is shaped.
This, then, is a mission in two parts—one destructive, the other reconstructive. Let us consider them in turn.
The idea that parents have the power to shape a child’s personality for life does not appear as a guiding precept across peoples and across time. It is, as Harris writes, culturally specific. In societies other than our own, a child’s life has been deemed to lie in the hands of fate, just as, in societies driven by religious belief, the course of an individual’s life has been consigned primarily to God. It is mainly in Western secular culture, and particularly in the past century or so, that a different idea has come to hold sway.
The “true father” of that different idea, Harris writes, is Sigmund Freud. It was he who “constructed . . . an elaborate scenario in which all the psychological ills of adults could be traced back to things that happened to them when they were quite young and in which their parents were heavily implicated.” But Freud’s insight resonated far beyond the schools of Freudian thought itself. Behaviorists, too, hypothesized that a child is molded by whatever happens in the early years; with the right conditioning, John B. Watson famously declared, he could turn any “dozen healthy infants” into “any type of specialist I might select—doctor, lawyer, artist, merchant-chief, and yes, even beggarman and thief.” Like Freud, behaviorists simply assumed that the first forces shaping the human dough would leave a permanent imprint.
Yet, as Harris observes—and here is where the argument begins in earnest—few of us today would believe that Watson’s experiment could actually succeed. One reason is the powerful evidence we now possess of the effects of an individual’s genetic code on his life. Although debates rage about the relative importance of heredity, hardly anyone denies that it plays some significant role in who we are. From the findings of the Human Genome Project to the work of behavioral geneticists on adoptive versus biological children, and on twins reared apart, it is clear that certain things, including at least some personality traits, are inborn—that something has left imprints on the human dough before any human hands have come to touch it.
What, then, of the nongenetic portion of our personalities—the part shaped by environment? Most of us would say that the most important such influence is home life, and above all our parents. That, in a nutshell, is the “nurture assumption”—and it is also where we confront the radical heart of Harris’s argument. For it is her contention that the “tremendous amount of research” committed to supporting this assumption “is not what it appears to be: it does not prove what it appears to prove.”
In considering Harris’s refutation of the nurture assumption, it is essential to bear in mind the word prove. She does not dispute that parents “who do a good job of managing their lives, and who get along well with others, tend to have children who are also good at managing their lives and getting along with others.” Nor does she dispute that children “who are treated with affection and respect tend to do better at managing their lives and their personal relationships than children who are treated harshly.” These generalizations, Harris readily concedes, not only comport with our ordinary experience; they have been verified many times over by social-science research. What she disputes, rather, is that this research has established a causal link—that “the children of pleasant and competent people grow up to be pleasant and competent people because of what they learned at home and how they were treated by their parents.”
To demonstrate her point, Harris considers a wide range of specific cases. Although it is impossible to replicate her many-faceted critique in an essay, a couple of representative examples may convey both its gist and its flavor.
We can begin with one particular form of parent-child interaction: corporal punishment, or spanking. Does spanking make children more aggressive? Virtually all child-care experts would say yes—and so, one is confident, would many parents. In fact, spanking is now not only widely stigmatized but, in some places, actually criminalized. This radical change, in a practice that until recently was quite common, has not come about because spanking is impossible to distinguish from child abuse; nearly anyone can tell the difference between beating a child in order to hurt him and slapping him on the backside for running out into traffic. What has made spanking increasingly unacceptable is the idea that, constituting as it does aggressive parental behavior, it makes children more aggressive in turn.
The logic behind this widespread belief, Harris writes, is at first blush persuasive; and yet, upon inspection, most of the research on the subject turns out to be suspect—indeed, from the point of view of social science, “worthless.”
She points in the first place to methodological problems. For the most part, the research on spanking does not distinguish among styles of child-rearing. Some parents—particularly those in minority ethnic groups and in low-income neighborhoods—tend to spank more than do white parents in middle-class neighborhoods. Children in the former groups can be shown to behave more aggressively and to get into more trouble than children in the latter. But what does that tell us? Not much, especially when it is placed side by side with findings about another group—Asian Americans—who do “use physical punishment but . . . don’t have aggressive kids” (emphasis added). There are simply too many variables at play for any meaningful conclusion to be drawn from these data.
And that is not all. In most studies, spanking is a one-way street, something parents do to children. What this overlooks, in Harris’s judgment, is that the relationship between parent and child is not static but dynamic:
Within any ethnic group or social class, some kids are more aggressive than others and some get spanked more than others. If aggressive kids get spanked more, is the kids’ aggressiveness caused by the spankings or are the parents doing a lot of spanking because they don’t like the way the kids are behaving?
In most cases, the answer is “impossible to tell.”
A way to minimize this problem, Harris suggests, would be to follow children over a number of years. One recent study did just that, concluding that “when parents use corporal punishment to reduce antisocial behavior, the long-term effect . . . tends to be the opposite”—that is, antisocial behavior increases. This result made headline news, for it appeared to verify that spanking does indeed have long-term negative consequences. Overlooked, however, was the fact that another study, appearing in the same issue of the same professional journal, reached the opposite conclusion. Indeed, this latter study indicated that for “black children of any age, and for the younger children in the study regardless of race, . . . spanking actually led to a decrease in aggressive behavior.”
What are we to make of this? Harris zeroes in on a key difference in the two studies: the authors of the first were measuring aggression at home, while the authors of the second were measuring aggression outside the home. Putting both studies together suggests that, while spanking may become a vicious cycle that produces greater troublesomeness at home, “being spanked at home does not make kids more aggressive when they’re not at home.” And what that suggests is just how shaky is the claim that spanking induces aggressiveness as a personality trait, let alone that reducing spanking will, as the first study asserted, “reduce the level of violence in American society.”
This homely example points to a number of themes that emerge throughout Harris’s critique of the literature. Much of it, in her reading, fails to take intrinsic qualities of children into account; fails to disentangle the effects of parents on children from the effects of children on parents; and customarily overlooks the difference between behavior in the home and behavior in the outside world—the world in which, as Harris observes, “children will spend the rest of their lives.”
To take another, inevitably controversial, example: does growing up in a broken family leave permanent scars on a child’s personality? It is incontestable, Harris writes, that “children are generally happier if they have two parents,” and “if they have evidence that both parents care about them and think well of them.” But the happiness of children is not at issue in the national debate over divorce and fatherlessness in America. That debate is centered on something else: “Do children with fathers turn out better in the long run than children without fathers? And if they do turn out better, is it because they had a father?” To the many researchers, pundits, and laymen who think the case for answering yes has been conclusively established, Harris again offers a series of counterarguments.
She begins with Judith Wallerstein’s landmark 1980 study, Surviving the Breakup—a book that, to quote the subtitle of its most recent edition, “revolutionized America’s thinking” by showing the detrimental effects of divorce on children and adolescents. Without disputing Wallerstein’s evidence or interpretations, Harris points out that her science is fatally flawed by the absence of any control group against which to measure her findings—that is, of any non-divorcing families “with which to compare the children of her patients.”
But Harris also finds fault with researchers who escape her charge against Wallerstein, in particular the sociologists Sara McLanahan and Gary Sandefur (whose 1994 book, Growing Up with a Single Parent, figured prominently in Barbara Dafoe Whitehead’s explosive article in the Atlantic, “Dan Quayle Was Right”). McLanahan and Sandefur’s conclusion, one of the most influential social-science theses of the decade, is that:
Children who grow up in a household with only one biological parent are worse off, on average, than children who grow up in a household with both of their biological parents, regardless of the parents’ race or educational background, regardless of whether the parents are married when the child is born, and regardless of whether the resident parent remarries.
By “worse off,” the authors mean that the children are more likely to drop out of school; that they are less likely to get jobs; that the girls are more likely to become unwed mothers; and so forth.
“Clearly,” Harris writes, McLanahan and Sandefur “believe that the parents’ living apart is the cause of the kids’ problems.” But is it? Could not those problems be explained “without reference to the children’s experience in the home,” but by external factors associated with single-parent families?
The first such factor is income. Divorce typically “leads to a drastic decline in a family’s standard of living,” and this makes it harder to afford things that can affect children’s social standing among their peers. It can also make it harder for them to go to college, leaving them “less motivated to graduate from high school and to avoid getting pregnant.” Most important, the loss of income affects “the neighborhood they grow up in and the school they attend.” Middle-class neighborhoods and good middle-class schools, the kind where “almost all the kids graduate . . . and hardly any have babies,” are simply out of reach for most single mothers.
Another factor that could help account for the bad outcomes, Harris suggests, is moving. With or without an intact family, moving is “rough on kids.” Children of multiple moves are “more likely to be rejected by their peers” and to have “more behavioral problems and more academic problems than kids who have stayed put.” When combined with loss of income, she concludes, changes in residence “could account for most of the differences between kids with dads and kids without them.”
What about the well-established finding that children of divorced parents are themselves more likely to divorce when they are adults? We have here a clear-cut test case of the nurture assumption, one in which parental behavior seems to exert a psychological influence well into adult life. Harris summarizes a study of 1,500 pairs of adult identical and fraternal twins that appears to confirm that thesis. “The divorce rate,” she reports, “was 19 percent among the twins whose parents had remained married” but “considerably higher”—29 percent—for twins whose parents had divorced. And yet, an analysis of these results, she writes, shows that
about half the variation in the risk of divorce could be attributed to genetic influences—to genes shared with twins or parents. The other half was due to environmental causes. But none of the variation could be blamed on the home the twins grew up in.
In citing heredity, Harris does not mean there is such a thing as a “divorce gene.” Rather, she has in mind “an assortment of personality characteristics, each rough-hewn by a complex of genes and shaped and sanded by the environment,” that increases the chances of an unwise choice or of difficulty in getting along with people—traits like impulsivity, for example, or aggressiveness, or a tendency to be easily bored. Genes do not cause people to divorce, but we cannot rule out the possibility that something genetic is going on in parent-child divorce statistics—any more than we can rule it out in other cases where parents with problems have children with problems, including what experts call “the intergenerational transmission of child abuse.”
Nor does Harris mean that parents need not worry about the effects of divorce. As I have already suggested, she is careful to stipulate the many ways in which it can be bad for children. But, she insists, the fact of divorce itself must be distinguished from the kinds of fallout—loss of income, social disruption, etc.—that are its frequent but not inevitable accompaniments. When it is so distinguished, we find that divorce, in and of itself, “has no lasting effects on the way children behave when they’re not home and no lasting effects on their personalities” (emphasis added).
Let me summarize thus far. “Socialization research,” Harris writes, “has demonstrated one thing clearly and irrefutably: a parent’s behavior toward a child affects how the child behaves in the presence of the parent or in contexts associated with the parent”—but not otherwise. A child who has assumed the role of spoiled youngest child in the home may act according to stereotype for as long as his parents live and whenever he is with them; but there is no evidence that he carries the burdens or benefits of this role outside.3 Similarly, a child who has been ill-treated at home may always hold that fact against his parents and act surly or withdrawn when he is around them; but this poisonous relationship need not translate into his relations with people outside. The content of what children learn at home varies, but whatever that content is, “They may cast it off when they step outside as easily as the dorky sweater their mother made them wear.”
What The Nurture Assumption means to challenge, in short, is the fundamental idea that the parent-child relationship is the primary influence on other human relationships to come—the “template,” in the phrase of the British psychologist John Bowlby. This idea is a bedrock principle for many, perhaps most, clinical psychologists, and the foundation of most popular treatments of human psychology and development, from Penelope Leach’s classic texts on child care to the likes of Susan Forward’s Toxic Parents. If it is wrong—if the nurture assumption is not in fact the template we think it is, and cannot explain why children turn out the way they do—what, if anything, can?
Harris (and here is the reconstructive aspect of her book) thinks she has an answer to this question, in the form of what she calls “group-socialization” theory. According to this theory, socialization is not something that parents “do” to children; it is something that children do to themselves, out of the innate human drive to “learn how to behave like other people in their own social category.” And they do it not by imitating and observing their parents but by observing and imitating their peers. Indeed, so intent are children on behaving like the other members of their group that in cases where a conflict arises between their home life and their peer life, the peer group will “trump” the home nearly every time.
Harris’s theory is inspired first of all by studies in linguistics, in particular the provocative finding that peers, not parents, appear to be the decisive influence on how an immigrant child speaks his new language. Just as the nurture assumption would not have predicted this fact, she writes, so too are other puzzles in development better explained by a group theory like hers. Harris’s evidence for this thesis is piecemeal and circumstantial rather than systematic and dispositive, and hence less dramatic than her attack on socialization research. But it will nonetheless cause a frisson of recognition in many a parent.
Consider a small but suggestive example: the problem of the picky eater. At dinner, the parents of this child consume, say, spinach and salad and fish; the child declines them all. The more his parents cajole and importune, the more rigidly he refuses. Day after day, in many homes with young children, mealtime is wartime for just this reason. All the parental “modeling” in the world cannot induce a four-year-old to eat what his parents eat.
Yet eventually he does eat all these things and more. What causes the change? Very often it is that his friend in preschool brought salad for lunch, or his schoolmates in the cafeteria ate fish there. How many parents have been amazed to learn that their child has happily devoured lettuce, or veal, or mushrooms at a friend’s house after years of belligerently spurning them at home?
Or consider a form of behavior most people do not want their children to emulate: smoking. It is widely assumed that whether or not a teenager takes up smoking is heavily influenced by whether his parents smoke; indeed, writes Harris, “that is why most parents who smoke feel guilty about it, and many quit for that reason.” But in fact the adult model is not the most influential; rather, “the best predictor of whether a teenager will become a smoker is whether her friends smoke.” Then how explain the correlation between parents and teenagers who smoke? Here again, Harris argues, we cannot dismiss the role of heredity; the potential for addiction to nicotine is at least partly genetic. In other words, “exposure to peers who smoke is what determines whether or not a teenager will experiment with tobacco. Her genes determine whether or not she will get hooked.”
Or consider, finally, criminality. In general, writes Harris, behavioral studies indicate that a home environment shared by twins or siblings has little effect on how the siblings turn out relative to one another. But criminality is an exception: the chance that a given child will become a criminal is higher if he has a criminal sibling. To most observers, this result is de facto evidence for the nurture assumption: the “match in criminality” between siblings or twins must have something to do with the influence of the home.
To Harris, however, the correlations suggest something different. When the data are examined more closely, it turns out that “the likelihood that two siblings will match in criminality is higher if they are the same sex and closer together in age. It is higher in twins (even if they are not identical) than in ordinary siblings, and higher in twins who spend a lot of time together outside the home than in those who lead separate lives.” Conversely, siblings are less likely to match in criminality if one has several years on the other—even though the home environment remains the same. The missing link, Harris suggests, would appear to be that “kids from the same home also share a neighborhood and, in some cases, a peer group.”
In all these examples, is Harris doing more than belaboring the obvious? After all, no one would seriously deny that children are influenced by forces outside the home—that is why parents worry so much about everything from what is on the Internet to the fact that their daughter’s best friend wears the shortest skirts in school. Harris, however, is contending not simply that peers are a competing force in a child’s development, but that in some or perhaps even in most cases, they are the primary such influence.
Of course, most children do, in part for genetic reasons, tend to resemble their parents in many ways. And of course, all kinds of things are learned at home—language, manners, cooking, playing the piano, what it is like to be a doctor or truck driver, and so on ad infinitum. These in turn become part of the knowledge that a child takes with him to the outside world. In cases where what is learned at home does not conflict with the behaviors and attitudes of the child’s peer group, Harris acknowledges, these influences can last a lifetime.
But there’s the rub. The one real power parents have to determine “the course of their child’s life,” in Harris’s view, is in deciding which neighborhood to live in and thus, indirectly, which sorts of peers he will associate with. In the early years, she notes, “parents have almost complete control” over this issue. But by the time children reach the age of ten or so, “all bets are off.” The most they can hope for is that those early relationships will have translated into good peers rather than bad ones, and that the school and neighborhood they have ended up in will minimize any conflict with what has been learned at home.
What Harris is saying, it is important to stress, goes against the grain of almost all contemporary literature on child-rearing. No mother diligently doing everything the advice books recommend in order to boost her child’s self-esteem will want to hear that her efforts are likely to have few if any lasting effects on his personality—and certainly fewer effects than such status-enhancing benefits as making sure he has proper orthodonture or is able to join sports teams with his friends. In a culture that not too long ago gave us the phrase “designer children,” the notion that parents have limited influence on their offspring does indeed seem positively perverse.
That, nevertheless, is Harris’s message, and (just to complete the perversity) she means it to apply in more than one direction. Just as “the idea that we can make our children turn out any way we want is an illusion,” no less illusory is the idea that our parents are the primary begetters of our own problems as adults. “As for what’s wrong with you,” she concludes in her parting words to the reader, “don’t blame it on your parents.”
How well has Judith Rich Harris succeeded in persuading us that what she says is true?
The answer depends, in large part, on the standard of proof to which the social sciences, and psychology in particular, can reasonably be held. Harris’s attack on the nurture assumption—the destructive half of her argument—consists, as we have seen, of a case-by-case critique of research that purports to show a connection between parental and child behavior. In some of these cases, she is able to demonstrate, the results suffer from assorted methodological limitations, such as the absence of a control group, or the failure to account for other factors like observer bias, parent-to-child effects, and the role played by heredity. Most of what Harris writes in this vein is not only illuminating but thoroughly persuasive.
In the case of other, better, studies, ones that do show trustworthy correlations between parental and child behavior, Harris resorts to a different tack: correlations, she writes, are “inherently ambiguous,” and in any event do not prove causality. This is a powerful polemical weapon, one that has formed part of the arsenal of skeptics at least since the 18th-century Scottish philosopher David Hume invoked it against the metaphysics of his own time. Hume’s best-known example was the humble fact that the sun rises and sets every day. Most people, he argued, draw an incorrect inference from this fact—namely, that the sun will rise tomorrow because it has risen each previous day. But an unfailing pattern does not, according to Hume, prove causality; it merely shows the “constant conjunction” of events.
Hume’s insight did more than remind philosophers that they ought to exercise care in their assertions; it sparked a lasting revolution in Western thought. It is because of Hume that much of modern philosophy evolved the way it did, from the British empiricist tradition to Continental opposition to that tradition in all its forms. Most famously, Immanuel Kant went on to formulate his own answer to Humean skepticism in the Critique of Pure Reason—a book that, as Kant put it in a celebrated phrase, owed its very existence to the fact that Hume’s insight had roused him from his “dogmatic slumber.”
It is fair to say that Harris hopes her own arguments, and others like them, will have a similarly lasting effect on the way psychology is done in the future—freeing us both from the clutches of Freud and from the limits of behaviorism, and simultaneously opening us to a new realism in which other facts about human development, particularly the role of heredity and the importance of peer groups, are given their proper due. But the success of this ambitious hope depends critically on the answer to one question: whether it is reasonable, in the case of psychology, to demand the epistemological standard of proof that Harris insists upon.
Are all those mountains of studies that socialization researchers have amassed truly “useless,” for the reason that correlation does not prove causality? I think not, on the simple grounds that some correlations are more meaningful than others. By meaningful I do not have in mind statistical significance, a technical concept, but rather what any reasonable man or woman would find meaningful.
Driving home her point about the limited utility of correlations, Harris produces an example by analogy. Suppose that broccoli-eating and wealth could be shown to go together consistently; would we infer from this fact that eating broccoli is what causes wealth, and that if we stop eating broccoli we will get poorer? In her view, the correlations used to support the nurture assumption are no more reliable than these. But the analogy is a bad one. The reason we would never infer that eating broccoli causes wealth is that we already know from independent evidence that the idea is absurd. Correlations among important social phenomena are something else entirely.
David Blankenhorn’s book, Fatherless America (1995), is a good case in point, though unfortunately Harris does not discuss it. A careful writer, Blankenhorn acknowledges at the outset the “difficulty of proving causation in the social sciences.” Nevertheless, he goes on to argue, “the weight of evidence” does suggest that fatherlessness lies at the root of a number of undesirable behaviors. In Blankenhorn’s words, “children living apart from their fathers are far more likely than other children to be expelled or suspended from school, to display emotional and behavioral problems, to have difficulty getting along with their peers, and to get in trouble with police.” Citing the work of Douglas J. Besharov, he also calls attention to the fact that “the spreading risk of childhood sexual abuse is directly linked to the decline of married fatherhood.”
The correlations here speak for themselves. If you show me, with all the proper controls for income, race, and the rest, a group of fatherless households, I will show you homes that are, on average, at higher risk for criminal behavior, child abuse, trouble in school, and all the rest. It is undoubtedly true, as Harris would insist, that other factors are affecting this outcome, including genetically influenced behavior on the part of the child. But we hardly need to establish causality beyond a metaphysical doubt to see that if fatherlessness is a strong predictor of certain sorts of behavior, then fatherlessness is something we should be looking at if we want to reduce those behaviors.
For the purposes of metaphysics, David Hume’s observation was indeed profound. But would that have stopped even him from wagering that the sun would rise tomorrow? The fact that we can make similar wagers about properly derived social-science findings tells us that we do indeed know something meaningful after all. We can rule out the inference that broccoli-eating causes wealth; but we cannot similarly rule out the idea that fatherlessness is the root cause of a number of aberrant behaviors—indeed, the evidence fairly screams for just such a causal inference.
The second problem with the standard of proof Harris demands is that she applies it inconsistently. Socialization researchers are subjected by her to the highest scrutiny, often to impressive effect; behavioral geneticists are not. Although I have not read the famous twin studies that Harris frequently cites, or other studies of this kind that she marshals for her argument, it is abundantly clear that she does not train her analytic sights on them with anything like the intensity she devotes to their counterparts in socialization research. How, precisely, do behavioral geneticists define “personality traits”? Just how do they decide that certain individuals exhibit certain traits? Are we really to take the findings of these studies on faith, as if they were free of the embedded assumptions, overinterpretations, and other failings that Harris so adroitly identifies elsewhere?
Similarly, Harris appears to have swallowed whole the maxims of evolutionary psychology—that is, the idea that all human characteristics and abilities are the products of genetic inheritance as modified by Darwinian adaptation—without so much as a critical gulp. Though she is careful in some places to specify that she is drawing on this field merely to speculate, she seems not in the least bothered by the trenchant criticism to which it has been subjected from various quarters. It is deeply ironic that an argument insisting on a scientific standard in psychological research should cut such slack for a doctrine that is widely judged to be wholly unverifiable.
Harris’s double standard—strict for data conflicting with her argument, lenient for data comporting with it—leaves The Nurture Assumption open to attack on one especially crucial point. In the opening sentence of her book (quoted earlier), the nurture assumption itself is defined as “the notion that a child’s personality—what used to be called ‘character’—is shaped or modified by the child’s parents.” No explanation is offered, here or elsewhere, for Harris’s conflation of the terms “personality” and “character,” and “character” does not even appear in her index. Plainly, however, she does not believe that character, in the sense of a continuing moral self that endures from context to context, is a legitimate term in any discussion of human behavior. Rather, what she believes, as she puts it in a crucial passage, is that “morality, like any other form of social behavior, is tied to the context in which it is acquired.”
This issue, as I say, is a crucial one, involving the extent to which children carry the moral standards taught at home into their relationships outside, and whether, in a conflict between the two, the morality of the home will or will not trump the norms of the peer group. As we have seen in other contexts, Harris obviously thinks it will not.
Unfortunately, the sole evidence cited for this point, which is made several times in the book, is a 1928 study by H. Hartshorne and M. A. May called Studies in Deceit. In that study, by two researchers whom Harris describes as “ahead-of-their-time developmentalists,” “children were given opportunities to cheat or steal in a variety of settings: at home, in the classroom, in athletic contests; alone or in the presence of peers.” The ultimate finding, in Harris’s paraphrase, was that “children who were honest in one context were not necessarily honest in others.”
What the author of The Nurture Assumption does not tell us, though, is that (as Robert Wright has pointed out) a battle has raged over the Hartshorne-May study for decades. For those who believe that character does not exist, the study has come to have iconic status. Others disagree. This is precisely the sort of social-scientific debate that Harris invokes time and again in other contexts to cast doubt on the nurture assumption. Yet in the case of Studies in Deceit, she is mute.
Why? The reason, one suspects, is that taking into account the dissent over Hartshorne-May would damage her insistence that in contests between parents and peers, peers will always win. Do children learn their moral standards at home and apply them even when they conflict with those of their peers? Or do they not? One cannot resolve this question simply by implying that just as children acquire their accents or styles of dress more from their peers than from their parents, so they acquire other forms of behavior. (Indeed, this line of argument contains a causal fallacy of its own.) The question of what happens when a particular set of moral standards runs up against the beliefs of the peer group is a fascinating one, but it is a question that Harris ducks. That is no doubt why religion—which is at least one of the things we talk about when we talk about morality—makes only four, very brief, appearances in this heavily researched book.
What Harris would have us believe is that religion (in her analogy) is like cooking—it is something that “parents still have some power to give their kids” because it is for the most part confined to the home. This will not do. The phrase “resisting peer pressure” has entered the vernacular precisely because, contra Harris, children, and teenagers especially, are expected to carry with them their in-home moral standards—standards often based on religious principles. This matter is far too weighty, and its implications far too wide, to rest on the reed of a single study done 70 years ago and still in dispute, and Harris’s evasiveness on it is a telling lapse.
Still, the fact that The Nurture Assumption falls short of shaking all of psychology to its foundations should not distract us from what Harris has accomplished. In her dogged questioning of an impressive number of studies, both professional and popular, she has put us in her debt. Some scholars may not stand in need of her remedial criticism, but others do. And as for the rest of her readers, especially those who come to her arguments with opinions of their own, they will surely be forced to think harder, and will in all likelihood come away feeling that they know both more and less than they thought they did. This is all to the good.
A number of Harris’s critics fear that The Nurture Assumption will have the effect of discrediting not just discrete cases of shoddy research but the important work now being done on our most urgent social problems, first and foremost the ongoing disintegration of family life. Those critics have a point. But that is all the more reason for absorbing the lesson of Harris’s book and evaluating with care what researchers tell us, perhaps especially when it appears to support our own views. Bad science in the service of a good cause is no service at all.
It is also feared that The Nurture Assumption is a book that will let parents off the hook, a license for self-indulgence issued to a society where adult self-indulgence already reigns—a society, as John Leo put it, “hip-deep in evidence of the pain and loss of underparented and unparented children.” Here, too, the critics have a point. One does not have to invoke the prepubescent predators of the country’s worst neighborhoods to see what Leo means: underparented and unparented children are scattered everywhere, from the barely supervised bands who crowd “after-care” programs in schools across the country to the shopping malls and fast-food joints where older children pass the time unattended because they are allowed to and because no one is waiting for them at home. And then there are the children in the better-off neighborhoods, children who can have anything they want—anything, that is, but a parent who will appear in the house before dark. In a world like this, again in Leo’s words, it is “no time to celebrate a foolish book that justifies self-absorption and makes nonparenting a respectable, mainstream activity.”
Yet there is, in truth, little in Harris’s pages to justify this charge against her, and even less to give aid and comfort to anyone who would embrace such a low view of parenthood. To the contrary, on a number of contentious issues Harris’s argument implies a position positively subversive of “nonparenting.”
Consider day care. Harris devotes only two paragraphs to the subject, plainly out of a desire to avoid its coils, but those two paragraphs are instructive. In the past, she writes, “when only families with problems put their kids into day-care centers, institutional care was thought to be bad for children”; today, by contrast, when day-care centers are widely used by “normal” families, “it no longer seems to matter whether babies or preschoolers spend most of their daylight hours there or at home.” What Harris means is that there is no evidence (as yet) that day care affects long-term personality development. But this is not at all the same thing as suggesting that it is all right to leave infants and small children in outside settings apart from their families for most of their waking hours.
In fact, evidence throughout The Nurture Assumption leads to quite the opposite conclusion. Recall what Harris has to say about the early years—that it is then and only then that parents can exercise “the one power” that “nearly all have”: “they can determine who their children’s peers are.” It is precisely in those years that parental influence over children is, according to her own argument, at its peak.
Are parents who rely on day care exercising their power to choose peers for their children? Do they socialize with the parents of other children in the group, share their norms and aspirations, even know what is going on in their homes? One does not need to flog day care itself. Where I live, many parents hire nannies or other in-home help; most of the children being cared for in this way have peers chosen not by the parents but by the babysitters (typically, they are the children being babysat by the sitters’ friends). The point is not that the choices are bad; it is that the sitters, and not the parents, are doing the choosing. Many parents have not the faintest idea whom their child has played with all day, and they know even less about what is going on in the playmate’s home. If Harris is even partly right about the importance of peer groups, then parents who are outside the home during those preschool years need to think twice about what they are doing.
There is much more in The Nurture Assumption to give pause to anyone tempted to embrace it for the wrong reasons, from Harris’s baleful look at bilingual education—“a dismal failure,” she writes, for reasons that group-socialization theory would have predicted—to her no less stringent critique of multiculturalism, a program which, by emphasizing the different ethnic and subcultural groups that children are already well aware of, undermines the task of education, which is to “unite students by giving them a common goal.” Nor does she shrink from offering policy suggestions of her own, from school uniforms to a radical reform of the foster-care system.
In the end, however, it is not the practical implications of Harris’s argument, intriguing though these may be, for which her book deserves to be read. Nor is it yet certain that The Nurture Assumption will end up representing a “turning point for psychology,” as Steven Pinker predicts; in any case, that is for the psychologists to fight out. The rest of us, particularly the parents among us, are left with something else to ponder.
At its best, The Nurture Assumption presents parents with a unique moral challenge. The parent-child relationship, Harris suggests, is in one crucial respect like the relationship between husband and wife. Does it matter how you treat your spouse? Of course it does. Does it matter because you have the power to mold, shape, and influence your spouse’s behavior? Or does it matter because, quite apart from any such end, marriage exerts its own moral claims? Harris’s answer is phrased in subjective terms: “I don’t expect that the way I act toward my husband is going to determine what kind of person he will be ten or twenty years from now. I do expect, however, that it will affect how happy he is to live with me and whether we will still be good friends in ten or twenty years.”
What if it comes down to this: that the only thing we can control for certain—that, indeed, we have “great power to determine”—is the kind of life our children have at home? If that is so, then we should read to them not because it will help them get into Harvard, but because they like it; we should provide a loving environment not because it will stimulate their social or cognitive development, but because it is the right thing to do; we should try to stay together, at least while they are growing up, not because failing to do so will leave them with permanent psychological scars, but because it matters immediately to their happiness and well-being. And if all that is so, we should enjoy our children not because they exist to reward us, but because they are a joy. Is this so terrifying an idea—or one that lets parents off the hook?
Like many other mothers (and some fathers), I have consumed more than my share of the literature on “parenting” today, from the latest bulletins on jump-starting a toddler’s fine motor control to the findings of the most recent White House conference on the importance of children’s cognitive abilities. Some time ago, though, I realized I no longer cared to read this stuff—not simply because I had grown bored (though that is also true), but because, in its peculiar combination of joylessness and self-congratulation, it had become profoundly irritating. All that prattle about parental self-sacrifice, about sleepless nights, about the end of life as the once-childless knew it; all that preening insistence that what “we” do, from toilet training to bedtime rituals to inculcating the perfect amount of self-esteem, will somehow make or break “them”—so accustomed are we to this literature that we no longer question how end-driven it all is, how drearily focused on results, how utilitarian, how relentlessly, ruthlessly, teleological.
What is, to me, unexpectedly most admirable in Judith Rich Harris’s work is its radical attempt to break free of the teleological—to force us to attend to the bond between parent and child in all its elemental simplicity. To love “according to my bond; no more nor less” may be the weightiest moral challenge most people will ever face. Shakespeare puts this famous phrase, tellingly, not in the mouth of a lover but in the mouth of a child speaking to her father. More even than the bond between husband and wife, the bond between parent and child, as Lear discovered too late, has at its very core this unconditional pledge of loyalty.
If only because it reminds us that we are stewards of our children, and not their Svengalis, The Nurture Assumption has performed a lasting cultural service. It defies comprehension that anyone could draw from this the lesson that how we treat them has thereby become a matter of indifference.
1 Free Press, 462 pp., $26.00.
2 Ironically, The Nurture Assumption grew out of an article Harris published in Psychological Review in 1995, an article that subsequently won a prize named after the same Harvard dean who had presided over her exit from graduate school.
3 One subset of Harris’s argument against the nurture assumption is her critique of 50 years of research into birth-order effects, the evidence for which, she contends at length, “has been knocked down again and again.”