The received wisdom about multicultural America goes something like this: “At the time of the Founding, America’s free population was not only white but almost entirely British, and the nation’s culture was based on their common heritage. That monocultural domination continued through the 19th and 20th centuries as other white European immigrant groups were assimilated into the Anglo mainstream. In the 21st century, with people of color soon to become a majority of the population, the United States faces unprecedented cultural diversity.”
Here is an alternative view: “America was founded on British political and legal traditions that remain the bedrock of the American system to this day. But even at the time of the Founding, Americans were as culturally diverse as they are today. That diversity was augmented during the 19th and early 20th centuries. Then came an anomalous period from roughly the 1940s through the 1970s during which cultural diversity was dampened in some respects and masked in others. Since the late 20th century, America has returned to its historic norm: obvious, far-reaching cultural diversity that requires room for free expression.”
You may reasonably question whether America at the Founding was truly as culturally diverse as it is now. After all, the free population consisted almost entirely of Protestants whose ancestors were English or Scottish.
And yet it was.
Historian David Hackett Fischer’s magisterial Albion’s Seed describes how the British came to America in four streams. From East Anglia came the Puritans seeking freedom to practice their religion. They settled first in Massachusetts. By the time of the Revolution, they had spread throughout New England and into the eastern part of New York and had become known as Yankees.
From the south of England came the Cavaliers, who had lost out during the English Civil War, accompanied by large numbers of impoverished English who signed contracts to work as their indentured servants. The first wave settled in Virginia’s tidewater, and the second around the Chesapeake Bay. They spread southward through the tidewater regions of the Carolinas and Georgia.
From the North Midlands came the Quakers, who, like the Puritans, were seeking a place to practice their religion unmolested. They settled first in the Delaware Valley and then spread throughout eastern and central Pennsylvania, with some of them drifting southward to Maryland and northern Virginia.
The fourth group came from Scotland and the northern border counties of England. Some of them arrived directly from their ancestral homelands, but the great majority arrived in the New World after an extended stopover in the north of Ireland—hence the label by which we know them, the Scots-Irish. They landed in Philadelphia but quickly made their way west on the Great Wagon Road to settle the Appalachian frontier running from west-central Pennsylvania to northeast Georgia.
The four groups did indeed share a common culture insofar as they had all come from a single nation with a single set of political, legal, and economic institutions. But our topic is cultural diversity as it affects the different ways in which Americans think about what it means to “live life as one sees fit.” That consists of what I will call quotidian culture: the culture of everyday life. In terms of quotidian culture, the four streams shared the English language, barely. They differed on just about everything else, often radically.
Religion was culturally divisive. Anglican Christianity among the Cavaliers retained much of the pomp and ritual of Catholicism, and it permitted a lavish, sensuous lifestyle that Puritanism, the Yankees’ religious heritage, and Quakerism alike forbade. But Puritanism and Quakerism were also very different from each other. The Puritans saw themselves as God’s chosen people—“the saints”—and the religion they practiced was as harsh and demanding as reputation has it, epitomized by the title of the most famous sermon of the 18th century, Jonathan Edwards’s “Sinners in the Hands of an Angry God.” The Sunday service, which might last five or six hours, even in unheated churches in the dead of a New England winter, consisted mostly of lengthy lecture-like sermons and extended teachings of the Word. It also included a ritual of purification, as members who were known to have committed specific sins were compelled to rise and “take shame upon themselves,” which sometimes included crawling before the congregation.
The Quaker First Day meeting was completely different. Whereas the congregation in a Puritan church was seated according to age, sex, and rank, Quakers were supposed to take the seat nearest the front according to the order of arrival. There was no multi-hour lecture or even a preacher. Anyone who was moved by the spirit could speak, including children, but the strictly observed convention was that such interventions lasted only a few minutes. Sometimes nothing would be said for the entire meeting, and those were often thought to be the best—“gathered” meetings during which the spirit of God was felt wordlessly by all. Instead of trying to live blamelessly to avoid the wrath of an angry God, Quakers worshipped a God of love and forgiveness. Sinners in need of forgiveness were “held in the light.”
Meanwhile, the Scots-Irish in the Appalachian backcountry combined passionate enthusiasm for Protestant teachings with equally passionate hostility toward the religious establishment. Thus an Anglican missionary to the region was told by one family that “they wanted no damned black gown sons of bitches among them” and was warned that they might use him as a backlog in their fireplace. Others to whom he intended to minister stole his horse, drank his rum, and made off with his prayer books.
The nature of the family varied across the four streams. Yankee and Quaker families were nuclear. To them, marriage was a covenant that must be observed by both husband and wife, and it could be terminated when one party failed to live up to the bargain. But Yankees and Quakers differed in the roles assigned to each party. Among the Yankees, marriage was a strict hierarchy with the man in charge; among the Quakers, marriage was seen as a “loving agreement,” a partnership between man and wife.
Among the Cavaliers, the father was the absolute head of an extended family that embraced blood relatives, other dependents, and sometimes slaves. Marriage was not a covenant but a union before God, and indissoluble. A Scots-Irish family was a series of concentric rings, beginning with the nuclear family and successively widening to include extended family and American versions of Scottish clans, lacking the formal structure of clans in their ancestral home but consisting of related families with a few surnames who lived near one another and were ready to come to one another’s aid.
Marital and premarital morality varied among the four streams. The criminal laws of Puritan Massachusetts decreed that a man who slept with an unmarried woman could be jailed, whipped, fined, disfranchised, and forced to marry the object of his lust. In cases of adultery, both Yankees and Quakers punished the man as severely as—sometimes more severely than—the woman. Among the Cavaliers, it was just the opposite: Men who slept with women not their wives were seen as doing what comes naturally and were treated leniently. Women were harshly punished for accommodating them.
Once people were married, Puritanism wasn’t all that puritanical. Surviving letters between Yankee spouses commonly expressed their love in ways that leave no doubt about their mutual pleasure in sex. It was the Quakers, more gentle and consensual in many aspects of marriage, who were more likely to see sex as sinful in itself. Many Quaker marriages included long periods of deliberate sexual abstinence.
Yankees, Quakers, and Cavaliers alike looked down on the morals of the Scots-Irish, who practiced an open sexuality that had no counterpart among the other three groups. That persecuted Anglican missionary mentioned earlier was scandalized that, among other things, the young women of the backcountry “draw their shift as tight as possible round their breasts, and slender waists” in a deliberate display of their charms. He calculated that 94 percent of the brides in the marriages he performed in 1767 were already pregnant.
There’s much, much more. People in the four cultures had radically different parenting styles. They ate different foods and had different attitudes toward diet and alcohol. They dressed differently. Their approaches to formal education were different, and so were their opinions and practices regarding recreation, social rank, death, authority, freedom, and good order. Their work ethics and attitudes toward wealth and inequality were different.
That is why I argue that the differences separating Yankees, Quakers, Cavaliers, and Scots-Irish at the Founding were at least as many and as divisive as those that separate different ethnic groups in America today. Ask yourself about the differences in quotidian culture that now separate whites, blacks, Latinos, and Asians. In some respects, the differences are substantial—but seldom greater than the ones that separated the four original streams of Americans.
The cultural variegation of America had only just begun at the time of the Founding. The 19th century saw a series of surges in immigration that brought alien cultures to our shores. In the single decade from 1846 through 1855, 1,288,000 Irish and 976,000 Germans landed on the East Coast. They brought not only the Irish and German cultures with them—both different from all four of the British streams—but also Catholicism, which until then had been rare in the United States.
The Irish who arrived during that surge were not like earlier generations of immigrants, who had been self-selected for risk-taking and optimism. They were fleeing starvation from the potato famine. More than half of them spoke no English when they arrived. Most were illiterate. They did not disperse into the hinterlands but stayed in the big cities of the East. Since those cities weren’t actually that big in the mid-19th century, the Irish soon constituted more than a quarter of the populations of New York, Boston, Philadelphia, Providence, New Haven, Hartford, Jersey City, and Newark. Large urban neighborhoods became exclusively Irish and Catholic—a kind of neighborhood that America had never before experienced.
The Germans were the antithesis of the Irish, typically highly skilled craftsmen or farmers who practiced advanced agriculture. Michael Barone’s description of the culture they brought with them (in his book Our Country) is worth quoting at length:
As soon as they could they built solid stone houses and commercial buildings. They built German Catholic, Lutheran, and Reformed churches, and they maintained German-language instruction in private and public schools for decades. They formed fire and militia companies, coffee circles, and especially singing societies, staging seasonal Sangerfeste (singing festivals). They staged pre-Lenten carnivals, outdoor Volkfeste, and annual German Day celebrations. They formed mutual-benefit fire insurance firms and building societies and set up German-speaking lodges of American associations. Turnvereine (athletic clubs) were established in almost all German communities and national gymnastic competitions became common. German-language newspapers sprung up—newspaper baron Joseph Pulitzer, a German-speaking Hungarian Jew, got his start in one in St. Louis—and German theaters opened in New York, Philadelphia, Cincinnati, Chicago, and Milwaukee. Some German customs came to seem quintessentially American—the Christmas tree, kindergarten, pinochle.
In the 1870s, large numbers of Scandinavian immigrants began to augment the continuing German immigration, and most of them headed straight toward what Barone has called the “Germano-Scandinavian province,” consisting of Wisconsin, Iowa, Minnesota, and the Dakotas, overlapping into parts of Missouri, Kansas, Nebraska, and Montana. The mixed cultures of Scandinavia and Middle Europe in the small towns of those regions persisted long into the 20th century, famously chronicled by Willa Cather in the early 20th century and over the past 40 years by Garrison Keillor.
The Civil War created or intensified several kinds of cultural diversity. First, its conclusion marked the emergence of African-American culture from the shadows. Communities of American blacks in the South were no longer limited to the size of a slaveholder’s labor force but could consist of large neighborhoods in Southern cities or the majorities of populations in rural towns. All the categories of folkways that distinguished the various white cultures from one another also distinguished black American culture from the white ones.
The Civil War also led Southern whites, whether descendants of Cavaliers, indentured servants, or the Scots-Irish of the backcountry who had never owned slaves, to identify themselves as Southerners above all else. In many respects, they walled themselves off from the rest of the country and stayed that way for a century. White Southern culture was not only different from cultures in the rest of the country; it was defiantly different.
In the 1890s, America’s cultural diversity got yet another infusion from Eastern Europe and Mediterranean Europe that amounted to 20 million people by the time restrictive immigration laws were enacted in the 1920s. They came primarily from Italy, the Austro-Hungarian empire, and Russia, with this in common: Almost all of them had been second-class citizens in their homelands. The Italian immigrants came from rural, poor, and largely illiterate southern Italy and Sicily, not from the wealthier and more sophisticated north. Austria-Hungary’s immigrants were overwhelmingly Czechs, Serbs, Poles, Slovaks, Slovenians, and Jews, not ethnic Hungarians or Austrians. The immigrants from the Russian Empire were almost all Poles, Lithuanians, and Ukrainians, not ethnic Russians, with large proportions being Jewish as well. Occupationally, the Ellis Island immigrants had usually been factory laborers, peddlers, and tenant farmers, near the bottom of the economic ladder.
The size of these immigrant groups led to huge urban enclaves. In New York City alone, the Italian-born population at the beginning of the 20th century was larger than the combined populations of Florence, Venice, and Genoa, mostly packed into the Lower East Side. A few blocks to their west were 540,000 Jews, far more than lived in any other city in the world. To enter either of those neighborhoods was to be in a world that bore little resemblance to America anywhere else. And that doesn’t count New York’s older communities of Irish, Germans, and African Americans.
Over the next 60 years, events combined to both dampen and mask cultural diversity. First, World War I triggered an anti-German reaction that all but destroyed the distinctive German culture. In the 1920s, new immigration laws choked off almost all immigration from everywhere except Britain and northern Europe, and even that was reduced. With each passing year, more children of immigrants married native-born Americans, and fewer grandchildren of immigrants grew up to carry on the distinctive features of their Old World culture. By the middle of the century, the percentages of Americans who were immigrants or even the children of immigrants were at all-time lows. Most of the once vibrant ethnic communities of the great cities had faded to shadows. No longer could you find yourself in an American street scene indistinguishable from one in Palermo or the Warsaw ghetto.
Among native-born Americans, our long-standing tradition of picking up and moving continued, with surges of the population to Florida and the West Coast. Then came World War II. Almost 18 million out of a population of 131 million put on uniforms and were thrown together with Americans from other geographic, socioeconomic, and ethnic backgrounds. The economic effects of war production also prompted a wave of African-American migration from the South to the North and West.
These demographic changes occurred in the context of the culturally homogenizing effects of mass media. Movies were ubiquitous by the beginning of World War I, and most American homes had a radio by the end of the 1920s. These new mass media introduced a nationally shared popular culture, and one to which almost all Americans were exposed. Given a list of the top movie stars, the top singers, and the top radio personalities, just about everybody younger than 60 would not only have recognized all their names but have been familiar with them and their work.
After the war, television spread the national popular culture even more pervasively. Television viewers had only a few channels to choose from, so everyone’s television-viewing overlapped with everyone else’s. Even if you didn’t watch, you were part of it—last night’s episode of I Love Lucy was a major source of conversation around the water cooler.
In these and many other ways, the cultural variations that had been so prominent at the time of World War I were less obvious by the time the 1960s rolled around. A few cities remained culturally distinct, and the different regions continued to have some different folkways, but only the South stood out as a part of the country that marched to a different drummer, and the foundation of that distinctiveness, the South’s version of racial segregation, had been cracked by the Civil Rights Act of 1964. In December 1964, Lyndon Johnson invoked his mentor Sam Rayburn’s dream, expressed in 1913, of an America “that knows no East, no West, no North, no South.” Johnson was giving voice to a sentiment that seemed not only an aspiration but something that the nation could achieve once the civil-rights movement’s triumph was complete.
But even as Johnson spoke, Congress was only a year away from an immigration bill that would reopen America’s borders. Johnson’s own Great Society programs—plus Supreme Court decisions, changes in the job market, and the sexual revolution—would produce a lower class unlike anything America had known before. Changes in the economy and higher education would produce a new upper class that bore little resemblance to earlier incarnations.
Half a century after Johnson’s dream of a geographically and culturally homogeneous America, the United States is at least as culturally diverse as it was at the beginning of World War I and in some respects more thoroughly segregated than it has ever been. Today’s America is once again a patchwork of cultures that are different from one another and often in tension. What they share with the cultures of pre–World War I America is that they require freedom. In one way or another, the members of most of the new subcultures want to be left alone in ways that the laws of the nation, strictly observed, will no longer let them.
Contrary to the received wisdom, the least important source of this renewed cultural diversity is our changing ethnic mix. By now, almost everyone is familiar with the Census Bureau’s projection that whites will be a minority of the American population before midcentury. This is a momentous change in America’s ethnic mix at a national level. But it has caused, and will cause, little change in quotidian culture in the vast majority of American towns and cities, because changes in the ethnic mix of specific places have been so intensely concentrated. Consider the three major ethnic groups that are generating the approaching minority majority: blacks, Asians, and Latinos.
In the 2010 census, blacks constituted 12.6 percent of the population, a figure that has moved within a narrow range since the end of the Civil War. Blacks have been concentrated in the same places for many decades: the former states of the Confederacy and large urban areas in the Northeast and Midwest, plus large concentrations in Los Angeles and Houston. The diversity in quotidian culture introduced by African Americans is an important part of the American story, but it is confined to certain regions and cities, and the situation is unlikely to be any different 30 years from now than it has been for a long time.
Asians constituted just 5 percent of the population in the 2010 census. They are having an effect on quotidian culture in Silicon Valley, where they constitute about a third of the population, and in a handful of major cities. In other cities and towns, Asians are and will remain a few percent of the population, even if the proportion of Asians in the nation as a whole doubles or triples.
The approaching minority majority is primarily driven by the growing Latino population. In 1970, 9.3 million Latinos constituted less than 5 percent of the American population, concentrated in the border counties of Texas and Arizona, all of New Mexico, southern Colorado, southern California, and southern Florida, along with a large population of Latinos, mostly Puerto Ricans, in New York City.
From 1970 to 2010, the census shows an increase of 41.2 million Latinos. That’s a huge increase—but 71 percent of it was in the places I just mentioned, leaving just 29 percent of the increase in the Latino population to be scattered everywhere else in the country.
The upshot is that county-by-county maps of the Latino presence in 1970 and 2010 look remarkably similar. With the exception of a few cities—mainly Chicago, Washington, and Atlanta—places that had a minor Latino presence in 1970 still had a minor presence in 2010. So one should simultaneously hold in mind two different thoughts about cultural diversity fostered by Latinos:
- In the places where Latinos already constituted a significant presence before the surge, their presence is even more significant. This is a major event in the culture and politics of a limited number of cities and throughout the American Southwest.
- America as a whole is not being Latinized. Outside the areas where the Latino presence is concentrated, Latinos constitute a small portion of the population—6 percent. That’s far short of a percentage that has much effect on quotidian culture. Furthermore, the surge in the Latino population is in a prolonged pause and might be over. The best guess is that towns and cities with small Latino populations now will continue to have small ones for the foreseeable future.
I am not arguing that the changing ethnic mix is an insignificant part of the return of American cultural diversity, but that it is a backdrop to the larger story. The primary source of quotidian cultural diversity throughout American history and continuing today, independently of one’s ethnicity, religion, wealth, politics, or sexual orientation, is the size of the place where people live.
It is difficult to exaggerate how different life in a city of a million people or more is from life in a small city or town. I don’t mean that people in big cities lack friends or even that they cannot have an important sense of community in their neighborhood. I refer instead to differences in quotidian culture that bear on the nature of the role of government.
If you’re an urban dweller and you’ve got a problem with a water bill or getting your trash picked up, you must deal with an anonymous city bureaucracy. The policeman who arrives when your apartment is burgled is someone you’ve never seen before and will never see again. If you get into a dispute with a neglectful landlord or an incompetent contractor, there is probably no personal relationship that you can use to resolve the dispute; you have to take it to court.
By its nature, the big city itself is an unfathomably complicated machine. It has large numbers of people with serious needs of every kind, for which there is a profusion of government agencies that are supposed to provide assistance. The technological and administrative complexity of the infrastructure that provides police protection, firefighting, water, sewers, electricity, gas, and transportation in congested and densely populated places is staggering.
Now consider the other extreme: a small town or city. It might be 500 people, 3,000, or 15,000; it’s surprising how similarly communities function below a certain size. There’s no sharp cutoff point. For operational purposes, let’s say I’m talking about cities and towns of 25,000 people or fewer.
Daily life in such a place has a much different feel to it than life in the big city. For one thing, people of different ethnicities and socioeconomic classes are thrown together a lot more. There are only a few elementary schools at most, sometimes only one, and usually just a single high school. The students’ parents belong to the same PTAs and attend the same Little League games. The churches are centers of community activities, and while there are some socioeconomic distinctions among their congregations, the churches mix people up a lot.
Hardly anyone in a town or small city is anonymous. Policemen, sales clerks, plumbers, and landlords are often people you know personally. Even when you don’t, you’re likely to know of them—if the plumber’s last name is Overholtz, your parents might have known his parents, or your friend’s daughter married an Overholtz a few years ago, or in a dozen other ways you are able to place that person in the matrix of the town.
The same thing is true of whatever interactions you have with government in a town or small city. In the big city, postal clerks are so often brusque and unhelpful that the stereotype has become notorious. In a town, the postal clerk is more likely to add the necessary postage when you’ve under-stamped and collect later. It’s not that the United States Postal Service assigns its friendliest clerks to small towns; it’s the result of age-old truths about human interactions: When you know that an encounter is going to be one-time, it’s easier to be brusque and unhelpful than when you expect the encounter to be repeated. Repeated encounters tend to generate personal sympathies, understandings, and affiliations.
The mayor and city-council members of a small town are people you can phone if you have a problem. If you live in a small city, solving your problem might involve as little as a phone call to the right person in a municipal bureaucracy that numbers a few dozen people at most. Not every problem will get solved that easily, but, as a rule, the representatives of government in a town or small city are more reluctant to play the role of an “I’m just following the rules” official than is someone working in the bureaucracy of a city of a million. They are more willing and able to cut their fellow citizens some slack. It’s a variation on the reason a village postal clerk is likely to be helpful. Bureaucrats in towns and small cities aren’t faceless. They have to get along with the citizens they govern every day. In fact, they can’t even get away with thinking of their role as “governing” their fellow citizens. They have no choice but to be aware that they are, in fact, public servants.
As for social capital—the potpourri of formal and informal activities that bind a community together—the range and frequency of things that still go on in towns and small cities are astonishing. Such places have not been immune from the overall reduction in social capital that sociologist Robert Putnam documented in Bowling Alone. But in towns and small cities that still have a stable core of middle-class and blue-collar citizens, the traditional image of the American community survives in practice. These are still places where people don’t bother to lock the doors when they leave the house and the disadvantaged are not nameless “people on welfare,” but individuals whose problems, failings, and potentials are known at a personal level.
As cities get larger, the characteristics I have discussed shift toward the big-city end of the scale, but it happens slowly. The earliest change, and an important one, is that socioeconomic segregation becomes more significant. When a city is large enough to support two high schools, you can be sure that the students who attend each will show substantial mean differences in parental income and education. The larger the population, the more that churches will be segregated by socioeconomic class.
But many of the activities that go under the rubric of social capital continue. The churches remain important sources of such social capital, and so do the clubs such as Rotary, Kiwanis, American Legion, Veterans of Foreign Wars, and others that are still active in cities of up to a few hundred thousand people (and sometimes beyond). Midsize cities often have strongly felt identities, with solidarity and pride that carry over into concrete projects to make the community better. Even in cities of 300,000 or 400,000, the local movers and shakers are a small enough group that they can be brought together in a variety of ways, as members of a local civic organization or more informally, and they are often able to deal with local problems without a lot of red tape.
I could discuss these characteristics of life for still another kind of community, the suburbs of the great metropolises, but by now the point should be made: The sheer size of the places where people live creates enormous diversity in daily life, in the relationship of citizens to local government, in the necessity for complex rules, and in the ability of communities to deal with their own problems.
We aren’t talking about a small, quaint fraction of American communities that can deal with their own problems. Conservatives are often chastised for confusing today’s highly urbanized America with an America of a simpler time. I think the opposite mistake is a bigger problem: assuming that most of America is like New York or Chicago. As of the 2010 census, 28 percent of Americans still lived in rural areas or in cities of fewer than 25,000 people. Another 30 percent lived in stand-alone cities (i.e., not satellites to a nearby bigger city) of 25,000–499,999. Fourteen percent lived in satellites to cities with at least 500,000 people. Twenty-eight percent lived in cities with contiguous urbanized areas of more than 500,000 people—the same proportion that lived in places of fewer than 25,000 people.
My proposition is that people in places other than the megalopolises need a lot less oversight from higher levels of government than they’re getting. Their municipal governments need a lot less supervision from state and federal government than they’re getting. For cities under 500,000, a compelling case can be made that their citizens should be given wide latitude to make their own decisions with only basic state and federal oversight. Once we’re down to cities under 25,000, I think that case becomes overwhelming, with access to a few block grants (carrying only the most basic bureaucratic strings) being nearly the only role that higher levels of government need to play.
Many other forms of cultural diversity have been growing in recent decades. The American population has been sorting itself into conservative communities and liberal ones, gay ones and straight ones, religious ones and secular ones. I have devoted a book, Coming Apart, to describing the cultural divide now separating a new upper class and a new lower class that have coalesced over the past half century. But spelling out those kinds of sorting here would be overkill. My point is not really a matter of dispute: The re-diversification of America has produced a complex array of ways in which American communities differ from one another, and those communities should be permitted to express those differences.
The essence of the American goal at the Founding, however imperfectly realized, was to create a society in which people are allowed to live their lives as they see fit as long as they accord the same freedom to everyone else, with the federal government providing a peaceful setting for that endeavor but otherwise standing aside. Beginning with the New Deal, and accelerating from the 1960s onward, that goal was intermingled with other priorities and other agendas. What made America unique first blurred, then faded, and is now almost gone. But the reality of America, at the Founding and today, is that “living life as one sees fit” has very different definitions in different cultural pockets of the country. It is time to consider whether it is not only desirable, but practicable, to return to governance that comes closer to the Founders’ ambition.