PRB | Population Bulletin
PRB On-Line: www.prb.org

America's Diversity and Growth: Signposts for the 21st Century

by Martha Farnsworth Riche

Population Bulletin, Vol. 55, No. 2, June 2000
Table of Contents

Introduction
Population Growth
Sources of Change
Where Americans Live
Racial and Ethnic Diversity
Age Profile
Family Life
Households and Families
Measures of Well-Being
Future Prospects
Suggested Resources
References

This Population Bulletin, published in June 2000, discusses many of the "signposts" of the U.S. population, including robust population growth, increasing life expectancy, continued immigration, changes in the family, increased education levels, and population growth outside urban areas.

Introduction

At the beginning of the 21st century, demographic trends seem to many Americans to signal new, potentially disquieting changes in the U.S. population. Americans at the beginning of the 20th century also worried about unfamiliar developments in the population.

Population trends inevitably reflect fundamental changes in the economy and in the world. Such changes influence people's choices as they form families, seek economic and physical well-being, and move to places where they see opportunities. But viewed from the perspective of a century, population trends seem marked as much by stability as by change.

At the beginning of the 20th century, many Americans were concerned about slowing population growth, particularly because it meant that immigration was shifting the country's ethnic balance. Then, as now, the United States was a nation of robust population growth fueled in part by immigration — but growth rates were lower than Americans had been used to, fertility was declining, and immigrants were coming from different countries than they had in the past. The story at the end of the century is similar. Population growth rates have slowed, although at least 2 million people are added each year. Once again large numbers of immigrants are coming from different places. And immigration plays a large part in population growth because U.S. fertility is low.

Geographic mobility within the United States has been another constant source of change over the last 100 years. Today's Americans continue to move to better their circumstances. At the beginning of the 20th century, industrialization was propelling the growth of big cities, particularly in the Midwest and Northeast. By mid-century, there was a large-scale population shift to what became known as the Sun Belt, while people all over the country moved from city centers to suburbs.

By the end of the century, advances in communications and transportation technology allowed many Americans to realize a dream of small-town life engendered by images of safe and supportive communities from a century ago.

The U.S. population is significantly more diverse racially and ethnically now than it was in 1900. Contemporary immigrants from Latin America and Asia are joining the African-Americans and American Indians who have always been part of the U.S. population. These immigration trends are likely to continue as the less developed countries that send most immigrants experience an unprecedented surge in the number of young people in their populations, thanks to improved child survival and high birth rates. Young adults in these countries, as elsewhere, have the greatest propensity to move, particularly when jobs are scarce at home. And many will seek jobs in the United States.

Racial and ethnic definitions are relatively fluid and depend in part on how people perceive themselves, and how they are perceived by the society in which they live. More unions between Americans of different racial and ethnic groups are resulting in more children of mixed racial heritage, for example, and it is impossible to predict how these children will choose to identify themselves once they are grown. If race and ethnic definitions remain the same, and so do immigration, fertility, and mortality patterns, minority groups will continue to grow faster than the non-minority population. According to current projections, non-Hispanic whites will make up barely one-half of the population by 2050 and will lose their majority status by 2060.

For individual Americans, the most important population trend of the 20th century may well have been the stunning increase in their life expectancy. Combined with relatively low fertility rates over most of the century, the result is a new age profile for the population. In the 21st century, the nation's institutions and expectations will have to adapt to a population in which there are roughly equal numbers of people in all age groups, rather than the old pattern in which there were many more young people than older people.

Longer life expectancy has had a significant effect on Americans' family lives, because it extends the number of years people live after their children are grown. Now Americans spend less than half their adult life rearing children, compared with most of adult life for couples at the beginning of the 20th century. This change has important implications for women especially, as they are now free to "have it all," sequentially, if not concurrently. It also allows for new, companionable relationships between adult children and still-active parents, before caregiving is called for at the other end of the life span.

In some ways, family patterns at the end of the century are like those that prevailed at its beginning. Young people then were more likely to wait for marriage until they were fully launched in adult life, and significant numbers appeared likely never to marry at all.

An important new trend in the closing decades of the century was the replacement of early marriage by cohabitation. Cohabitation also has become common for Americans between marriages. And marriages now are more likely to end in divorce than in death. Nevertheless, the rising divorce rate that was a concern in 1900 seemed to have stabilized by the 1980s.

The prevalence of single-parent families is about the same at the end of the 20th century as it was at the beginning, but the reasons are very different. In the early part of the century, single parenthood was usually the result of death; in the latter part it was generally the result of choice by at least one parent.

For married parents, the longer average life expectancy has extended the years of married life after children were grown. Elderly parents (predominantly women, because of their longer life expectancy) became far less likely to move in with adult children, especially after 1950. This combination of social trends and mortality improvements has made two household types more common than the traditional married couple with children: married couples without children and people living alone.

Although many aspects of American life are similar at the beginning and end of the 20th century, one set of trends is very different. In 1900, the United States was still largely an agricultural economy, although one in which industrialization was well underway. By the end of the century, the United States had become largely a service economy, although one in which manufacturing still plays an important part. The American population has responded to this shift with continual improvement in its educational attainment. At the beginning of the century, literacy (generally considered four years of schooling) was the principal educational benchmark. At the end of the century, a high school diploma has become the benchmark, and increasing shares of the population acquire college degrees.

Economic rewards are greater, too, especially for Americans with postsecondary education. As they did at the beginning of the century, women now have about the same opportunity as men to attend college. And, civil rights programs begun in the 1960s opened new doors to minorities. The combination of a broader range of job opportunities and acceptance of more women and minorities in the work place has granted a wider array of Americans economic success. But it has left many behind, including people who have jobs but earn so little that they are officially poor.

Population Growth

Population growth has shaped the United States from its beginnings, and it continues to do so. In January 2000, The Christian Science Monitor reported that "almost alone among the developed nations, the U.S. population is growing robustly."1 The country's population keeps growing through relatively high fertility, new arrivals from other countries, and increasing life expectancy.

Figure 1
U.S. Population Growth, 1900 to 2000 and Projections to 2100

The current rate of population growth — 1.0 percent a year — is just over one-half that of 1900 (1.9 percent a year). But the rate is being applied to a much larger population now. At the beginning of the 20th century, the population numbered nearly 76 million; at the end, it numbers almost 273 million — more than tripling over the century (see Figure 1). At current rates of growth, the United States is adding twice as many people to its population each decade as it was a century ago.
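The arithmetic behind these rates can be checked directly. A brief illustrative calculation using the rounded figures quoted above (so the results are approximate, not official estimates):

```python
# Figures from the text: ~76 million in 1900, ~273 million in 2000.
pop_1900 = 76_000_000
pop_2000 = 273_000_000

# Average annual growth rate implied by compound growth over 100 years
avg_rate = (pop_2000 / pop_1900) ** (1 / 100) - 1
print(f"implied average annual growth: {avg_rate:.1%}")  # about 1.3%

# Annual additions at the endpoint rates quoted in the text:
# a lower rate applied to a much larger base adds more people.
print(f"1900: ~{pop_1900 * 0.019 / 1e6:.1f} million added per year")  # ~1.4 million
print(f"2000: ~{pop_2000 * 0.010 / 1e6:.1f} million added per year")  # ~2.7 million
```

The last two lines show why a growth rate "just over one-half" the 1900 rate still adds roughly twice as many people each year.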

In the 1990s alone, the U.S. population grew by nearly 25 million people — more than in every decade but one in the nation's history. Only the 1950s added more people — 28 million — at the height of the post-World War II baby boom that began in 1946 and ended in 1964.

The population outlook for the 21st century is for more growth. According to current projections from the U.S. Census Bureau, the U.S. population is expected to more than double to 571 million by 2100, although the growth rate is expected to slow to 0.7 percent a year.2 The United States should add more people during the first decade of the new century — nearly 27 million more Americans — than it did during the last decade of the old century. In 2025, the United States still will be the world's third most populous nation — after China and India, and just ahead of Indonesia.3

The prospect of such robust growth intensifies concerns about pollution and other environmental threats, as the increasing numbers of Americans can be expected to heighten the demands on shared resources like land, air, and water. At the same time, many seem to think this growth barely sufficient, as they echo longstanding beliefs that population growth is inextricably linked to the nation's prosperity.4

Sources of Change

Three factors determine population change within a geographic area: how many people are born to the population (fertility), how many die (mortality), and how many enter from, or move to, another area (net migration).
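These three components combine in the demographic balancing equation. A minimal sketch, using hypothetical round numbers rather than figures from the Bulletin:

```python
def population_change(births, deaths, immigrants, emigrants):
    """Demographic balancing equation:
    change = natural increase (births - deaths) + net migration."""
    natural_increase = births - deaths
    net_migration = immigrants - emigrants
    return natural_increase + net_migration

# Illustrative (hypothetical) annual figures, in thousands:
change = population_change(births=3900, deaths=2300,
                           immigrants=900, emigrants=200)
print(change)  # 2300 thousand, i.e. growth of about 2.3 million
```

With numbers of this general magnitude, natural increase and net migration together add a couple of million people a year, consistent with the growth described above.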

Fertility

During the 1990s, the total fertility rate (TFR) — the average number of children a woman would bear if current age-specific birth rates held throughout her childbearing years — remained relatively constant. It hovered just below 2.1 births per woman — the level necessary for the population to replace itself. The most recently published TFR of 2.06 (for 1998) is the highest of the decade, and is considerably higher than the average rate of 1.5 for the industrialized world as a whole. The reasons for and effects of the United States' higher fertility are hotly debated among social scientists and political polemicists. Ranked by current TFR, the United States is in the company of such countries as Thailand and Albania, rather than Canada and the United Kingdom (see Box 1).5
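Mechanically, the TFR is built up from age-specific birth rates: sum the single-year rates, or multiply five-year group rates by five. A sketch with hypothetical rates chosen to land near the 1998 level:

```python
# TFR from age-specific fertility rates (ASFRs).
# Rates are births per woman per year for five-year age groups;
# the values below are hypothetical, not published NCHS figures.
asfr = {
    "15-19": 0.050,
    "20-24": 0.110,
    "25-29": 0.115,
    "30-34": 0.085,
    "35-39": 0.038,
    "40-44": 0.007,
    "45-49": 0.001,
}

# Each five-year group contributes five years of exposure at its rate.
tfr = 5 * sum(asfr.values())
print(f"TFR = {tfr:.2f}")  # TFR = 2.03
```

Because the TFR applies today's rates to a synthetic woman's whole reproductive span, it can shift when women merely delay births, even if completed family size does not change.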

At the beginning of the 20th century, U.S. fertility levels were in the midst of the long-term decline that accompanied the industrialization and urbanization of the country. Native-born white women born between 1871 and 1875 had an average of 3.5 children by the time they were ages 45 to 49 in 1920. Similar women born between 1906 and 1910 had just 2.2 children by 1955, when they were ages 45 to 49.6 But the 20th century saw a marked, though temporary, increase in American fertility during the "baby boom" years that followed World War II. Women born between 1930 and 1935, who were in their prime childbearing ages during the baby boom, had about 3.2 children by the time they reached their 50s.

Americans born during the baby boom were the nation's most numerous generation ever, and they broke new demographic ground as they entered each life stage. As adults, many baby boomers postponed childbearing to later ages, leading to what some people called a "baby bust" during the 1970s and early 1980s. Between 1973 and 1988, the U.S. TFR remained below 2.0, in part because large numbers of baby-boom women continued their education beyond high school and entered the labor force. The median age at first birth for American women rose from 21.8 in 1960 to 22.3 in 1975, and to 23.5 in 1980.

The generation following the baby boom waited even longer to enter motherhood. The average age at first birth rose to 24.3 by 1998. The share of women not having any children rose, while the share having more than three children dwindled, which helped keep the TFR around 2.0.7

Educational attainment has a significant influence on how many children women have. Women tend to delay childbirth until they have completed their education. This delay reduces the number of potential years in which women can become pregnant. Also, the more education women have, the more likely they are to join the labor force and to start a career, which both delays and reduces their childbearing. The increasing educational attainment of the American population has contributed to its lowered fertility.8

Differences in educational attainment also explain some of the difference in fertility among women of different racial and ethnic origins. Hispanics, for example, have the lowest average educational attainment and the highest average fertility of U.S. racial and ethnic groups. Asians and non-Hispanic whites have the highest educational attainment and the lowest fertility.9

Even during the baby-bust years, the U.S. population continued to grow through an excess of births over deaths. The high fertility rates of earlier generations meant that there were more women having children, even if they were having fewer of them. In addition, net immigration contributed more people of childbearing ages each year.

Table 1
U.S. Life Expectancy at Birth and at Age 45, by Sex, 1900 to 1998
Life expectancy in years
At birth At age 45
Year Both sexes Men Women Both sexes Men Women
1900 47.3 46.3 48.3 24.8 24.1 25.4
1910 50.0 48.4 51.8 24.5 23.8 25.4
1920 54.1 53.6 54.6 26.3 25.8 26.7
1930 59.7 58.1 61.6 25.8 24.9 26.9
1940 62.9 60.8 65.2 26.9 25.5 28.5
1950 68.3 65.6 71.1 28.5 26.6 30.6
1960 69.7 66.6 73.1 29.5 27.1 32.1
1970 70.8 67.1 74.7 30.1 27.2 33.1
1980 73.7 70.0 77.4 32.3 29.2 35.2
1990 75.4 71.8 78.8 33.4 30.7 35.9
1998 76.7 73.9 79.4 34.2 32.0 36.3
Sources: U.S. Census Bureau, Historical Statistics of the United States, Colonial Times to 1970 (1975) and National Center for Health Statistics, National Vital Statistics Reports 47, no. 25 (1999): Table 16.

Mortality

Life expectancy grew throughout the 20th century, thanks to advances in medical care, improvements in sanitation and public health, and increases in general knowledge about how to protect health. During the early part of the century, mortality rates fell faster for infants and children than for older Americans, which increased the number of children who survived to have children of their own. Later in the century, health improvements focused on adult conditions and diseases, and increased the numbers of people who survived to old age.

Demographers estimate that about half of today's population would not be alive if mortality rates were the same in 2000 as they were in 1900. Mortality decline contributed more to population growth over the last 100 years than did immigration.10

Around 1900, a baby born in the United States could expect to live barely 47 years — baby boys to age 46 and baby girls to age 48. By 1960, boy and girl babies could expect to live at least 20 additional years — males to age 67, females to age 73. By 1998, a newborn boy could expect to live to age 74, a newborn girl to age 79 (see Table 1). Some researchers consider current estimates too conservative, suggesting that biological, medical, and gerontological breakthroughs make life expectancy for today's baby girls closer to 95 or 100 years, with baby boys not far behind.11 Current Census Bureau projections estimate, however, that babies born in 2100 will have a life expectancy of 88 years for boys and 92 years for girls.
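Life expectancy at birth is, mechanically, the area under a cohort's survival curve. A minimal sketch using a toy constant hazard (real mortality rises steeply with age, so the numbers are purely illustrative and not drawn from Table 1):

```python
def life_expectancy(survival):
    """Approximate e0 as the area under the survival curve
    (trapezoidal rule over single-year ages)."""
    return sum((a + b) / 2 for a, b in zip(survival, survival[1:]))

# Toy survival curve: a constant hazard of 1.25% per year of age.
# l(x) = proportion of a birth cohort still alive at exact age x.
hazard = 0.0125
survival = [(1 - hazard) ** age for age in range(121)]

e0 = life_expectancy(survival)
print(f"e0 = {e0:.1f} years")  # e0 = 61.9 years
```

The same calculation applied to an actual life table's survivorship column is how the figures in Table 1 are derived; improving survival at any age raises the curve and hence the area under it.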

These improvements reflect a dramatic evolution in the leading causes of death. In 1900, infectious diseases such as influenza, pneumonia, tuberculosis, and typhoid were among the leading causes of death. Between the early 1900s and late 1990s, the death rate for influenza and pneumonia fell by 83 percent (from 202 deaths per 100,000 persons to 35 deaths per 100,000), while death rates for tuberculosis and typhoid fell to negligible levels.

The sharp reduction in deaths from infectious diseases means that more people survive into later life, thus increasing their likelihood of dying from degenerative diseases associated with older ages, such as heart disease, cancer, and stroke. Death rates from cancer more than tripled between 1900 and 1990, for example. Death rates for cardiovascular diseases, including heart disease and stroke, also rose — 51 percent between 1900 and 1960 — but then declined to levels just above those at the beginning of the century — about 350 deaths per 100,000 persons. Increased public awareness of risk factors — like smoking and high blood pressure — and better medical treatment have helped reduce deaths from cardiovascular diseases, although they remain the leading causes of death in the United States.12

Many people assume that Americans who survive into old age are simply gaining years that will be plagued by disability and frailty. Research conducted during the 1980s and 1990s found, however, that each new American generation entering its 80s had fewer disabilities than the generation that preceded it. These results suggest that healthy life expectancy, defined as life without disabilities that constrain the activities that are part of daily living, may be rising as rapidly as life expectancy.13 A recent study showed that a 20-year-old white woman could expect at least another 50 years free of disabilities (see Figure 2).

Migration

The relatively low rates of fertility and mortality, which characterize all more developed countries, make migration a prime source of U.S. population growth. Net migration (immigration minus emigration) contributed about 30 percent of the increase in the population during the last decade of the 20th century, about the same as during the first decade of the century.14

Figure 2
Active Life Expectancy Among 20-Year-Old Women, by Race and Ethnicity, 1990

Between 1901 and 1910, 8.8 million people were granted immigrant status in the United States — setting a record that may only be broken when all the data are in from the 1990s. Between 1991 and 1998, 7.6 million people were admitted as immigrants. In contrast, only 0.5 million people were granted immigrant status between 1931 and 1940, when the United States was in the depths of the Great Depression. Only 1.0 million arrived in the decade that followed.15

The U.S. population is generally defined as its resident population, that is, all the people who live in the United States. Recent censuses also include Americans who are overseas as government employees and thus have a U.S. address that can be reported by the government. The foreign-born population, which includes large numbers of business people, diplomats, temporary workers, and other foreigners (or non-immigrants) living in the country temporarily, grew from 19.8 million in 1990 to 25.2 million in 1998.16 The 1998 figure is almost two and one-half times the 10.3 million foreign born living in the United States in 1900. Because the U.S. population is so much larger now, the foreign-born account for a smaller share of the population: just 9 percent in 1998, compared with nearly 14 percent in 1900.

Figure 3
U.S. Immigration, 1900 to 1998

Immigration was particularly high at both the beginning and the end of the 20th century, with about 1 million immigrants arriving annually during the peak years (see Figure 3). Both periods were characterized by a fundamental restructuring of the nation's economy, then from agriculture to industry, and now to services and information. Both periods saw immigrants coming from parts of the world with ethnic backgrounds different from those of the majority. Thus, both waves of immigration added to general unease about how fundamental changes in the economy and the population would alter the country.

In the early 1900s, the origins of U.S. immigration flows shifted from Northern Europe to Eastern and Southern Europe. People with unfamiliar languages and different religions swelled the growing industrial cities. An anti-immigrant backlash among the Protestant, Northern European, and rural American majority prompted the U.S. Congress to close the country's open door to a crack in the 1920s.17 In 1965, momentum provided by the civil rights movement caused Congress to reopen the door by removing racial and ethnic restrictions on immigrants. Now immigrants are coming from Latin America, Asia, and elsewhere, making the population ever more diverse (see Figure 4).

Immigrants also make the United States younger than other industrial countries because people are most likely to migrate while they are young adults. These young immigrants are also in their prime family-building years, and they contribute further to population growth through their children. In 1998, nearly 20 percent of all U.S. births were to foreign-born women.

Where Americans Live

Residential mobility has been a fundamental characteristic of the American population since colonial times, and the 20th century took this tradition in new directions. At the beginning of the century, industrialization propelled urban growth, and trade favored cities that were at the intersection of transport networks, particularly in the Northeast. As the century wore on, improvements in agricultural productivity continued to reduce the need for agricultural labor, and made small farms less economically viable.

A stream of people flowed from farms to urban communities small and large. By mid-century, Americans all over the country were choosing new neighborhoods in suburbs that ringed city centers. For a variety of reasons, including the diffusion of air conditioning and the relocation of manufacturing industries to the South from the Northeast and Midwest because of foreign competition, large numbers of people moved to states in the South and Southwest.18

These trends contradicted Americans' persistently stated preference for life in small towns and cities, so it was not surprising that in recent decades many chose to act on this preference. During the 1970s, population grew more in nonmetropolitan areas than in metropolitan areas, and many demographers hailed a "rural renaissance" that seemed to vanish during the 1980s. Still, the population is continuing to grow in such sparsely populated but attractive places as the Mountain states and in nonmetropolitan counties near large towns and cities. Perhaps more important than debates over a possible "rural renaissance," new communications technology and new ways of doing business simply offer Americans a greater array of residential and geographic choices.19

Regional Shifts

The nation's population increased in every state over the 20th century, but at century's end, Americans were more evenly distributed among the four regions than they were at the beginning, largely due to steady and rapid population growth in the West. At the beginning of the century, fewer than 6 percent of Americans lived in the West; today, more than 22 percent live there (see Figure 5). Westerners now outnumber people living in the Northeast and are nearly as numerous as Midwesterners.

Figure 4
U.S. Immigrants by Region and Country of Birth, Selected Years, 1901 to 1997

More than three out of five Americans (62 percent) lived in the Northeast and Midwest in 1900. All four regions grew throughout the century, but population growth came to a near stall in these two regions in the 1970s, when many Northerners headed to states in the South and Southwest, often referred to as the Sun Belt.

The Midwest's share of the nation's population declined the most — from 34 percent to 23 percent, while the Northeast dropped from 28 percent to 19 percent to become the nation's least populous region.

Figure 5
U.S. Population by Region, 1900 and 1999

The South's share of the nation's population fluctuated over the century, declining slightly through mid-century as large numbers of poor rural people, particularly African-Americans, moved to such northern and midwestern cities as New York, Detroit, and Chicago. In the latter part of the century, however, growing economic opportunities and more tolerant attitudes toward civil rights attracted new migrants: by 1999, the South had 36 percent of the nation's population — four percentage points more than in 1900.

This geographic shift changed the electoral weight of states, leading national political strategists to rethink the logistics of presidential campaigns. The four most populous states in 1900 are still in the nation's top 10, though ranked slightly lower: New York, Pennsylvania, Illinois, and Ohio. California, now the nation's most populous state, ranked only 21st in 1900, while Florida, now the fourth most populous state, ranked 33rd. In 1900, nine of the 10 states with the largest populations were in the Northeast or the Midwest. Today, three of the four largest states are in the South or the West (see Table 2).

At the beginning of the 21st century, the fastest growing state populations are mostly in the West, with rapid growth in Georgia and Florida as well. In a mirror image, the slowest growing states are found everywhere but the West. Three states — Rhode Island, Connecticut, and North Dakota — lost population during the 1990s.

Urban and Rural Population

Americans tend to forget that the nation's first settlements were towns, and that the urban population kept pace with the rural population in the early 19th century, accounting for a steady 6 percent to 7 percent of the population, before the urban share began to spurt ahead. The settlement of the West in the 19th century was structured around towns, with streets laid out in checkerboards awaiting the new arrivals.20 But the 20th century saw the completion of Americans' transformation into an urban population. In 1900, 40 percent of the population lived in urban areas; by 1990, 75 percent did.

Urban growth over the 20th century took place in cities of all sizes. In 1900, 8.5 percent of the population lived in a city of 1 million or more inhabitants, while nearly 14 percent lived in communities with 2,500 to 25,000 people.21 In 1990, cities of at least 1 million inhabitants still accounted for about 8 percent of the U.S. population. The most impressive urban growth was in small and medium-sized cities. These cities accounted for 67 percent of the population by 1990 (see Figure 6).

Three cities had over 1 million people in 1900: New York (3.4 million), Chicago (1.7 million), and Philadelphia (1.3 million). By the end of the century, six more cities had joined them, all in the West or Southwest: Los Angeles, Houston, San Diego, Phoenix, San Antonio, and Dallas. Except for Los Angeles, none of these cities were among the 50 largest in 1900. Many older cities in the Northeast and Midwest have fewer people now than they had in 1950. St. Louis, Buffalo, and Boston, in fact, had fewer people in 1998 than they had in 1900.

Comparisons among cities are not necessarily meaningful because some cities, like Boston, have relatively permanent boundaries, while others, like Phoenix, keep incorporating new growth on the edges. In some places, "edge cities," as journalist Joel Garreau has named them, are more vibrant in terms of economic and population growth than the cities they border.22 In any case, improvements in transportation and communication networks have blurred the boundaries between city, suburb, and country, so that people who live in one may work in another. This blurring gave rise to the concept of the metropolis, which recognizes the economic and social interdependence of populations that are physically close.

Metropolitan America

The rise of metropolitan areas has been the most important development in population distribution in recent decades. The 1910 Census was the first to identify such "greater cities," and the 1990 Census was the first to report that a majority of the population lived in "major" metropolitan areas (those with more than a million people).23 The nation's nonmetropolitan population continues to grow, but slowly, and its share of the total population continues to decline while the space it occupies on the map of the United States shrinks. The entire population of New Jersey lives within metropolitan area boundaries, as does more than 95 percent of the population in California, Connecticut, and Massachusetts.24

Table 2
Ten Most Populous U.S. States in 1999, and Rank in 1900
1999 Rank State Population 1900 Rank
1 California 33,145,121 21
2 Texas 20,044,141 6
3 New York 18,196,601 1
4 Florida 15,111,244 33
5 Illinois 12,128,370 3
6 Pennsylvania 11,994,016 2
7 Ohio 11,256,654 4
8 Michigan 9,863,775 9
9 New Jersey 8,143,412 16
10 Georgia 7,788,240 11
Source: U.S. Census Bureau. Accessed online at: www.census.gov/population/estimates/state/st-99-2.txt, on Dec. 31, 1999.

Metropolitan areas consist of a central city and its suburbs, and metro boundaries are defined primarily by commuting patterns measured by the decennial census. Statisticians also evaluate population density and other characteristics when deciding which counties to include in a metropolitan area. During the 1950s and 1960s, suburbs were often referred to as "bedroom communities," because people slept there while they worked, shopped, and found their entertainment in the city center. By the end of the 1990s, more than 60 percent of the metropolitan population was suburban.25 Moreover, many leisure and economic activities had also moved to the suburbs, particularly new enterprises such as technology and information-based industries. Suburban "sprawl" was no longer a concern confined to Southern California.

Aside from simply measuring their growth, the delineation of metropolitan areas acknowledges that while adjacent populations may have different residential and employment characteristics, they have common concerns that require a framework in which to assess and address them. Such issues as clean air and water affect all the people in a metropolitan area, whether they live in a high-rise building in the central city, a townhouse in a suburb, or an old farmhouse in a nearby rural area.

Figure 6
U.S. Population Living in Rural and Urban Areas, 1900 and 1990

The availability of such different residential choices for some Americans but not others disturbs those concerned about "demographic balkanization." This term describes the process through which people are separated by race, ethnicity, class, and age across regions and metropolitan areas. These demographic differences are reinforced by internal and international migration.26 This situation reflects several trends. One is that metropolitan areas with diversified economies, including advanced service and knowledge-based industries, are thriving, as are those that attract people interested in recreation or retirement. Areas that lack these advantages are stagnating. The economically vibrant areas are more likely to attract and retain college-educated people, who have a broad range of residential and employment preferences, while the stagnant areas tend to have less-educated populations — those less able to move to areas with better job opportunities.

There are also racial differences, partly as a result of historic residential patterns, such as the large African-American population in the Southeast and the large Hispanic population in the Southwest. These differences were enhanced as the majority of late 20th-century minority immigrants headed to a few states, mostly on the East or West Coasts. Some researchers have suggested that the resultant competition for jobs and housing in coastal states led unskilled U.S.-born workers, particularly non-Hispanic whites and non-Hispanic blacks, to migrate to places in the country's interior. Others say that industrial restructuring was largely responsible because it reduced many blue-collar jobs in former manufacturing centers, and prompted less-educated workers to move in search of new opportunities.27 In any case, the racial and ethnic composition of the country continues to be very different across regions and states.

Within metropolitan areas, people continue to live near others who share their income, age, and educational levels. As time has passed, many of these neighborhoods have "aged," and suburbs that once featured young families with children are grappling with the very different needs and preferences of older people. Meanwhile, the revitalization of central cities and older suburbs has made many close-in neighborhoods newly attractive. Some inner-city neighborhoods have been renovated and "gentrified" by higher-income populations; some older suburbs are undergoing "mansionization," as well-off workers interested in short commutes enlarge smaller homes and add luxurious amenities. In both cases, neighborhood demographics can change — from low-income to high-income, from one racial group to another, or from older residents to younger families — creating different sets of needs and expectations.

Geographic diversity has required organizations that depend on attracting large numbers of people to develop strategies that unite people across demographic boundaries. Businesses such as shopping centers or professional sports undertake extensive research to find the values, attitudes, and preferences their potential audiences share, no matter where they live, and to identify ways to reach them. Similarly, political campaigns, educational institutions, churches, and even hospitals are reaching out to find their "own" populations across geographic lines.

Not all individuals, organizations, or activities have the power to cross such lines. The movement of residential and economic activity to the suburbs trapped many, particularly low-income minorities, in residences distant from job opportunities and lacking adequate transportation to get to jobs elsewhere. Most cities and other independent jurisdictions within metropolitan areas are not represented by an overall metropolitan governance structure that can address their interests along with those of more wealthy residents from neighboring jurisdictions. Many cities have lost tax revenues along with employers and upper-income residents. As a result, race and class divisions between central cities and suburbs intensified around the country in the latter part of the 20th century. The emerging dominance of the suburbs has rendered city populations less politically and socially powerful, primarily because they are poorer and disproportionately minority.

Racial and Ethnic Diversity

The U.S. population has always been multiracial, and it is becoming even more diverse now than it was at its founding two centuries ago. Although the nation's founders considered the United States a country by and for white people, significant shares of the original population were American Indian or black. The early censuses only counted Indians who were taxed, so estimates of the Indian population are vague. Blacks were enumerated for apportioning political representation among the states, and represented roughly 20 percent of the population that was counted in 1790.28

Immigration trends and legal restrictions on immigration and citizenship brought the share of whites in the population to about 90 percent by 1900. Nearly all of the remaining 10 percent was African-American. The decline in the American Indian population that began with the Europeans' arrival reached its nadir of 237,000 that year, and there were only 114,000 people of Asian origin, mostly Chinese.29 (Although the nation inherited a large Hispanic population with the acquisition of territories in the South and Southwest, these populations were not measured separately until 1970.)

At the end of the 20th century, these proportions were significantly different. Non-Hispanic whites now represent 72 percent of the population, while the minority population is more diverse as well as more numerous. Non-Hispanic African-Americans slightly outnumber Hispanics, but each group accounts for about 12 percent of the population. Asian and Pacific Islanders account for nearly 4 percent. Although the number of American Indians (and Alaska Natives) nearly tripled over the century, they account for less than 1 percent of all Americans.

Sources of Diversity

U.S. racial and ethnic origin groups generally grow in two ways: from natural increase (the excess of births over deaths) or net immigration (immigration minus emigration). Natural increase is largely responsible for the changes in the absolute numbers and percentage share of the white, black, and American Indian populations. Immigration along with natural increase has driven the increase in Hispanics and Asians. A third source of growth is change in self-identification, but this has been most important for American Indians. In recent decades, the public's increasing recognition of Indian heritage and identity has encouraged a growing share of people who are both white and American Indian in origin to identify themselves as American Indian.
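The components-of-growth accounting described above can be sketched in a few lines of code. This is only an illustration of the arithmetic, not an official method; the input figures in the usage example are hypothetical round numbers, not published statistics.

```python
def population_change(births, deaths, immigrants, emigrants):
    """Annual population change for a group:
    natural increase (births minus deaths) plus
    net immigration (immigration minus emigration).
    Change in self-identification, the third source the text
    mentions, is not captured by this simple accounting."""
    natural_increase = births - deaths
    net_immigration = immigrants - emigrants
    return natural_increase + net_immigration

# Hypothetical figures for illustration only:
# 4.0 million births, 2.4 million deaths,
# 1.0 million immigrants, 0.2 million emigrants
change = population_change(4_000_000, 2_400_000, 1_000_000, 200_000)
print(change)  # 2,400,000: 1.6 million natural increase + 0.8 million net immigration
```

Comparing the two components for each group shows why Hispanic and Asian growth is described as immigration-driven while white, black, and American Indian growth is driven mainly by natural increase.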

Figure 7
Projected U.S. Racial and Ethnic Composition, 1999 to 2050

In the mid-1960s, the U.S. Congress changed the makeup of immigrants when it eliminated the admission criteria that had been introduced in the 1920s. The old quota system sought to maintain the national origins of the population at the turn of the century and thereby favored the entry of European-origin whites and excluded most Asians and other nonwhites.

The elimination of quotas fostered new immigration flows from less developed countries, where large numbers of people wanted to emigrate in search of economic opportunities and political freedom. At the same time, the nation instituted a new guiding principle for immigration policy — family reunification — which perpetuated and augmented immigration from less developed regions. Between 1980 and 1998, nearly three-quarters of all new immigrants came from Asia and Latin America, while about 20 percent came from Europe and about 4 percent from Africa. This was a significant reorientation of immigrant source countries. As recently as 1950, roughly two out of three immigrants came from Europe and Canada. Consequently, recent immigration has contributed to the significant growth in the numbers and proportion of the nation's largest minority populations.

The relative contribution of recent immigration to each racial and ethnic origin group is reflected in the foreign-born population. In 1998, about 43 percent of the foreign-born were Hispanic, 26 percent were white, 25 percent were Asian and Pacific Islander, and 7 percent were black.30

The 26 million foreign-born U.S. residents made up 9 percent of the total U.S. population in 1998. In sharp contrast, 63 percent of the Asian and Pacific Islander population had been born outside the United States, as had 35 percent of the Hispanic population. Only 5 percent of non-Hispanic blacks were foreign-born, as were 3 percent of non-Hispanic whites, and less than one percent of American Indians.31

Some immigrants are fleeing political upheavals, even violence, in their native lands, but most are seeking economic opportunities. Recent decades have seen a spurt in the number of young adults in less developed countries. Widespread expansion of public health knowledge and practices reduced mortality rapidly, especially for infants and youths. As a result, more than half the population of less developed countries is under age 30.32 With or without the transformation of economies under an increasingly global regime, it would be hard for these countries to fit such a surge of young adults into their labor force. For example, almost as many young Mexicans reached age 15 each year during the 1990s as young Americans, yet the Mexican economy was only one-tenth the size of the U.S. economy. It is not surprising that there were not enough jobs in Mexico, let alone well-paying jobs, for such an exceptionally large generation.

Current projections suggest that immigration will keep American minority groups growing briskly. The most recent projections from the Census Bureau foresee more than 1 million immigrants annually. After subtracting projected emigration, the Bureau estimates that immigration will add 468,000 Hispanics and 229,000 Asians annually to the U.S. population until 2025, along with 161,000 non-Hispanic whites and 93,000 non-Hispanic blacks. Hispanic immigration is expected to ease over the next quarter century, while that of non-Hispanic whites and blacks, and especially Asians, is projected to increase.

The Census Bureau projects that the share of minorities in the population will rise from 28 percent in 1999 to 47 percent in 2050.33 By 2010, according to these projections, Hispanics will outnumber non-Hispanic African-Americans to become the nation's largest minority population. Hispanics will make up nearly 15 percent of the U.S. population in 2010 and nearly 25 percent by 2050 (see Figure 7). By 2060, non-Hispanic whites are projected to account for less than one-half of all Americans. By 2100, nonwhites and Hispanics are projected to make up 60 percent of the U.S. population, with Hispanics alone accounting for 33 percent.

These projections are also based on differences in fertility and mortality among racial and ethnic origin groups. Fertility rates are generally higher for minority populations. First, recent immigrants tend to maintain the relatively higher fertility of the countries from which they came. Second, minority populations tend to be relatively younger — the product of both immigration (young adults are most likely to migrate) and higher fertility. In 1998, racial and ethnic minorities contributed 40 percent of all U.S. births, even though they represented only 28 percent of the total population (see "Why Is Fertility Higher in the United States Than In Europe?").

Although immigration keeps adding to the number of foreign-born Hispanics and Asians, the number of U.S.-born Hispanics and Asians is projected to increase at an even faster rate. This natural increase will eventually reduce the percentage of these populations that is foreign-born. The percentage of blacks who are foreign-born is projected to increase, however, from 5 percent in 1998 to 12 percent in 2100, fueled by immigration from sub-Saharan Africa. That continent's extremely high fertility rates and young population age structure favor continued rapid population growth in coming decades. Especially given the struggling economies in these regions, increasing numbers of young Africans are likely to emigrate to the United States and other developed countries in search of employment.34

Minority populations also tend to have higher mortality rates, due in part to lower socioeconomic status and more limited access to health care. During the 1990s, however, the gaps between life expectancy for different population groups narrowed slightly. Current Census Bureau projections assume that this trend will continue. Among Americans born in 2100, American Indian women are projected to live the longest (94 years). They are followed, in this order, by Asian American women, Hispanic women, white women, black women, Asian American men, Hispanic men, American Indian men, white men, and black men (with the lowest life expectancy at 87 years).35

All three major sources of diversity — immigration, fertility, and mortality — make the minority population considerably younger than the non-Hispanic white population. The majority, non-Hispanic white population had a median age of 38.1 in 1999, nearly a dozen years older than the Hispanic population (26.5). American Indians (28.3) were almost as young as Hispanics, while the median ages of the African-American population (30.3) and the Asian and Pacific Islander population (32.0) were not much higher.

In 1998, about one-third of the minority population was under age 18, compared with just one fourth of the non-Hispanic white population.

A mirror image of this disparity exists for people at the other end of life. These differences mean that policies or programs may have very different audiences according to the age group they serve. Because older people are more likely to vote, and non-Hispanic whites are more likely to vote than minorities, the voting population is consistently more "majority" than the population as a whole. In contrast, education programs for children deal with an increasingly minority population in many parts of the country.

America's racial and ethnic transformation is not evenly distributed across the country. It is most visible in certain states and communities. The four major minority groups are at least one quarter of the population in 853 (or 27 percent) of the nation's 3,142 counties and county equivalents. But minorities make up less than one tenth of the population in 1,639 counties — 52 percent of all counties. As a result, many non-Hispanic whites have little day-to-day contact with people of different racial and ethnic backgrounds. In 2000, minorities are a numerical majority in Hawaii and New Mexico, but they represent less than 5 percent of the population in Maine, Vermont, New Hampshire, and West Virginia.

The political disputes between urban and rural legislators that characterized the first part of the 20th century grew in part from the difference in the racial and ethnic makeup of their populations. Rural areas were largely populated by the descendants of people who had come to the United States in colonial times or in the late 18th and early 19th centuries. By the beginning of the 20th century the land had been settled, and new immigrants from southern and eastern Europe found economic opportunity in the growing cities, along with blacks migrating from the South. That trend has continued for the newest newcomers. As a result, minorities constitute 47 percent of the nation's central city population, up from 35 percent in 1980. (They constitute 22 percent of the suburban population, up from 13 percent in 1980.)

African-Americans are the most widely dispersed minority population, yet most live in the South and in large metropolitan areas. Hispanics are concentrated in the Southwest, in the Northeast, and in Florida. Each one of these Hispanic populations has a different country-of-origin composition. Asians and Pacific Islanders are most numerous in California and Hawaii, as well as in the New York area. And American Indians and Alaska Natives (especially Indians who live on tribally governed reservations) are most prevalent west of the Mississippi River — primarily in California, Oklahoma, Arizona, Alaska, Washington, and Oregon.

Several places are truly multi-ethnic, with high concentrations of more than one minority group. For example, African-Americans, Hispanics, and Asians each make up at least 10 percent of the populations of several counties around New York City, Los Angeles, and San Francisco, while New Mexico has counties with both Hispanic and American Indian concentrations. Researchers who examined residential data from the 1990 Census suggest that there is greater potential for racial and ethnic co-residence in multi-ethnic metropolitan areas, especially in newer areas in the West and the South. The tendency for Asian and Latino immigrants to locate in the suburbs and the growth of some suburbs as employment centers are creating more heterogeneous metropolitan populations at the same time as they create more heterogeneous suburban communities.36

Evolving Identities

Because it is largely a social construct, the concept of race is relatively fluid. Americans choose the race and ethnic groups with which they identify (and by which they are identified in published statistics). Patterns of identification are often driven by the political context. In many ways, recent patterns reflect the decisions of people who wanted to gain power and political representation by joining with other underrepresented people beneath one large umbrella. No one knows how Americans will choose to define themselves in the future, but there are several indications of possible change. Tests of a new question for the 2000 Census found different patterns for different racial and ethnic groups. Asian and Pacific Islanders were most likely to report a broad rather than a narrow identity, African-Americans the least likely.37

Census tests have also shown that many Hispanics identify their race as Hispanic (or Latino), although Hispanics are not considered a racial group but rather an ethnic group that shares a common culture and a language. The nation's original Hispanic-origin population was largely displaced from its property and influence in the Southwest when the United States took sovereignty from Spain and Mexico. These Hispanic citizens' low status and social discrimination helped justify the fight for official minority status by their descendants, along with recent immigrants from Mexico, Puerto Rico (another population acquired through treaty), and Central America.

Investigations of ambiguous census responses indicate that roughly 90 percent of the Hispanic population would be considered white under current statistical guidelines. Most of the remainder would be considered black. Consequently, socioeconomic changes could make racial definitions more important to Hispanics than national origin, especially as the group's growing numbers make it less of a minority. Hispanic business ownership is growing, for example, and the successes or failures of individual business owners could lead them to identify more with the dominant white population, or more with the protected black minority.38

Intermarriage is another wild card, especially because statistical forms no longer require children of mixed marriages to choose just one of their multiple identities. In 1998, about 5 percent of U.S. married couples included spouses of different races or Hispanics married to non-Hispanics. Moreover, most of these marriages were relatively recent, so a trend to more of them, and thus to more multiracial children, seems underway. Next to American Indians, Hispanics are most likely to marry outside their group.39 As more of the population becomes multiracial, or acknowledges its multiracial origins, race may become less socially and politically relevant, especially as the white majority diminishes. Or, again depending on socioeconomic changes, race could become more salient, and intermarriage could simply create a new "minority" of multiracial people.40

Language and culture are perennial barriers between population groups. But the cohesion of Hispanics, who come from many different countries, is due partly to their shared language and culture and their desire to maintain them. Modern communication tools make this more feasible for all of today's immigrants, but especially for Hispanics, who are much closer, geographically, to their countries of origin than other immigrants. Just as the first colonists wished to re-create the communities they left in northern Europe, many of today's Latino immigrants ask the Anglo (non-Hispanic) culture to accept, and even to incorporate, elements of Latino culture. The role of Spanish in an English-speaking society is not clear, nor is the outcome of today's debates over English-only language policies and bilingual education. As the economy continues to globalize, however, the nation may find that Hispanic and other immigrants who maintain ties with their home languages and cultures are an increasingly valuable resource.

Age Profile

The changes in fertility and mortality of the 20th century have produced an entirely new age profile for the United States, and for industrial countries in general. The growth in life expectancy combined with stable fertility rates has produced a population with a greater share of older people and a declining share of young people. These changes are shifting the U.S. age profile to one where there are roughly equal numbers of people in every age group.

One of the demographer's basic tools is called an age pyramid because the picture of a population by age has traditionally had a relatively small group of older people at the top, a moderate number of middle-aged people in the middle, and the bulk of the population made up of young adults, teenagers, and children at the base. In 1900, a pyramid was a good representation of the U.S. population (see Figure 8). By 1980, the shape was less defined. The declining number of births during the Depression cinched that pyramid around the middle, the postwar baby boom widened it just above the bottom, while the baby bust cut away at the bottom. Now the American age structure no longer resembles a pyramid but rather a pillar: There are more older people and more middle-aged people, relative to young people, than ever before.

Figure 8
U.S. Population by Age and Sex, 1900, 1980, 2000, and Projections for 2020

The change in the median age of the population provides a quick insight into the magnitude of this transformation. In 1999, the median age was 35.5 years — the "oldest" it has ever been, but "younger" than it is projected to be ever again. In 1900, the median age was 22.9; in 2100, according to current Census Bureau projections, it will be 40.3. That's not to say that the path will rise evenly: The median age declined slightly in 1960 and 1970, because of the baby boom between 1946 and 1964, and a similar phenomenon could disturb the growth path again.

This same baby boom will shortly cause a sharp spurt in the proportion of the population ages 65 and over. In 1999, nearly 13 percent of the population was made up of "senior citizens" — roughly 35 million people. By 2025, this proportion is projected to jump to nearly 19 percent, or nearly 63 million Americans. The growth in the older population is projected to continue with the generations that follow, reaching 23 percent in 2100, but the pace of growth will be slower.

The nation's youth population has also increased, but not as rapidly as the older population, so its share of the total population is decreasing. In 1910, people under age 18 accounted for 38 percent of the population; this share was only 26 percent in 1999, and is projected to decline slowly but steadily over the 21st century. At the 21st century's end, the population under age 18 and the population ages 65 and older are expected to be about the same size.

Given the parallel rise in healthy life expectancy, it is likely that demographers a century hence will no longer be using age 65 to delineate the nation's older population. When 65 was set as the age threshold for Social Security benefits in 1935, average life expectancy at birth was not even 62 years. By 2050, if the assumptions underlying the Census Bureau's projections are correct, the proportion of Americans ages 85 and older will be almost as large as the proportion that were age 65 or older in 1930, when the Social Security system was being developed. Also by 2050, the proportion under age 18 will be considerably smaller: 23 percent, down from 35 percent in 1930. So demographic change will contribute to the array of options Americans will assess to define dependency thresholds in youth and old age.

Public policy assumes that dependency begins or ends at certain age thresholds, and the dependency ratio is currently defined as the number of people under age 18 or ages 65 and over per 100 people ages 18 to 64. Such thresholds tend to be set according to history, custom, and the conditions prevailing at the time. The Social Security system as originally proposed, for example, set 70 as the age for collecting benefits, but unemployment was so high during the Great Depression that age 65 was eventually chosen as a way to free up some jobs for younger workers.41 With the large baby-boom population now in mid-life, the dependency ratio is lower than it was in any decade during the 20th century, except the 1930s. By the end of the 1930s, the dependency ratio was 60 — the child dependency ratio was 49 and the old-age dependency ratio was 11. The dependency ratio for the 1990s was 62, but the child dependency ratio had declined to 42 and the old-age dependency ratio had risen to 20 (see Figure 9).

Once the baby-boom generation is on the far side of age 65, the dependency ratio will be higher than in any decade during the 20th century, except between 1900 and 1915, and during the 1960s. In 1900 the ratio was 80, with children accounting for almost all of it (73); in 1960, the ratio was 82 (the child dependency ratio was 65). In the 2030s, 40s, and 50s, the dependency ratio is expected to be just below 80, with the child dependency ratio accounting for little more than one-half.42 The economics and politics of the times will determine whether the public will wish to reassess these age thresholds. Current trends suggest that actual dependency for young people now extends well beyond age 18, while dependency for many older people now comes much later than age 65.
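The dependency ratios discussed above follow directly from the definition: dependents (people under 18 or 65 and older) per 100 people of working age (18 to 64). A minimal sketch of the calculation, using counts chosen purely for illustration, looks like this:

```python
def dependency_ratios(under_18, ages_18_to_64, ages_65_plus):
    """Return (child, old-age, total) dependency ratios:
    dependents per 100 people ages 18 to 64."""
    child = 100 * under_18 / ages_18_to_64
    old_age = 100 * ages_65_plus / ages_18_to_64
    return child, old_age, child + old_age

# Illustrative counts (in millions, hypothetical) scaled so that the
# ratios match the 1990s figures cited in the text: child 42, old-age 20.
child, old_age, total = dependency_ratios(42, 100, 20)
print(child, old_age, total)  # 42.0 20.0 62.0
```

Note that the total ratio is simply the sum of the child and old-age components, which is why the text can decompose each decade's figure into its two parts.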

Both legally and statistically, age 18 is still the age of maturity. However, the intersection of economic and demographic change has transformed life for young adults. Relatively few Americans ages 18 to 24 have taken on the major adult roles of financial independence, marriage, or parenthood. Instead, this life-stage has turned into one with a great many demographic activities (demographic density) undertaken in no particular order (demographic diversity).43 In this context, "density" includes such demographic actions as leaving school; departing the parental home for independent living; moving from one county, state, or region to another; getting married, having children, and becoming employed — all of which occur disproportionately in the young adult years. And "diversity" in this context refers to the increasingly varied sequence in which young people transition to adult work and family roles, including reversing them.

Figure 9
Dependency Ratios for Child and Older Populations, United States, 1900 to 2000, and Projections to 2040

During most of the 20th century, high school graduates divided into two groups, a minority who went on to college, and a majority who went directly to jobs or homemaking. Now a large majority of high school graduates continue their education, in a variety of educational settings and often in combination with other activities. In October 1997, 67 percent of the previous year's high school graduates were enrolled in college, up from 59 percent in 1988, and 49 percent in 1978.44 Fully 90 percent of part-time students ages 16 to 24 in October 1998 were also in the work force, as were 54 percent of full-time students.45

These statistics could be looked at in a variety of ways: more young adults are continuing their education, fewer young adults are going straight to work, more young adults are combining school and work, and many young adults are doing none of the above. Many of the latter may be "stepping out" to travel or otherwise investigate life possibilities before choosing an educational focus or becoming permanently attached to the labor force. Others may find it hard to get traction in a fast-moving economy. Certainly, recent evidence suggests that it is taking longer for young men, at least, to make the transition from high school to stable employment. One set of sociologists found, for example, that it took seven years before 40 percent of male black high school dropouts were working full-time year-round, while 70 percent of black male college graduates were working full-time year-round within a year of graduation.46 The contrast was even more stark when examining career stability over a second year.

Spending more time in school and taking longer to become attached to the work force has caused many young adults to delay leaving their parents' home for independent living.47 In March 1999, fully 60 percent of the civilian population ages 18 to 24 was living with parents or other relatives (66 percent of males, 56 percent of females). About half of the remainder (19 percent) were married or were single parents (13 percent of males, 25 percent of females), while the rest were living alone or with non-relatives.48 At the same time, young adults continue to be the most geographically mobile age group. Between 1997 and 1998, 11.1 percent of the civilian population ages 18 to 24 moved (across county lines), compared with 5.8 percent for all age groups.49

Overall, the ages 18 to 24 span an extended period in which young adults are dependent upon their parents and society. These years have become a post-adolescent life-stage in which young people prepare for adult life by engaging in a variety of activities, in a variety of places, simultaneously or consecutively, and in no particular order.

This is but one aspect of the transformation in childhood and adolescence over the 20th century. In 1900, the infectious diseases of childhood were still major threats to survival, and 162 babies out of 1,000 died before their first birthday. Over the next half century many of these diseases were practically eliminated in the United States, and infant and child mortality continued to improve. The infant mortality rate was down to 29 deaths per 1,000 births by 1950, and to 7 per 1,000 by 1997. The death rate for children ages 1 to 4 fell from nearly 20 deaths per 1,000 children in 1900 to less than 2 deaths per 1,000 in 1950. In 1997, the child death rate was just 0.358 deaths per 1,000 (published as 35.8 per 100,000).50

Schooling became increasingly common after 1900, and child labor and school attendance laws became more stringent. In 1910, only 59 percent of youths ages 5 to 19 attended school (only 45 percent of African-Americans). By 1950, elementary school attendance was virtually universal: 96 percent of all children ages 7 to 13 attended school. High school attendance was more common too: Nearly 80 percent of city youths ages 16 and 17 were in school, but only 70 percent of rural non-farm youths, and 67 percent of farm youths were in school.51 By 1999, schooling was nearly universal up to age 18.

Children of the early 20th century were much more likely to be working. In 1900, 26 percent of boys ages 10 to 15 were "gainfully occupied" (employed), as were 10 percent of girls that age.52 (The age threshold for adulthood was age 16 in 1900.) By 1999, this proportion was near zero — child labor is prohibited by law. Roughly half of teenagers ages 16 to 19 are employed, but most work part-time.53 In short, at the end of the century, schooling is not only the primary, but virtually the sole, activity of people under age 18.

If childhood has become a less varied life-stage, the growing numbers of Americans over age 65 are leading more varied lives. The concept of retirement was largely developed in the 20th century. In the early 1900s, the majority of men ages 65 and older were still in the labor force. Nearly three in four older men were gainfully employed in 1890, as were nearly three in five in 1930.54 By 1999, just one in six older men and one in 11 older women were employed, according to the Bureau of Labor Statistics. Most people retire before age 65, and for most of them, retirement is a process rather than a clearly delineated event — influenced by their income, their health, and their preferences. Some may combine a pension from a longtime employer with income from self-employment or part-time work, while others may cease income-producing work altogether.55

As corporations downsized their work forces toward the end of the 20th century, many offered older employees incentives to retire early. Many of these employees were hired back on a temporary or part-time basis because they had essential knowledge or skills. Other employers, faced with a tight labor market, turned to older, retired people who wanted to supplement their incomes and pursue new interests. A television commercial for McDonald's fast food restaurants exemplified this phenomenon, depicting an elderly man getting ready for his first day of work in a scenario that echoed his first day at work nearly a half century earlier. In recognition of this trend, Social Security legislation in 2000 removed the "earnings test" that penalized Social Security beneficiaries for working between ages 65 and 70.

The number of years Americans tend to live after age 65 has increased from about 12 years in 1900 to 16 years for men and 19 years for women in 1997, and is projected to increase even more in the 21st century. As recently as 1970, published data from the census lumped all people ages 65 and older into a single population, but as the numbers increased and life expectancy lengthened, demographers began to perceive that the older population comprised multiple life-stages. For convenience, these can be differentiated by age: 65 to 74, 75 to 84, and 85 and older.

In the next few decades, the baby boom will make growth particularly rapid among the population ages 65 to 74, often referred to as the "young-old." This group will be the majority of older Americans until about 2030, when the aging of the baby boom will contribute to making people ages 75 and older the majority of older Americans. By 2050, the combined impact of the baby boom and continuing mortality declines at advanced ages will focus population growth on people ages 85 and over.

Because the improvement in mortality rates at older ages is relatively recent, mostly occurring after 1960, Americans still tend to view old age with stereotypes based on the past. Many Americans assume, for example, that increases in life expectancy are simply adding unhealthy years to the end of life and, consequently, that the coming wave of older Americans will create an extraordinary need for more health professionals, hospitals, and long-term care facilities. But research from Duke University suggests that healthy life expectancy is growing just as fast as life expectancy and that each new generation entering old age is less "old" than its predecessor.56

A new standard of energy and vitality in the population (based as much on a population that is better educated about health as it is on medical advances) seems to have pushed old age well into the 70s and beyond.

Given the large growth in the older population, Americans will no doubt revise their assumptions about what "old" means for individuals and institutions. Assumptions that people will spend their entire work life in one occupation, or that career ladders can only be climbed in one direction, may prove ill-suited to a potentially longer work life. Houses that have been designed for families raising children are now being re-thought for people who want to live in their own home until very late in life. Recognizing the parallel with young adulthood, another life-stage characterized by demographic density and role diversity, the concept of youth hostels has been extended into elder hostels for people who wish to combine traveling with learning. Similar transformations will no doubt occur in other aspects of life.

These transformations may take place more rapidly in some parts of the country than in others because the age profile of the population varies from state to state. Children under 18 account for 33 percent of the population in Utah, for example, and 31 percent in Alaska. In contrast, five states have a relatively large share of people ages 65 and over. They make up at least 15 percent of the populations in Florida, Iowa, Pennsylvania, Rhode Island, and West Virginia. Florida is a longtime magnet for retirees, but these other states have "aged" primarily because a disproportionate share of their younger people have moved elsewhere.

Migration determines a state's age profile in several ways. States that have growing economies attract economic migrants, who tend to be young adults. This is a de facto way of attracting children, as the states that attract young adults gain their children too. Consequently, migration-magnets like Georgia, Nevada, and Arizona tend to have relatively young populations, while states like Iowa and North Dakota that have little or no in-migration tend to have older population profiles. Older migrants also make a difference to areas like the Upper Great Lakes or the Ozarks that offer reasonably priced housing and recreational amenities. Most Americans, however, grow old in familiar surroundings, generally within range of their children.

Family Life

Family life also changed considerably over the 20th century, first as rural life gave way to city life, then when mortality declines extended average life expectancy beyond the life expectancy of the traditional, child-rearing family. At the beginning of the century, people raised children throughout most of their adult life. At the century's end, the combination of longer life expectancy and lower fertility rates has reduced the child-rearing portion of adult life to less than half.57 As a result, families stretch across more generations, but generally have fewer people in each generation. In short, today's living family tree is taller than it used to be, but its branches are shorter.

Marriage and Divorce

Marriage patterns changed direction twice during the 20th century. At the beginning of the century, couples often waited to marry until they could afford a home, and about 10 percent of adults never married at all. As wage-earning replaced farming, and young couples could rent a city apartment rather than delay marriage until a house or farm was acquired, people married at slightly younger ages. Between 1900 and 1940, the median age at first marriage declined from 25.9 to 24.3 for men, and from 21.9 to 21.5 for women. Similarly, the proportion of people ages 25 to 34 who had not married dropped from 30 percent to 24 percent between 1900 and 1940, according to decennial census data.

For reasons still debated today, the 1940s ushered in an extended period in which men and women married much earlier, and in greater percentages, than their parents or grandparents. In 1956, the median age at first marriage was at a 20th-century low of 22.5 for men and 20.1 for women. The proportion of people ages 25 to 34 who had not yet married declined to 12 percent by 1970. This early-marrying generation populated the new suburbs, gave birth to the baby boom, and made the "traditional" nuclear family, symbolized by television's Ozzie and Harriet, a national norm.

Starting around 1960, the median age for a first marriage began to rise again, reaching 26.7 years for men and a record high of 25.0 years for women by 1998. Changes in the economy encouraged young adults to invest more time in education, and expanded employment opportunities for women increased the proportion of unmarried young adults. In 1998, 36 percent of people ages 25 to 34 had never been married, nor had 16 percent of people ages 35 to 44.58 In contrast, a record low 4 percent of people ages 65 and older had never married — these generations mostly married during the 1950s when marriage rates were high. Based on subsequent trends, it is likely that this record will stand and that the proportion of people who reach older ages without ever marrying will return to higher levels.59

Women gained better control of their fertility beginning in the early 1960s with oral contraception. This was also a factor in the return to later marriage, as it contributed to new norms regarding sexual relationships outside of marriage. In particular, cohabitation became increasingly common before or between marriages, especially for young adults. Thus, it appears that only the age at first marriage has increased, not the age at first union.60 For some young couples, marriage takes place when they are ready to have children. For others, the connection between marriage and childbearing is less relevant, contributing to a rise in single-parent families. According to one recent estimate, about 40 percent of black cohabiting couples and about 65 percent of white couples end their cohabiting union in marriage.61

How marriages end also has changed considerably. In 1900, most marriages still concluded with the death of a partner, and single-parent families tended to be those in which a parent had died before all the children were grown. Already, though, the divorce rate was triple that of the mid-19th century.62 As divorce rates continued to rise, they erased what would have been a trend toward longer-lived marriages due to the decline in mortality.63 Nevertheless, frequent remarriage has kept the proportion of the population that is currently married high, especially for men. If cohabitation among previously married people were taken into account, "remarriage" rates probably have remained stable.64

In 1998, there were twice as many marriages as divorces: 2.2 million marriages and 1.1 million divorces. In 1920, by comparison, there were seven times as many marriages as divorces: 1.3 million marriages vs. 171,000 divorces. Most of this shift took place in the 1960s and 1970s.

After reaching a high in 1981 — 5.3 divorces per 1,000 persons — the divorce rate declined steadily, to 4.2 per 1,000 persons in 1998, still more than twice the rate of 1920.65 In 1972, nearly 15 percent of American adults ages 18 and older had been divorced (regardless of whether they had remarried). By 1996, 23 percent of adults had been divorced. Meanwhile, the share currently widowed declined from 4.6 percent of those ages 15 and older in 1900 to 2.5 percent in 1998, reflecting the increase in life expectancy.66

These changes in marriage and divorce have affected children both economically and emotionally. Public concern has focused particularly on children living with a single parent, whether the parent has never married or is divorced. Although analysts have pointed out that the likelihood of living with only one parent did not change much over the century, single parenthood was largely involuntary a century ago, whereas today it is often a matter of choice. Thus, researchers are trying to understand not only the effects of marriage trends on children, but also what drives parents' decisions.

The number and percentage of single-parent families increased at a rapid rate in the 1960s and 1970s, and then more slowly through the 1990s. Despite considerable population growth, fewer children lived with both parents in 1999 than did in 1960 — 49 million, down from 56 million — as the share of children living with both natural parents declined from 88 percent to 68 percent. Later marriage and more frequent divorce both played a part in the decline in children living with both parents. Marrying later simply increases the number of years women are exposed to the risk of childbearing outside marriage, as does the high divorce rate.

Divorce also increases the percentage of parents who are caring for their children alone, or with a new partner. Both men and women are spending fewer years as parents — reflecting a decline that began before 1900. Men are slightly more likely than women to parent a new partner's child, although the gender difference was smaller at the end of the 20th century than it was at its beginning.67

Meanwhile, the increase in cohabitation before and between marriages means that many families officially deemed "single-parent" actually contain two adults. In 1990, one in seven children seemingly living in a single-parent household actually resided with a cohabiting couple. In some cases, the couples were parents who had simply chosen not to marry. More often, the children were from a former union of one partner, as an increasing number of cohabiting couples included at least one divorced person.68

The economic impact on children who live with just one parent, still usually the mother, is generally severe. In 1999, about 23 percent of all children lived in single-parent families with their mother and about 4 percent with their father only. Their circumstances vary, especially according to the age and former marital status of the parents. The rate of births to teenaged girls declined over the last four decades of the century, for example, but the share of these births that are outside marriage increased steadily. These children of single parents are at particular risk, as their young mothers are less likely to be high school graduates, less likely to be employed, and much less likely to receive financial support from the father than women who have been divorced. Single-parent families headed by divorced women also pay an economic penalty. During the 1990s, women's family income (adjusted for the size of the family) declined by an estimated 24 percent after divorce, men's by an estimated 6 percent.69

The social impact on children is complex, and attributing causation is somewhat controversial.70 Researchers have found that growing up in a single-parent or stepparent family is associated with a higher probability of dropping out of high school, giving birth as a teenager, and for young men, taking longer to find a job or to pursue education after high school.71 Other researchers argue that genetics, not family experiences, are responsible both for the single parenting and the children's social outcomes. Whatever the cause, the long-term mental health of adults who experienced their parents' divorce as a child appears to suffer, compared with those who grew up with two parents. Though many, if not most, children of single parents grow up unharmed, so many children grow up living with only one biological parent that even if only a minority have problems, the cost to them and to society is considerable.72

Households and Families

Twentieth-century demographic trends have reshaped the nation's households and families. Both are noticeably smaller. In 1900, the average household had 4.8 people, and the average dwelling contained 5.7 people.73 At the end of the century, the average household contains 2.6 people. So few households have live-in help or boarders that the related statistic of persons per dwelling is no longer published.

At the beginning of the century, "household" and "family" were interchangeable terms, and included people who lived together but were not necessarily kin, as well as people who lived in quasi-households such as boarding houses. This assumption that households were cohesive units made sense in a mutually dependent agricultural environment, but was less applicable in urban areas where people might live in the same dwelling but have very independent lives. Starting in 1900, statistics began to distinguish "economic" families from private, or "natural" families, but the definition of "household" and "family" used in government statistics today was not common until mid-century.74 A "household" consists of one or more persons sharing living quarters. A "family" is made up of two or more persons living together who are related by birth, marriage, or adoption.

In 1950, 90 percent of all households were families; now, less than 70 percent of households are families. Family households have changed too. In the 1950s and 1960s, the most common family arrangement was a married couple with one or more children under age 18 — more than one-half of all families fit this description. By 1990, the combination of longer life expectancy and lower fertility made married couples without children under age 18 living in the home the most common family type. In 1999, married couples living with their own children were just 35 percent of all family households. In the meantime, single-parent households (mostly female-headed) grew from 4 percent of family households in 1950 to 13 percent in 1999 (see Figure 10).

The rise in women's labor force participation has also played a role in transforming the nation's families. The "traditional" family of working father, homemaker mother, and one or more children now accounts for only 13 percent of all married-couple households, and only 7 percent of all U.S. households. Families with children in which both parents work make up 31 percent of all married-couple households, and 16 percent of all households.

Figure 10
U.S. Family Households in 1950 and 1999

The effect of the changing family composition on children's well-being has sparked considerable interest and debate among social scientists and commentators. Some suggest there is a tradeoff between parents' time and family income: Children with both parents employed may enjoy a higher living standard, but at the expense of less parental attention. A closer look at the trends reveals a more complicated situation. First, the earnings of mothers who entered the labor force in recent decades may have simply offset the stagnant or declining earnings of many fathers.75 In 1999, one in four working wives earned more than their husbands — over 30 percent of wives earned more in couples (with or without children at home) where both partners worked full-time. Also, the decline in the number of children per couple has contributed to parents (particularly fathers) actually spending more time, on average, with each child than they did at the beginning of the century. New research also shows that employed mothers have managed to reserve nearly as much time for their children as mothers who don't work outside the home by, for example, reducing time spent on housework.76

A vast and growing majority of non-family households consist of persons living alone. This is not surprising considering that people age 65 or older are the largest share of single-person households, and that their numbers are growing rapidly. Until the 1950s, a widowed parent, most often a mother, often would move into an adult child's household. Since 1960, older people's increasing financial independence has been accompanied by increasing residential independence. In 1999, about 30 percent of the population ages 65 and older lived alone. Single-person households are more common than they were in the 1960s in every age group — one in 10 Americans ages 25 to 44, the most common ages for marriage, lives alone.

Longer average life expectancy is producing many multi-generational families, including a significant rise over the last quarter century in the number of households in which grandparents are raising grandchildren. In 1970, about 3 percent of children under age 18 lived in a home maintained by grandparents. By 1997, the share had increased to nearly 6 percent.77 Because these "skipped generation" families are more likely to be poor and to receive public assistance, they were the focus of the only new question added to the 2000 Census.

The public has also become concerned about "sandwich generation" families, in which adults simultaneously provide a home for both their parents and their children. While many Americans are involved in the care of their elderly parents in some way, only about 1 percent of American family households include both dependent children and dependent adult parents.78 The proportion of Americans ages 65 and older who live in a relative's household has been nearly halved since 1970, amounting to less than 7 percent in 1999.

The great majority of Americans ages 65 and older either head their own family household or are married to the household head — 61 percent in 1999. Most of these families are married couples, but an interesting phenomenon is the growing number of young or middle-aged adults who move into the home of an elderly, usually widowed parent. The adult child tends to be divorced or to have a low income, so the parent's greater household wealth and the child's companionship and assistance are a logical and perhaps happy combination.79

Some researchers are concerned that the smaller families and more fragile marital ties of the baby boom and subsequent generations will leave the growing numbers of elderly with fewer kin. Others have pointed out a countervailing trend, in which divorce, remarriage, and family blending have expanded the range of kinship ties, especially for people who were actively involved in raising stepchildren.80 In any case, family life now extends well beyond the traditional household of parents raising children, and adult family relationships across household lines are increasingly important. So are relationships of children with parents who live elsewhere, or with stepparents who are helping to raise them.

Americans now spend an average of only 35 percent of the years between ages 20 and 70 in parenting, but there are considerable differences by gender, by race, and by marital history. Women spend slightly more of their lives parenting, men slightly less, because women tend to retain custody of their children when parents divorce or separate. Because men are more likely to remarry than women, men spend about twice as much time as women living with children who are not related by blood, sometimes along with their own biological children. But men (especially African-American men) are also more likely to live apart from their own children (see Figure 11). These complex family relationships also can complicate the allocation of financial resources among children from different unions who live in different households.

Measures of Well-Being

The American population is participating in a new and diverse economy at the end of the 20th century. Success in that economy requires more education than ever before, and education opens more doors than ever before, especially for women and racial and ethnic minority groups. Many Americans have prospered well beyond their expectations, but many have found themselves shut out of the new technology-driven prosperity.

Trends in Education

Individual Americans can't control the evolution of the economy, but they can control what they bring to it, and their upward economic mobility is due largely to steady improvements in their educational attainment. Throughout American history, each generation has tended to have more education than the generation that preceded it. Indeed, widespread public investment in education is generally credited as being a major factor in creating the large and prosperous middle class that characterized the 20th century.

About 85 percent of Americans born around the beginning of the 20th century completed at least four years of school — enough to be defined as literate. By World War II, this level of education was nearly universal. A high school diploma was the next milestone, and the proportion of Americans reaching it rose from around 25 percent for those born around 1900 to more than 80 percent of those born in the 1960s.81 In 1999, 83 percent of all adults and 88 percent of young adults ages 25 to 34 had a high school diploma or its equivalent.

As the economy continues to reward people for investing in education, more adults are attending and completing college. In 1999, a record 25 percent of all Americans ages 25 and older had at least a bachelor's degree, compared with just 5 percent in 1940. Far more U.S. adults had completed college (44 million) by 1999 than had failed to finish high school (29 million). Still, strong differences persist by race and Hispanic origin. Asians and Pacific Islanders have the most education: In 1997, 42 percent had completed at least four years of college, compared with 25 percent of whites, 13 percent of blacks, and 10 percent of Hispanics.82

Figure 11
Years and Type of Parenting, Adults Ages 20 to 60 by Race and Sex, 1990

There are differences between men and women too. Women born around 1900 had about the same opportunity as men to attend and complete college, but women fell behind in subsequent decades as professional opportunities widened for men but not for women. Male college enrollment surged after World War II because the new GI Bill paid college costs for (primarily male) veterans. Since mid-century, the transformation in skills demanded by the economy and changes in attitudes and law first narrowed the gender gap and then reopened it in the opposite direction: Women now surpass men in attending and completing college.

Work Force Trends

In the last four decades of the 20th century, racial and ethnic minorities and women have succeeded in securing a foothold in many aspects of society. Possibly their greatest success is their growing presence in the work force. Although women and minorities still make up a disproportionate share of the poor, they now occupy significant space on all rungs of the workplace ladder.

Only one woman in five was in the paid labor force in 1900, only one woman in four in 1940. But starting at mid-century, women's labor force participation rates began to climb, reaching 60 percent in 1998. With educational attainment roughly equal to men's, women were well-positioned to take advantage of the new jobs fostered by technological change and the growth in service employment. Strong advocacy efforts contributed to outlawing longstanding barriers to promoting women to senior positions, and changing attitudes allowed many to advance. Moreover, the increasing share of adult life spent outside parenting made it possible for many women to combine work and family roles sequentially, if they did not need or want to do so simultaneously.

Lower levels of education and persistent discrimination, as well as language barriers for recent immigrants, have made it harder for minority population groups to seize new opportunities. Unemployment is an ongoing problem for minority youth, and the steep decline in well-paying jobs for men with little education has been particularly hard on black men. Nevertheless, a vigorous civil rights movement opened doors for many minority group members with the requisite education and skills. In communities with large minority populations, it is now common to see minority men and women filling high-status occupations. Indeed, sufficient numbers of minorities advanced in recent decades for some researchers to declare that race has declined significantly (but has not disappeared) as a factor in determining men's labor market opportunities.83 (The gap in women's earnings and opportunities across racial and ethnic lines has been much smaller.)

Trends in Americans' occupations illustrate the impact of the changing economy on the population. In 1900, most Americans still lived in rural areas, and nearly two-fifths of the work force were farmers, farm managers, farm workers, and the like. A little more than a third worked in blue-collar, non-farm jobs, mostly in manufacturing. But as the United States industrialized and urbanized, the proportion of Americans in farming-related jobs has shriveled, from 21 percent in 1930 to 6 percent in 1960 and just 2 percent in 1999.

In the second half of the 20th century, more Americans moved into white-collar jobs — particularly managerial, professional, and technical jobs. About 29 percent of Americans worked in white-collar jobs in 1930; by 1960, the share had grown to 42 percent. Today, America's post-industrial economy employs nearly three in five workers in managerial, professional, technical, or administrative jobs, compared with just one in four in blue-collar jobs. Service jobs — a wide-ranging occupational category that includes private household workers, firefighters, police officers, and barbers — account for another 14 percent of U.S. workers.

The economic transformation has also condensed Americans' work lives. When education was less important and less widespread, large numbers of children were economically active. In 1900, 26 percent of boys and 10 percent of girls ages 10 to 15 worked for pay; now nearly all children this age are in school. Similarly, retirement was not a viable concept in the agricultural economy of the early 20th century. In 1890, nearly three in four men ages 65 and over were working; so were 58 percent in 1930.84 But Social Security and stable jobs enabled men in particular to fund an increasingly lengthy retirement. In 1999, barely 17 percent of men ages 65 and older were in the paid labor force.85

Wealth and Poverty

American household income reached an all-time high toward the end of the 20th century: the median household income in 1998 was $38,885, up 21 percent over a quarter of a century (adjusted for inflation). Population trends played a role, with record-low household size, a record-low dependency rate, and a record-high proportion of the population in the paid work force. Moreover, increasing shares of the population are now wage and salary workers, and work in urban rather than rural areas — both characteristics are correlated with higher money incomes and with higher levels of education. Thus, demographic trends alone would have pushed incomes upward, as Americans continued to reorient their lives in search of better-paying opportunities.

Median income varies among population groups. In 1998, married-couple families, often with two earners, had an annual median income more than twice that of female-headed families and non-family households, both of which tend to have one or no earners (see Table 3). Family income more than doubled in the last half of the 20th century, reaching an all-time high of $47,469 in 1998.

Household income was highest for Asians, followed by non-Hispanic whites, and it was lowest for blacks. But these population groups tend to have different numbers of earners in their households (Asian Americans tend to have more household members earning income than the average, for example, while African-Americans have fewer than the average). Annual per capita income is more descriptive of racial and ethnic differences in how much individuals earn. Non-Hispanic whites had the highest per capita income in 1998 ($22,952), closely followed by Asians; Hispanics had the lowest annual income ($11,434).

Table 3
Median Household Income Among Selected Groups of Americans, 1998
Household type Number of households (millions) Median annual income
All households 103.9 $38,885
Family households 71.5 47,469
Married-couple family 54.8 54,276
Female-headed family 12.8 24,393
Nonfamily households 32.3 23,441
Female-headed 18.0 18,615
Male-headed 14.4 30,414
Characteristics of household head
White non-Hispanic 78.6 42,439
Black 12.6 25,351
Asian/Pacific Islander 3.3 46,637
Hispanic 9.1 28,330
U.S.-born 92.9 39,677
Foreign-born 11.0 32,963
Region of residence
Northeast 19.8 40,634
Midwest 24.5 40,609
South 37.0 35,797
West 22.5 40,983
Source: U.S. Census Bureau, Current Population Reports P60-206 (1999): Table A.

This order parallels the relative median age of each of these population groups, as well as their relative educational attainment. Thus, some of the difference in per capita income is explained by age and education — people with more experience and education tend to earn more money, but the income gap also reflects other factors. Asians, for example, have more education than non-Hispanic whites, but have lower incomes.

The U.S. economy worked its way through a major structural change in the 1980s and 1990s, as employers adjusted to the new computing and communications technologies, and a trade-dominated globalization of many economic activities. Poverty rates rose in the early 1980s and again in the early 1990s, as business downturns worsened the impact of structural change. But by the end of the prosperous 1990s, real income finally surpassed previous highs, and more people moved out of poverty. The total number of people in poverty fell by almost 5 million between 1993 and 1998. In 1998, 12.7 percent of Americans were poor, compared with the record low of 11.1 percent measured a quarter century earlier.

Poverty continues to characterize some population groups more than others. For example, the poverty rate for non-Hispanic whites in 1998 was one-third the rate for Hispanics and blacks, and noticeably lower than the poverty rate for Asians and Pacific Islanders. These persistent racial differences reflect many demographic differences, as well as discrimination in the workplace, and they are reinforced by differences in ownership of assets and other kinds of wealth, including home ownership and inheritance prospects.86 Another poverty constant is the tendency of female-headed families to be poor — and the number of female-headed families has increased significantly in recent decades. The challenge of being both breadwinner and caretaker can be particularly overwhelming for women who lack education and other economic resources.

Two relatively recent trends are the improving situation of older Americans, especially in recent decades, and the deteriorating situation of children. In the 1960s, older Americans were much more likely than young Americans to live below the poverty threshold. This situation reversed in the mid-1970s. By 1998, 10.5 percent of the much larger population of Americans ages 65 and over was poor (down from 35.2 percent in 1959) — the same percentage as for working-age Americans. In contrast, 18.9 percent of children were poor.

Public policy is largely responsible for shielding many older Americans from poverty, as now nearly all are eligible to receive Social Security benefits, which are indexed for inflation. Moreover, the large and growing share of American men who held wage and salary jobs during the relatively prosperous post-World War II period were able to accumulate both pensions and savings. Consequently, the elderly poor are largely women, who are not as likely to have accumulated many retirement benefits. Also, women tend to outlive their husbands who are likely to have had higher lifetime earnings.

Because women earn less than men, on average, even a widow who worked long enough to qualify for Social Security benefits on her own is likely to receive more income by claiming the widow's portion of her husband's Social Security benefit rather than her own.
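The choice described above reduces to simple arithmetic: a widow claims whichever monthly amount is larger, her own benefit or the survivor's portion based on her husband's record. A minimal sketch with hypothetical dollar amounts and deliberately simplified rules (actual Social Security calculations also depend on claiming age and other adjustments):

```python
def better_claim(own_benefit, survivor_benefit):
    """Return which benefit a widow would claim, under the simplified
    rule that she receives the larger of the two monthly amounts.
    Both amounts are hypothetical; real SSA rules are more involved."""
    if survivor_benefit > own_benefit:
        return "survivor", survivor_benefit
    return "own", own_benefit

# A woman whose own record yields $620/month, widowed by a husband
# whose record yields a $980/month survivor benefit:
choice, amount = better_claim(620, 980)
```

Because men's lifetime earnings have typically been higher, the survivor branch is the common outcome for this generation of widows, which is the pattern the text describes.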

Public policy has not addressed the needs of children in the same way, although marriage and divorce trends have exposed increasing numbers of children to poverty. Moreover, the increased demand for a labor force with more education and training means that even two-parent families are likely to be economically stressed. Young couples are often paying for their education with entry-level salaries at the same time they are trying to establish a home and family. A parent who stays home to care for children loses potential income. For other young families, the high costs of child care make it uneconomical for both parents to work.

These problems are particularly acute for the "working poor," people who have jobs but earn so little that they and their families are officially poor. In 1998, 2 million adults ages 22 to 44 worked full-time all year but still were poor, while another 4 million poor worked at least part-time during the year. These Americans — 57 percent of all poor persons ages 22 to 44 — work in jobs with low pay and few benefits.87 Some of them are going to school, some of them are caring for children, some of them are ill or disabled, and some simply lack the skills to get a better job. Many will eventually acquire training and experience that will enable them to escape poverty, but those who have children will have raised them in poverty.

The working poor represent one aspect of the increasing income diversity that seems to characterize the post-industrial economy. In 1998, the share of income held by the bottom two-fifths of households was less than 13 percent, compared with 15 percent in 1967. The share held by the second and third fifths also declined. In contrast, the share held by the top fifth rose from 44 to 49 percent (see Figure 12).
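The quintile shares behind these figures come from ranking households by income, splitting them into five equal-sized groups, and dividing each group's income by the aggregate. A minimal sketch of that calculation (the function and the sample incomes are illustrative, not the Census Bureau's actual procedure):

```python
def quintile_shares(incomes):
    """Share of aggregate income held by each fifth of households,
    ordered from the bottom fifth to the top fifth."""
    ranked = sorted(incomes)
    n = len(ranked)
    total = sum(ranked)
    shares = []
    for i in range(5):
        # Households in the i-th fifth of the ranked distribution
        group = ranked[i * n // 5:(i + 1) * n // 5]
        shares.append(sum(group) / total)
    return shares

# Ten hypothetical household incomes, in thousands of dollars:
shares = quintile_shares([12, 18, 25, 31, 38, 45, 54, 67, 90, 160])
# shares[0] is the bottom fifth's slice of aggregate income;
# shares[4] is the top fifth's.
```

Rising inequality shows up as the top element of this list growing at the expense of the lower ones, which is exactly the 1967-to-1998 shift Figure 12 depicts.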

Americans move with remarkable frequency between these income groups, especially as they make such common demographic transitions as marriage and divorce, or school, work, and retirement.

Future Prospects

The U.S. population will continue to grow, but its future profile will depend on which of the three sources of growth contributes the most: fertility, mortality, or migration. Fertility rates would have to increase significantly to maintain the traditional age profile of a pyramid, with young people in the majority. If the pace of mortality improvement is maintained, population growth will be increasingly concentrated at the older ages. And if migration stays at the same level or increases, the racial and cultural mix of the population will change yet again, given the likely sources of new immigrants.
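The interplay of these three components can be sketched as a toy projection. The rates and counts below are hypothetical, and real projections (such as the Census Bureau's cohort-component method) work with age- and sex-specific rates rather than a single aggregate:

```python
def project_population(pop, birth_rate, death_rate, net_migrants, years):
    """Toy annual projection: each year, add natural increase
    (births minus deaths, from per-person annual rates) plus
    net migration. All inputs here are illustrative."""
    for _ in range(years):
        pop += pop * (birth_rate - death_rate) + net_migrants
    return pop

# A hypothetical population of 275 million, with births at 1.4 percent
# per year, deaths at 0.9 percent, and 1 million net immigrants annually:
projected = project_population(275_000_000, 0.014, 0.009, 1_000_000, 10)
```

Even this crude sketch shows the point made above: when natural increase (birth_rate minus death_rate) is small, the net_migrants term dominates future growth.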

Figure 12
Percent of Aggregate Household Income by Household Income Level, 1967 and 1998

Currently, fertility rates are relatively stable at about two children per woman. The fertility increase during the post-World War II baby boom is sufficient evidence, however, that couples may decide to have larger families in the future. Perhaps couples will decide to have more children once they perceive that lives have lengthened enough for parents to focus sequentially, if not simultaneously, on children and career. Perhaps the family-centered values of immigrants from new cultures will cause fertility to converge, not to the lower non-Hispanic white level, but to a higher level. In any case, public policy toward such basic concerns of potential parents as housing, health, and childcare costs is likely to influence decisions to have additional children, especially since investments in careers and homes, as well as children, are concentrated in the early phases of the adult income cycle.

In recent decades, improvements in mortality have contributed to U.S. population growth by lengthening Americans' lives. Scientists generally agree that mortality will continue to improve over the next half century, increasing the numbers of older Americans well beyond the level of the current projections by the Social Security Administration.88 Beyond that, they disagree as to whether natural limits to life expectancy exist, whether devastating new diseases will appear, or whether significant public investments in health research will continue to vanquish old diseases.

Public policy is most important in determining the nature and level of immigration. It is difficult to imagine that immigration will fall much below current levels, if only because population is growing so rapidly in less developed countries. More than 1 billion of the world's 6 billion people are between ages 15 and 24, and nearly 2 billion more are under age 15 — and have yet to have their own children. More than 95 percent of the world's youth currently live in less developed countries where fertility rates, though lower than they were a half century ago, are still much higher than in the United States. As successively larger generations of young people seek their economic future, it is likely that they will continue to look to wealthier countries for work. In fact, public policy might favor admitting more young immigrants, rather than lengthening the work life to accommodate the increasing number of years lived after the official retirement age.

Households and families may continue to change shape as mortality improvements increase the number of years adults live after raising their children. Older people who are not currently married are most likely to live alone, but marital ties have been looser for today's younger generations and many of them have lived in a variety of settings. They may well return to a variety of living arrangements as they age, depending on their preferences for privacy vs. companionship, and on their economic resources.

Today's midlife adults, for example, had fewer children than current older generations, and have been more likely to divorce. Remarriage after divorce or death of a partner is much less common for older women than older men, if only because women are more numerous at the older ages. Divorce also diminishes family resources for many men in later life, as sociologists have found that relationships between adult children and parents who did not care for them as children are weak.89 Demographers who study how people share time and money across generation and household boundaries have found that such transfers tend to go from the older generations to the younger. So with more people entering older age not married and dependent on their own resources, it is conceivable that new forms of household organization might develop in response to new demographic realities.

The nation's racial and ethnic composition is also changing in a new way. As minority population groups grow relative to the majority group, more Americans are marrying across racial lines. In 1998, about 5 percent of married couples included spouses of different races or origins. Between 1970 and 1998, the number of interracial couples rose from 300,000 to 1.4 million, and the number of Hispanics married to non-Hispanics rose from 600,000 to 1.7 million.90 Not surprisingly, the numbers of multiracial children have been growing, leading the government to permit people multiple choices for identifying their race in the 2000 Census. The large numbers of Americans who are both white and American Indian responded to the previous requirement to choose one race in unpredictable ways. Some people identified themselves as being one race on some forms and as another race on other forms, or they switched from one race to another over their lives. Large numbers of young adult American Indians suddenly appeared in the 1990 Census, particularly in large cities, after the movie "Dances With Wolves" presented the Indian perspective in a way that made people with Indian ancestors proud to acknowledge them. The large numbers of Americans who are both white and black responded to the forced choice by selecting black, reflecting Americans' longtime practice of categorizing them that way.91 Large numbers of Hispanics have tended to choose "other" when forced to make a racial choice.

The new unforced choices in racial and ethnic categories will yield 63 different combinations of race and Hispanic origin. How people respond to this choice, now and in the future, may create a new population group: multiracial people. Or, it may blur traditional group boundaries and change the way people think and act about race. Since the U.S. racial picture is so varied from place to place, it is likely that there will be no single answer. Certainly, many parts of the United States have yet to experience extensive racial diversity, and communities that suddenly find a new population group within their midst still tend to react with consternation. Such recent controversies as flying Confederate flags over statehouses test whether the population is willing to view its heritage and interests in terms of "our" (that is, all the population) or merely "my" group's heritage or interests.

The geographic distribution of the population may take new directions under the impact of new information and communication technologies, just as it did in the 20th century in response to such inventions as automobiles and air conditioning. There are already indications that some people are using new technologies to act on Americans' long-expressed preference for small cities and towns. Entrepreneurs and self-employed people are particularly free to move where they wish if their business is portable or can be conducted at a distance. For the larger number of employed workers, technology may allow them to work from home or car, thus allowing more dispersal of the large white-collar work force. Similarly, electronic commerce increasingly allows people to shop from their homes at a convenient time for goods they don't need to see or handle.

These developments offer new ways to organize metropolises, possibly relieving problems of pollution or congestion. The increasing proportion of people who are in older ages will also call for new urban solutions, if only because current urban layouts require most people to drive — effectively imprisoning people who are no longer confident about their driving ability. And as increasing numbers of households will have completed raising their children, a greater variety of housing options may emerge.

As the working-age population shifts from a pyramid dominated by young adults to a pillar with roughly equal numbers of people in all ages, worklife patterns will continue to diversify. Certainly the old model of climbing a career ladder in just one direction — up — makes little sense in such a population, if only because too many would be jostling for space on the upper rungs. Instead, the longer work life is more likely to be characterized by a variety of transitions, including midcareer changes and midlife education.

Demographers have already noted the increasingly messy transitions that now characterize the entry into work life.92 For one thing, the loss through trade or technology of well-paying manufacturing jobs has shifted more young adults into service and other occupations that either pay less or demand more experience and job-related training.93 In addition, the increasing numbers of people working while enrolled in higher education are not necessarily working in jobs on the lower rungs of their eventual career ladder. It is plausible that similarly messy transitions could characterize the end of work life, as careers come to a natural end while people still have the energy and inclination to work, along with the need to fund a longer retirement.

One likely development is an increasing number of women with an intensive attachment to the work force in the second half of "normal" work life. Over the coming decades, there will be roughly equal numbers of prime working-age women (ages 25 to 64) on each side of age 45. Women in the older half of the working ages are primarily women whose children are grown. Large numbers of midlife women may see both an opportunity to participate fully in the economy and a need to make up for lost time in building resources to support them in old age. With an estimated one marriage in two ending in divorce, women may feel it prudent to build their own nest egg and not count on a husband's support in retirement. More women in the labor force now are contributing to pension plans, and recent estimates from the Social Security Administration show that within a few years, a majority of married women will do better claiming benefits on their own account than as part of a couple.94

Overall, the U.S. population will experience the effects of the globalizing world economy, and contribute to them. Recent immigrants have new opportunities to retain and even profit from ties at home. The Internet and other new communication tools, for example, allow for frequent "conversations" as well as the ability to hire people back home to do computer piece-work or even to "telecommute." The growth of the computer software industry in India via American entrepreneurs from that country is a prime example. In that sense, the broad diversity of the population offers the United States a favored position in such an economic world. As French demographer Jean-Claude Chesnais writes of contemporary American society, "For the first time in history, a single country has a population made up of all the world's 'races' ('white,' 'black,' 'yellow,' and 'red'), of all its religions (Christian, Buddhist, Muslim, animists, etc.), and of all its languages."95 Such diversity has obvious economic benefits, but it also offers a new test for the nation's cohesion and commitment to democracy.

Suggested Resources

  • Glaab, Charles N. A History of Urban America, 2d ed. New York: Macmillan, 1976.
  • Farley, Reynolds, ed. State of the Union: America in the 1990s, 2 vols. New York: Russell Sage Foundation, 1995.
  • Hughes, James W., and Joseph J. Seneca, eds. America's Demographic Tapestry: Baseline for the New Millennium. New Brunswick, NJ: Rutgers University Press, 1999.
  • Jones, Maldwyn A. American Immigration, 2d ed. Chicago: University of Chicago Press, 1992.
  • Taeuber, Conrad, and Irene B. Taeuber. The Changing Population of the United States. New York: John Wiley & Sons, Inc., 1958.
  • Wells, Robert V. Revolutions in Americans' Lives: A Demographic Perspective on the History of Americans, Their Families, and Their Society. Westport, CT: Greenwood Press, 1982.

References

  1. Laurent Belsie, "In the America of 2100, Less Elbow Room," Christian Science Monitor, Jan. 21, 2000, sec. USA: p. 8. Australia has a higher annual population growth rate (1.3 percent), while Canada and New Zealand have slightly lower growth rates (0.8 percent). But since the U.S. population is so much larger, it is indeed alone among more developed countries in terms of numbers of persons added to the population each year.
  2. U.S. Census Bureau, "Projections of the Total Resident Population by 5-Year Age Groups, and Sex with Special Age Categories: Middle Series, 1999 to 2100," (NPT3). Accessed online at www.census.gov/population/www/projections/natsum.html on Jan. 20, 2000.
  3. Carl Haub and Diana Cornelius, 1999 World Population Data Sheet (Washington, DC: Population Reference Bureau, 1999).
  4. See, for example, Ben Wattenberg, "European Union? European Ostrich!" Jewish World Review, April 11, 2000. Accessed online at www.jewishworldreview.com/cols/wattenberg1.asp on April 18, 2000; Ben J. Wattenberg, The Birth Dearth (New York: Pharos Books, 1987); Nicholas Eberstadt, "World Population Implosion?" The Public Interest (Fall 1997): 3–22; and Peter G. Peterson, Gray Dawn: How the Coming Age Wave Will Transform America and the World (New York: Times Books, 1999). For a balanced discussion in an international context, see "Maintaining Prosperity in an Ageing Society," OECD Policy Brief, No. 51-998, Organization for Economic Cooperation and Development, Paris.
  5. The TFR is a hypothetical measure that assumes that the age-specific birth rates of a given year will apply throughout a woman's lifetime. It does not necessarily correspond to the average number of children a given group of women actually had.
  6. Conrad Taeuber and Irene B. Taeuber, The Changing Population of the United States (New York: John Wiley & Sons, 1958): 267.
  7. National Center for Health Statistics, "Fertility, Family Planning, and Women's Health: New Data From the 1995 National Survey of Family Growth," Vital and Health Statistics Series 23, no. 19 (Hyattsville, MD: National Center for Health Statistics, May 1997): 34, Table 1; and unpublished tables.
  8. See, for example, Suzanne M. Bianchi and Daphne Spain, "Women, Work, and Family in America," Population Bulletin 51, no. 3 (Washington, DC: Population Reference Bureau, December 1996): 11–12.
  9. Frederick W. Hollmann, Tammany J. Mulder, and Jeffrey E. Kallan, "Methodology and Assumptions for the Population Projections of the United States: 1999 to 2100," Population Division Working Paper No. 38, U.S. Census Bureau (Jan. 13, 2000): 10; and National Center for Health Statistics, "Births: Final Data for 1998," by Stephanie J. Ventura, Joyce A. Martin, Sally C. Curtin, T.J. Mathews, and Melissa Park, National Vital Statistics Reports 48, no. 3 (Hyattsville, MD: National Center for Health Statistics, March 28, 2000): Table 4.
  10. Kevin M. White and Samuel H. Preston, "How Many Americans Are Alive Because of Twentieth Century Improvements in Mortality?" Population and Development Review 22, no. 3 (September 1996): 415–73; and Jeffrey S. Passel and Barry Edmonston, "Immigration and Race: Recent Trends in Immigration to the United States," in Immigration and Ethnicity: The Integration of America's Newest Arrivals, ed. Barry Edmonston and Jeffrey S. Passel (Washington, DC: The Urban Institute Press, 1994): 31–54.
  11. See, for example, James W. Vaupel, "The Average French Baby May Live 95 or 100 Years," in Longevity: To the Limits and Beyond, ed. Jean-Marie Robine, James W. Vaupel, Bernard Jeune, and Michel Allard (New York: Springer-Verlag, 1997).
  12. U.S. Census Bureau, Historical Statistics of the United States, Part 1 (Washington, DC: GPO, 1975): Series B 149-166; U.S. Census Bureau, 1999 Statistical Abstract of the United States, 119th ed. (Washington, DC: GPO, 1999): Table 1420; National Center for Health Statistics, "Births and Deaths: Preliminary Data for 1998," by Joyce A. Martin et al., National Vital Statistics Reports 47, no. 25 (Oct. 5, 1999): Table 11; and "Achievements in Public Health, 1900–1999: Decline in Deaths from Heart Disease and Stroke — United States, 1900–1999," Morbidity and Mortality Weekly Report 48, no. 30 (Aug. 6, 1999): 649–56.
  13. Kenneth G. Manton, Larry Corder, and Eric Stallard, "Chronic Disability Trends in Elderly United States Populations: 1982–94," Proceedings of the National Academy of Sciences 94 (1997): 2593–98.
  14. Passel and Edmonston, "Immigration and Race." Passel and Edmonston estimate that immigration contributed 28 percent of population growth from 1900 to 1910.
  15. Immigration and Naturalization Service, 1997 Statistical Yearbook (Washington, DC: GPO, 1999): Tables 1 and 4; and Immigration and Naturalization Service, "Legal Immigration, Fiscal Year 1998," Annual Report 2 (May 1999): Table 1.
  16. U.S. Census Bureau. Tables accessed at www.census.gov/population/estimates/nation/nativity/fbta001.txt, on Jan. 20, 2000.
  17. Philip Martin and Elizabeth Midgley, "Immigration to the United States," Population Bulletin 54, no. 2 (Washington, DC: Population Reference Bureau, 1999): 17–22.
  18. Taeuber and Taeuber, Changing Population of the United States: 112–26.
  19. Peter A. Morrison, ed., A Taste of the Country: A Collection of Calvin Beale's Writings (University Park, PA: The Pennsylvania State University Press, 1990): 12-18; and Kenneth M. Johnson, "The Rural Rebound," PRB Reports on America 1, no. 3 (Washington, DC: Population Reference Bureau, 1999).
  20. Charles N. Glaab, A History of Urban America, 2d ed. (New York: Macmillan, 1976): 21–23.
  21. Taeuber and Taeuber, Changing Population of the United States: 114–15.
  22. Joel Garreau, Edge City: Life on the New Frontier (New York: Doubleday, 1991).
  23. Taeuber and Taeuber, Changing Population of the United States: 127; and William H. Frey, "The New Geography of Population Shifts: Trends Toward Balkanization," in State of the Union, vol. 1, ed. Reynolds Farley (New York: Russell Sage Foundation, 1995): 276.
  24. U.S. Census Bureau, Statistical Abstract of the United States, 1999: Table 42.
  25. U.S. Census Bureau. Accessed online at www.census.gov/population/estimates/metrocity/ma98-01.txt and www.census.gov/population/estimates/metrocity/ma98-05.txt, on Dec. 17, 1999.
  26. William H. Frey, "The New Geography of Population Shifts."
  27. William H. Frey, "Interstate Migration and Immigration for Whites and Minorities, 1985–90: The Emergence of Multi-Ethnic States," Population Studies Center, Research Reports 93-297 (Ann Arbor, MI: University of Michigan, 1993); and R.A. Wright, M. Ellis, and M. Reibel, "The Linkage Between Immigration and Internal Migration in Large Metropolitan Areas in the United States," Economic Geography 73, no. 2 (April 1997): 234–54.
  28. Taeuber and Taeuber, Changing Population of the United States: 71. Up to the closing of the legal slave trade in 1808, blacks formed a larger part of the population than they have at any time since. Peter H. Wood, Black Majority (New York: Alfred A. Knopf, 1974): xiii.
  29. U.S. Census Bureau, Historical Statistics, Series A: 91–104.
  30. About one-third of white foreign-born residents are from the Middle East, which is included with Asia in official statistics.
  31. U.S. Census Bureau. Accessed online at www.census.gov/population/estimates/nation/nativity/fbtab003.txt, on Jan. 20, 2000.
  32. Patrick Heuveline, "The Global and Regional Impact of Mortality and Fertility Transitions," Population and Development Review 25, no. 4 (December 1999): 681–702. Fertility declined after mortality, but with a lag of about 20 years.
  33. Census Bureau projections have assumed that all children will have the same racial identity as their mothers. For alternative projections that account for interracial marriage and additional racial categories for children, see James P. Smith and Barry Edmonston, eds., The New Americans (Washington, DC: National Academy Press, 1997): 76-134.
  34. U.S. Census Bureau. Data on international migration are difficult to collect, partly because much migration is clandestine, partly because countries' recordkeeping practices differ. However, unpublished data from the United Nations show a considerable increase in emigration from sub-Saharan Africa in recent years, and Western European countries report considerable inflows. Current United Nations projections estimate a considerable increase in the numbers and proportion of young people in Africa: The share of the population ages 10 to 19 will increase from 16 percent to an estimated 22 percent between 2000 and 2020. Elsewhere in the world, the share of the population this age will either remain stable or decline, especially in more developed countries. See, for example, the chapter on population in The World's Women (New York: United Nations, forthcoming 2000).
  35. U.S. Census Bureau, "Methodology and Assumptions for the Population Projections of the United States: 1999 to 2100," Population Division Working Paper No. 38, issued Jan. 13, 2000: 12 and Table C.
  36. William H. Frey and Reynolds Farley, "Latino, Asian, and Black Segregation in Multi-Ethnic Metro Areas: Findings from the 1990 Census," Research Report No. 93-278 (April 1993), University of Michigan Population Studies Center, Ann Arbor, Michigan; William H. Frey and Douglas Geverdt, "Changing Suburban Demographics: Beyond the 'Black-White, City-Suburb' Typology," Research Report No. 98-422 (July 1998), University of Michigan Population Studies Center, Ann Arbor, Michigan.
  37. The inclusion of people from Asian countries in one racial group is largely political in origin, as Asians have tended to identify more with their fellow countrymen than with Asians in general. This may account for the greater fluidity in their self-reporting. For the test, see U.S. Census Bureau, "Results of the 1996 Race and Ethnic Targeted Test," Population Division Working Paper No. 18, May 1997.
  38. U.S. Census Bureau, Survey of Minority-Owned Business Enterprises: Hispanics (June 1996). Accessed online at www.census.gov/prod/2/bus/mob/mb92-2.pdf, on May 5, 2000.
  39. Kelvin M. Pollard and William P. O'Hare, "America's Racial and Ethnic Minorities," Population Bulletin 54, no. 3 (Washington, DC: Population Reference Bureau, 1999): 12.
  40. One estimate of the multiracial population sets its numbers as high as 8 million to 18 million people in 2000, considerably more than the net population undercount that has concerned both politicians and professionals. Joshua R. Goldstein and Ann J. Morning, "The Multiple Race Population of the United States: Issues and Estimates," in Proceedings of the National Academy of Sciences 97, no. 11 (2000): 6230–35.
  41. In his classic work, Relief and Social Security (Washington, DC: The Brookings Institution, 1946), Lewis Meriam rejects this account, and asserts that 65 was chosen because few men were employed by that age. However, many of the original bills set 70 as the minimum age for collecting benefits, perhaps because a plurality of the systems set up by the states (24 of them) used 70. See Abraham Epstein, Insecurity, a Challenge to Americans (New York: H. Smith and R. Haas, 1936): Chap. 26.
  42. Judith Treas, "Older Americans in the 1990s and Beyond," Population Bulletin, 50, no. 2 (Washington, DC: Population Reference Bureau, 1995): 6.
  43. Ronald R. Rindfuss, "The Young Adult Years: Diversity, Structural Change, and Fertility," Demography 28, no. 4 (November 1991): 493–512.
  44. National Center for Education Statistics, Digest of Education Statistics, 1998: Table 183. Accessed online at www.nces.ed.gov on May 5, 2000.
  45. U.S. Bureau of Labor Statistics, unpublished tables acquired April 28, 2000.
  46. Valerie Kincade Oppenheimer, Matthijs Kalmijn, and Nelson Lim, "Men's Career Development and Marriage Timing During a Period of Rising Inequality," Demography 34 (1997): 311–30.
  47. See, for example, Frances Goldscheider, Arland Thornton, and Linda Young-DeMarco, "A Portrait of the Nest-Leaving Process in Early Adulthood," Demography 30 (1993): 683–99; and Frances Goldscheider and Calvin Goldscheider, "Leaving and Returning Home in 20th Century America," Population Bulletin 48, no. 4 (Washington, DC: Population Reference Bureau, 1993).
  48. U.S. Census Bureau, Special tabulations of the March 1999 Current Population Survey.
  49. U.S. Census Bureau, Press Release, Jan. 14, 2000. Accessed online at www.census.gov on Jan. 14, 2000.
  50. Forest E. Linder and Robert D. Grove, Vital Statistics Rates in the United States, 1900–1940 (Washington, DC: GPO, 1943): Table 1; Robert D. Grove and Alice M. Hetzel, Vital Statistics Rates in the United States 1940–1960 (Washington, DC: GPO, 1968): Table 55; and National Center for Health Statistics, "Deaths: Final Data for 1997," by Donna L. Hoyert, Kenneth D. Kochanek, and Sherry L. Murphy, National Vital Statistics Reports 47, no. 19 (June 30, 1999): Table 9.
  51. Taeuber and Taeuber, Changing Population of the United States: 192.
  52. Ibid.: 213.
  53. U.S. Bureau of Labor Statistics, unpublished tables acquired April 28, 2000.
  54. Taeuber and Taeuber, Changing Population of the United States: 213.
  55. John R. Besl and Balkrishna D. Kale, "Older Workers in the 21st Century: Active and Educated, A Case Study," Monthly Labor Review 119, no. 6 (June 1996): 18–28; and Treas, "Older Americans": 22.
  56. Kenneth G. Manton, Larry Corder, and Eric Stallard, "Chronic Disability Trends in Elderly United States Populations: 1982–1994," Proceedings of the National Academy of Sciences 94 (March 1997): 2593–98.
  57. See, for example, Rosalind Berkowitz King, "Time Spent in Parenthood Status Among Adults in the United States," Demography 36 (August 1999): 377–85.
  58. For a historical appreciation, see Robert V. Wells, Revolutions in Americans' Lives (Westport, CT: Greenwood, 1982): 231; and Hugh Carter and Paul C. Glick, Marriage and Divorce: A Social and Economic Study, revised ed. (Cambridge, MA: Harvard University Press, 1976): 61–64.
  59. Goldstein and Kenney estimate that some 90 percent of these and later cohorts will eventually marry, thus maintaining the historical pattern that prevailed for all but the middle of the 20th century. Joshua R. Goldstein and Catherine T. Kenney, "Marriage Delayed or Marriage Foregone: New Cohort Forecasts of First Marriage for U.S. Women." (Paper presented at the annual meeting of the Population Association of America, Los Angeles, March 23, 2000).
  60. See, for example, Larry Bumpass, Andrew Cherlin, and James Sweet, "The Role of Cohabitation in Declining Rates of Marriage," Journal of Marriage and the Family 53: 913–25.
  61. Wendy D. Manning and Pamela Smock, "Why Marry? Race and the Transition to Marriage Among Cohabitors," Demography 32 (November 1995): 509–20.
  62. Wells, Revolutions in Americans' Lives: 95.
  63. Ibid.: 232; and Taeuber and Taeuber, Changing Population of the United States: 156.
  64. Bianchi and Spain, "Women, Work, and Family": 12.
  65. National Center for Health Statistics, Vital Statistics of the United States 1987, vol. III — Marriage and Divorce (Washington, DC: GPO, 1991): Table 21; and National Center for Health Statistics, "Births, Marriages, Divorces, and Deaths: Provisional Data for 1998," National Vital Statistics Reports 47, no. 21 (July 1999): Table 1.
  66. National Opinion Research Center. Data accessed online March 8, 2000; and U.S. Census Bureau, Statistical Abstract, 1999: Table 1418.
  67. Dennis P. Hogan and Frances Goldscheider, "Men's Flight from Children in the U.S.? A Historical Perspective." (Paper presented at the annual meeting of the Population Association of America, Los Angeles, March 25, 2000).
  68. Bianchi and Spain, "Women, Work, and Family": 11.
  69. Ibid.: 12.
  70. For an expert review of competing research-based findings, see Andrew J. Cherlin, "Going to Extremes: Family Structure, Children's Well-being, and Social Science," Demography 36 (November 1999): 421–428.
  71. See Sara McLanahan and Gary Sandefur, Growing Up With a Single Parent: What Hurts, What Helps (Cambridge, MA: Harvard University Press, 1994).
  72. Cherlin, "Going to Extremes": 427.
  73. See Wells, Revolutions in Americans' Lives: 152, which compares statistics over 200 years. In 1764, the average household size in Massachusetts was 6.0, with an average of 7.2 persons per dwelling.
  74. Taeuber and Taeuber, Changing Population of the United States: 170.
  75. See Maria Cancian and Deborah Reed, "The Impact of Wives' Earnings on Income Inequality: Issues and Estimates," Demography 36 (May 1999): 173–184.
  76. Suzanne Bianchi, "Maternal Employment and Time with Children: Dramatic Change or Surprising Continuity?" (Presidential address, annual meeting of the Population Association of America, Los Angeles, March 24, 2000).
  77. U.S. Census Bureau, "Co-resident Grandparents and Grandchildren," by Ken Bryson and Lynne M. Casper, Current Population Reports P23-198 (May 1999): 1.
  78. In 1997, about 800,000 U.S. families (about 1 percent of all families) included dependent children and at least one grandparent and were maintained by a parent of the child(ren). See U.S. Census Bureau, "Co-resident Grandparents": Figure 3.
  79. Diane J. Macunovich, Richard A. Easterlin, Eileen M. Crimmins, and Christine Macdonald, "Echoes of the Baby Boom and Bust: Recent and Prospective Changes in Living Alone Among Elderly Widows in the United States," Demography 32 (February 1995): 17–28.
  80. See, for example, Kenneth W. Wachter, "Kinship Resources for the Elderly," Philosophical Transactions of the Royal Society of London 352 (1997): 1811–1817.
  81. Robert D. Mare, "Changes in Educational Attainment and School Enrollment," in State of the Union: America in the 1990s, Vol. 1, ed. Reynolds Farley (New York: Russell Sage Foundation, 1995): 162.
  82. U.S. Census Bureau, Statistical Abstract of the United States: 1999: Table 263.
  83. See, for example, William J. Wilson, The Declining Significance of Race, 2nd ed. (Chicago: University of Chicago Press, 1980); and Arthur Sakamoto, Huei-Hsia Wu, and Jessie M. Tzeng, "The Declining Significance of Race Among American Men During the Latter Half of the Twentieth Century," Demography 37 (February 2000): 41–52.
  84. Taeuber and Taeuber, Changing Population of the United States: 213.
  85. U.S. Bureau of Labor Statistics, unpublished tables acquired April 28, 2000.
  86. William P. O'Hare, "A New Look at Poverty in America," Population Bulletin 51, no. 2 (Washington, DC: Population Reference Bureau, 1996): 12.
  87. Tabulations for the author by PRB of the March 1999 Current Population Survey.
  88. Michael A. Stoto and Jane S. Durch, "Forecasting Survival, Health, and Disability: Report on a Workshop," Population and Development Review 19 (September 1993): 557–581.
  89. See, for example, Frances Goldscheider, "The Aging of the Gender Revolution: What Do We Know and What Do We Need to Know?" Research on Aging 12 (1990): 531–545.
  90. Pollard and O'Hare, "America's Racial and Ethnic Minorities": 12.
  91. For a good history of the legal categorization of children of white and black couples, see Ira Berlin, Slaves Without Masters (New York: Pantheon, 1974): 1614; and Martha Hodes, White Women, Black Men (New Haven: Yale University Press, 1997).
  92. See Rindfuss, The Young Adult Years.
  93. See, for example, Kurt Schrammel, "Comparing the Labor Market Success of Young Adults From Two Generations," Monthly Labor Review (February 1998): 3–9.
  94. Barbara A. Butrica, Steven H. Sandell, and Howard M. Iams, "Using Couple Data to Project the Distributional Effects of Social Security Policy Changes," Social Security Bulletin 3 (1999): 20–27.
  95. Jean-Claude Chesnais, "L'immigration et le peuplement des Etats-Unis," Population 54, no. 4–5 (1999): 632 (author's translation).

Martha Farnsworth Riche served as director of the U.S. Census Bureau between 1994 and 1998. She was a founding editor of American Demographics, the nation's first magazine devoted to interpreting demographic and economic statistics for corporate and public executives. She has also served as director of policy studies for the Population Reference Bureau. Riche lectures, writes, and consults on demographic changes and their effects on policies, programs, and projects. Recent publications (all published in 1999) include "From Pyramids to Pillars: The New Demographic Reality"; "Cultural and Political Dimensions of the U.S. Census: Past and Present" in The American Behavioral Scientist; and "America's Changing Demographic Tapestry" (with Judith Waldrop), in America's Changing Tapestry: Public Policy Changes.

Box 1

Why Is Fertility Higher in the United States Than in Europe?


The United States has higher fertility than any other country in the industrialized world. At the end of the 1990s, the total fertility rate (TFR) was about 1.4 children per woman in Europe, for example, while the U.S. rate was about 2.1. Yet surveys find that women in all these countries say they want about the same number of children, most often two. Why is fertility higher in the United States?

One explanation for the higher U.S. fertility is that many European countries have racially homogeneous populations compared with the United States. In the United States, fertility rates differ among the nation's varied racial and ethnic population groups. In 1998, the U.S. TFR of 2.1 children per woman was made up of several different rates: non-Hispanic white, 1.8; black, 2.2; American Indian, 2.1; Asian and Pacific Islander, 1.9; and Hispanic, 2.9.

Demographers usually assume that fertility rates of different racial and ethnic groups will converge as the experience and circumstances of women in different groups become more similar. The gap between the rates of U.S. black and white women has narrowed in recent years, for example.

Current Census Bureau projections assume, however, that rates will be higher for minority women over the next 25 years at least. Because minorities will make up an increasing share of the population, these racial and ethnic differences are likely to keep U.S. fertility higher than that in Europe and other more developed countries. Some demographers believe that European fertility may not remain so low, however, which could narrow this fertility gap. The TFR, after all, does not indicate how many children a woman will actually have. It is a hypothetical estimate of a woman's lifetime childbearing: It is the average number of children women will have if, between ages 15 and 49, they bear children at the same rate as women did this year. The TFR is a useful indicator of how people's actions "this year" will affect population growth. But it is not a good indicator of their actions next year, or the year after, as recent history demonstrates.
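The TFR described above is a simple calculation: age-specific fertility rates are summed across the reproductive ages, weighted by the width of each age group. The sketch below illustrates this arithmetic with hypothetical rates for seven 5-year age groups (the function name and the numbers are invented for illustration; they are not official NCHS figures, though they are chosen to produce a TFR near the 1998 U.S. level of 2.1).

```python
def total_fertility_rate(asfr_per_1000, group_width=5):
    """Compute the total fertility rate (children per woman) from
    age-specific fertility rates expressed as births per 1,000 women
    in each age group (here, 5-year groups covering ages 15-49)."""
    # Each rate applies for group_width years of a hypothetical
    # woman's life; dividing by 1,000 scales to a single woman.
    return group_width * sum(asfr_per_1000) / 1000.0

# Hypothetical rates for ages 15-19, 20-24, ..., 45-49
rates = [50, 110, 115, 85, 40, 15, 5]
tfr = total_fertility_rate(rates)
print(round(tfr, 1))  # → 2.1
```

Because the rates come from a single year, the result is a period measure: it says how many children a woman would have if today's rates held for her whole reproductive life, which is exactly why it can mislead when women are merely postponing births.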

In the 1970s and 1980s, for example, a large share of American women chose to attend college or get jobs rather than to marry and have children in their early 20s, and the U.S. TFR declined to a record low. At the time, some analysts thought these women were postponing births, while others thought that these Americans would never have as many births as the previous generation. Although the final statistics aren't in, it looks as if most of these women were having children later in their 20s and in their 30s — not eschewing motherhood altogether. Baby-boom women born between 1949 and 1953 had just over two children, on average, by their late 40s.1

Some demographers think that a similar shift of childbearing to older ages is partly responsible for historically low fertility rates in Europe now, although others think that fertility might remain this low and that women in these countries will never make up the births they postponed.2

Many social and economic factors in Europe today might encourage women to delay or forgo having children. High unemployment rates frustrate young Europeans' high expectations for salaries and professional advancement. Housing is expensive and scarce. Work schedules are relatively inflexible for women with children.3 Will fertility rates in Europe rise closer to the U.S. level if combining children and careers gets easier? No one knows for sure, but U.S. fertility is likely to remain the highest in the industrialized world for the foreseeable future.

References

1. National Center for Health Statistics, unpublished data.
2. John Bongaarts and Griffith Feeney, "On the Quantum and Tempo of Fertility," Population Council, Working Paper No. 109 (1998); and Ron Lesthaeghe and Paul Willems, "Is Low Fertility a Temporary Phenomenon in the European Union?" Population and Development Review 25, no. 2 (June 1999): 211–228.
3. Jean-Claude Chesnais, "Fertility, Family, and Social Policy in Contemporary Western Europe," Population and Development Review 22, no. 4 (December 1996): 729–739.


Copyright 2003, Population Reference Bureau. All rights reserved.