We Owe More to Clean Living Than We Thought

After years of poring over statistics on mortality around the world and across the centuries, Dr. Samuel Preston, professor of sociology and director of Penn’s Population Studies Center, found himself wondering what the population of the United States would look like if its mortality rates had not changed since 1900. What he discovered was astonishing: About half of us wouldn’t be here today.

Preston, who believes that our increased life expectancy is among the great advances of this century, says that the numbers “told an interesting story of human progress” — one with important implications for the nation and the world.

Back in 1900, the average life expectancy was just 48 years; today it is 76. About 18 out of every 100 children died before the age of five in 1895, as opposed to fewer than one in 100 today.

The ripple effect from this change alone is enormous, as Preston and Kevin M. White, a medical student and graduate student in demography, noted in a recent article in Population and Development Review. They calculated that if turn-of-the-century mortality rates still prevailed, we would be a nation of 139 million instead of 276 million. A quarter of those alive today would have been born but would have died, and the other quarter would never have been born at all — for the simple reason that “parents, grandparents, or earlier ancestors would have died before giving birth to them or their progenitors.”
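A rough check shows how those figures fit together, using only the numbers quoted above (the even split between the two quarters is the authors’ finding; the arithmetic alone fixes only the total at about half):

    276 million − 139 million = 137 million people missing under 1900 mortality rates
    137 ÷ 276 ≈ 0.50, or about half of today’s population
    half of 137 million ≈ 69 million born but lost, and another ≈ 69 million never born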

Just what made the great leap in life expectancy possible is still debated among sociologists, public-health officials, and others concerned with child mortality around the world. In the first four decades of this century, a host of factors played a role in keeping babies and small children alive, including better nutrition, cleaner water and improved sewage systems, the pasteurization of milk, better housing, and various preventative health practices. As people began to understand what causes the spread of disease — and translated that knowledge into public and private health practices — mortality rates began to plunge.

In 1900, Preston says, the son of a farmer and the son of a wealthy merchant had roughly the same chances of reaching adulthood. As the century progressed, occupational differences in mortality became noticeable, as Preston observed in a chapter written with Dr. Douglas Ewbank, the adjunct professor of sociology who serves as associate director of Penn’s Population Aging Research Center, for a 1991 book titled What We Know About Health Transition: The Cultural, Social, and Behavioral Determinants of Health. In that chapter, Preston and Ewbank argued that access to knowledge about infection — its causes and its prevention — profoundly affected mortality rates, albeit in ways that were difficult to document.

Since the great decline in child mortality occurred before 1940 — before the widespread use of sulfa drugs or the discovery of penicillin — something else was going on to ensure that children grew to adulthood. While the medical profession had a major impact on those rates after 1940, says Preston, it can’t take much credit before then, apart from such advances as smallpox and diphtheria vaccines. Improved public sanitation helped, and as people began to accept the germ theory of disease, they adopted new hygienic practices at home, where infants were most at risk: boiling bottles and rubber nipples, washing hands, isolating sick children, ventilating rooms, and protecting food from flies.

The drop in the death rate did far more to increase the population than did immigration during that period, says Preston, adding that it’s interesting to speculate “how many Americans owe their lives to 20th-century health progress.” While it’s impossible to pinpoint which of us are the lucky beneficiaries, chances are the total includes about half of those reading this article.
