Three broad factors shaped Americans’ experience of health and disease during the nineteenth century: the kinds of diseases that afflicted large numbers of people; the local, domestic setting for sickness and caregiving; and the growing dominance after 1880 of orthodox medical doctors and the institutions in which they housed their particular style of medicine.

The Threat of Infectious Disease
Overall, infectious diseases borne by air, water, food, and insects were the most widespread threat to Americans’ health throughout the nineteenth century. This is not to say that genetic diseases, mental illness, or chronic degenerative diseases, such as cancer, were inconsequential or ignored. Nor did Americans discount occupational diseases, especially the respiratory ailments plaguing coal miners and textile mill workers, which affected growing numbers of people after 1870. But no kind of sickness surpassed the power of infectious disease both to inflict acute suffering and to mobilize efforts at relief. These diseases sometimes took epidemic form, such as the nearly annual surges of yellow fever along the Atlantic seaboard in the early 1800s; the widespread outbreaks of Asiatic cholera in 1832, 1849, and 1868; and the deadly malarial outbreaks in the Mississippi and Ohio River valleys of the late 1870s. At other times, infections were endemic, such as widespread instances of tuberculosis, especially in cities.

Because the microbiological sources of such infections were imperfectly understood until the last two decades of the century, all groups of Americans were to some extent at risk. Such diseases, however, fell with particular force on poorer citizens, who lived in crowded conditions ideal for the spread of infection and whose malnourishment further weakened their resistance. African Americans, especially in the antebellum South, and the increasing number of urban immigrants in the North and Midwest clearly were two such groups at comparatively high risk. Moreover, Native Americans throughout the century were caught up in cycles of first-time exposure to infectious diseases (most dramatically smallpox) carried to them by Anglo settlers, which killed them in large numbers.

The sudden symptomatic onset characteristic of epidemic disease, along with the dread inspired by it, drastically shaped the health experience of large numbers of Americans for most of the century. And yet endemic infectious disease—sickness typically present among a group of people and therefore comparatively familiar—doubtless resulted in more suffering and death in the long run than even the fiercest of epidemics. Poverty, habits of diet, work, and leisure, and differential immunities also shaped the spread of endemic diseases in the population. Many African Americans were comparatively immune to certain strains of malaria because of a genetic trait that caused sickle cell anemia. The poor in warmer climates were susceptible to parasitic diseases or such dietary deficiency ailments as pellagra; Native Americans and Irish immigrants were plagued by alcoholism.

Other demographic characteristics were important as well. The vulnerability of children, for example, to measles, diphtheria, scarlet fever, and other potentially lethal viral and bacterial infections made childhood a time of life as vulnerable to serious disease as extreme old age. Women of childbearing age faced not only the physical stresses of pregnancy and birth but also the severe risk of postpartum infection that made sickness associated with childbirth probably the greatest single threat to the health of most women across class and racial lines. Malarial fevers were endemic to certain portions of the South, and in the colder northern states pneumonia and gastrointestinal infections often proved fatal. Tuberculosis, especially after 1850, was deeply seated in both the North and the South, and through the middle decades of the century it appears that various forms of kidney failure or heart failure (which likely were behind the widespread diagnoses of “dropsy” and “rheumatism”) accounted for high levels of morbidity and mortality in all regions.

Shifting diagnostic terms make it difficult to know what really afflicted sick people. Neither diagnostic procedures nor disease categories were standardized to any reliable degree throughout the century; published case studies, too, are idiosyncratic by modern standards, making retrospective diagnosis highly speculative. Broadly speaking, Americans’ life expectancy (at birth) appears to have risen during the century, from perhaps thirty to thirty-five years to forty to forty-five years. While still reflecting the toll taken by childhood diseases, the gradual rise suggests that by the end of the century, Americans were somewhat less likely to succumb to the major infections. This change, however, was not due to breakthroughs in therapeutic drugs or other interventions used by doctors. It took place, first, because nutrition improved for most people, conferring added resistance to disease; and, second, because larger numbers of Americans were receptive to the heightened regulation of public health and to new standards of personal cleanliness as a shield against becoming sick.

Family, Community, and the Culture of Health
The ways in which Americans coped with sickness, and the skills and substances brought to the sickbed, are crucial to understanding the experience of health and disease. For nearly everyone, regardless of wealth or social background, care for the sick was profoundly local in its resources and was delivered in a domestic setting. Although there were exceptions to this general rule—tubercular patients journeying to restorative places, late-century urban immigrants ending up in warehouse-like hospital wards—for the most part, sick people received care and got better or worse in familiar settings at the hands of people known to them. Moreover, although we now know that nearly all medicines in widespread use were ineffective against the major infectious diseases, Americans were notably partial to taking drugs as a first resort, whether preventive or remedial. Certain therapies in use throughout the century did have beneficial effects confirmed a century later: vaccination against smallpox, quinine as a hedge against malaria, carbolic acid as an antiseptic. In the main, however, Americans used a far greater range of substances that had dramatic effects on physical symptoms, altering or masking them, even if they lacked the therapeutic power that people imagined. Like drugs, ideas about the origins and nature of disease were drawn from a combination of sources—professional, vernacular, and exotic. Americans were open to overlapping spiritual, moral, and behavioral explanations of why they were sick, and they rarely dismissed any theory that seemed plausible. In sum, they gave and received care in a context that might not lead to cure (and might unwittingly lead to harm) but nevertheless bolstered a sense of the ability to act, choose, and try.

Family was central to this health care context in ways that had important consequences for Americans’ experience of health and disease. Birth and death remained domestic events for most people. Women of all classes were the primary birth attendants and caregivers; at death, they prepared the body and oversaw burial and mourning. In most homes, therefore, women were the sources of local knowledge about both sickness and basic therapies. In their diaries and letters, women frequently commented on health and illness—sharing medicinal recipes, critically comparing physicians, and in general organizing the household’s watchfulness in the face of disease.

Americans traded views on the relative “healthiness” of their surroundings, making the assessment of climate and hearsay into a touchstone of well-being that extended from the household to the community at large. The relationship of disease to religious faith also was a matter of general popular concern, cutting across social and gender divisions. The will of God or the power of the supernatural was never far from people’s sense of the meaning of sickness. Although the most inclusive public expressions of the tie between religious faith and the onset of disease—days of fasting and prayer during epidemics, for example—diminished somewhat during the century, Americans from various backgrounds privately continued to see clear ties between physical symptoms and possible moral or social transgressions. They acted in ways that affirmed an unbreakable link between spiritual and physical well-being, searching for ways to atone, to revitalize faith, or to give testimony to the wisdom of God’s trials and the mercy of his acts. Preachers and holy healers, as well as doctors, gathered around the sickbed. The demonstration of religious faith, through prayer or a reliance on a traditional world of spirits or by seeking other portals to the supernatural world, was for many social groups a distinct way to mobilize collective resources of health.

Indeed, there arose in many areas a blended culture of faith healing and mental resistance to disease that in no way denied disease’s physical seat in the body. In the decades before 1880, especially, there was surging interest among middle-class Americans in the connections between God’s grace and the wonders of the natural world. They took an interest in learning about the natural environment, cultivating physical exercise and physiological information. Americans “botanized,” collecting samples of local flora and fauna; they became interested in weather, many keeping meteorological records; and they joined “physiological clubs.” Combining such knowledge with prayer or spiritualism, along with drugs, these Americans created a broad context for health and disease that was testimony to health as a spiritual condition as well as a physical one.

These domestic practices took place in a medical marketplace characterized by a vast array of healers who put forward their ideas and substances for popular use. A few cities and other local jurisdictions taxed or otherwise kept tabs on certain healers: midwives, for instance. But regulations were fragmentary or unenforced, and during the greater part of the century, sects of healers flourished or withered in a competition where the byword was “caveat emptor.” Powerful drugs were available with no restriction and at comparatively low cost. The domestic context for healing—indeed, for much of basic medical knowledge itself—gave households the ultimate authority to decide which brand of medicine would best serve.

The engine that drove this competitive medical world for all but the last decade of the century depended on several factors. First, no single medical sect was able to demonstrate that it was most effective across the entire range of risks and maladies; thus none could place sufficient pressure on people or lawmaking bodies to give them sole license to heal. Second, the dominant political climate in general, especially during the antebellum years, tended to be one in which professional hegemony over popular choice looked like a bid for monopoly power and was regarded with a powerful skepticism. Finally, medical sects tended to borrow certain therapies and procedures from each other and, in practice if not in theory, acted in ways that blurred lines between them. For example, even though orthodox physicians officially scorned hydrotherapists’ enthusiasm for water as a panacea, in practice, after the 1840s many physicians recommended pure drinking water to their patients as a “tonic” and advocated bathing as a disease preventive rather than a risky behavior. To take another example, physicians themselves split into warring factions during the early decades of the century, with a minority of homeopaths (who favored infinitesimal doses of drugs that mimicked disease symptoms, rallying the body’s natural healing powers) contesting therapies with allopaths (who prescribed large amounts of harshly acting drugs that reversed symptoms). In actual practice, however, many physicians adopted an eclectic stance, using remedies dictated less by dogma than by their own or their patients’ experience. Although it was risky and open to abuse by charlatans, this wide-open medical world also was testimony to the power of popular ideas about health and medicine to dominate decisions about care.

The Significance of Physicians
In the deepest sense, the struggle among sects of healers during the century had a significance that went far beyond the competition for economic rewards. It was a struggle over the definition of medical knowledge itself—how it would be intellectually organized and institutionally structured and how health itself was to be defined. Tracing the rise of orthodox, mostly allopathic physicians—healers holding medical degrees and claiming descent from such ancient authorities as Galen and Hippocrates—out of this struggle brings into particularly sharp focus the key changes in health and disease that, gathering force in the 1840s, became dominant during the last twenty years of the century.

The U.S. Census counted 40,755 people identifying themselves as physicians in 1850, a number that rose to 64,414 in 1870 and to 104,805 twenty years later. These were practitioners who claimed to possess a medical degree or who professed to practice in an orthodox fashion. Physicians were by far the largest group among medical sects, and, with the possible exception of such virulently anti-allopathic healers as Thomsonian botanical doctors, followers of doctor and entrepreneur Samuel Thomson (1769-1843) in the antebellum years, they were the most aggressively organized in their efforts to dominate the medical world. Along with the prestige of their ancient lineage and their dedication to scholarly tradition, physicians were notable as early as the 1820s for the social visibility of their medical societies, journals, and, most important, formal education. Even in the dispersed, domestic-centered world of medical practice, orthodox medical schools grew rapidly during the years after 1820, with the 10 schools in that year increasing to 44 in 1850 and some 106 by 1890. These schools increasingly focused physicians’ claims to superiority, becoming centers for orthodox medicine’s search for a more effective medicine, for new standards of professionalism, and thus, especially by the 1880s, for the authority to define broadly what mattered in health care.

A great part of physicians’ struggle for hegemony was a struggle over the tenets of orthodoxy itself. A chief marker of orthodoxy before the 1830s was its comparatively heavy reliance on broad (and, with hindsight, strikingly a priori) theoretical constructs of disease and the corresponding principles that explained how a person became vulnerable to it. Exact definition of these principles varied somewhat among physicians, but most held that an individual’s health depended on moderation in all things, which would maintain a balance of bodily qualities, variously described as “humors” or as properties of “vitalism,” components of healthiness understood as partly organic, partly mental phenomena. Moreover, individuals possessed a basic constitution that was specific to gender, race, and age and largely unalterable, though their propensity for sickness could be tempered or aggravated by diet, work, prudent (or imprudent) behavior, and the use of medicines. Such abstract principles permitted physicians easily to conflate moral judgments with observations of what was “natural” and led them to think in terms of polarities: dangerous versus healthy behavior, strong as opposed to weak constitutions, and stimulating compared with depleting medicines.

This approach to defining disease and health care began to change by the 1830s, in large part because of the pressure on orthodoxy from alternative forms of medicine that were less harsh and dogmatic. But change also was rooted in growing dissatisfaction among physicians with the inability of traditional theory to guide actual practice. A new clinical empiricism, with roots not only in the innovative, anatomy-based “Paris school” of French doctors but also in the trials and errors of American physicians frustrated with the mismatch between orthodox theory and health care, led many physicians to curb (or at least postpone using) their traditionally aggressive (“heroic”) therapies in favor of observing, recording, and thus reevaluating what they saw at the bedside. Following this slow but profound change in practice, rigid theories of medicine based on balancing physical constitutions and the like began to be replaced by a more flexible sense of medicine as rooted in a dynamic of disease and health, pathologic conditions, and physiologic factors—each depending on the other and requiring careful, systematic study and broad, experimental application.

By mid-century, these changes—fitful and often frustrating to doctors and patients alike—were especially visible in three arenas of orthodox health care. It should be noted in passing that the Civil War, occurring in the midst of this change, shaped certain aspects of it. The growth of medical schools in the South, like that of other institutions, was retarded by the war’s destruction. The large numbers of white Southern men killed in the war, along with the new population of freed slaves seeking medicine outside the bonds of slavery, changed the gender and racial profile of Southern patients. African American physicians began to appear in larger numbers toward the end of the century, founding their own medical schools when white schools refused black students. Physicians of all descriptions relocated their practices because of the war, and, whether returning to their communities or not, many doctors retained and developed new techniques, particularly with regard to surgery. Except in these broad ways, however, the Civil War neither initiated watershed changes in mainstream health and health care nor diverted them.

The first key change that would dramatically reshape the American medical scene by the 1890s involved the relatively greater prominence of basic science—initially physiology—in orthodox medicine. As physicians became more willing to hold back from immediate bedside intervention and to observe and record their findings, medical educators began arguing for making basic science—expanded by the 1870s to include pathology, pharmacology, and the beginnings of organic chemistry—an essential first step in learning. This movement was reflected in the expansion of medical education from two years to four, with basic sciences taught before clinical or bedside techniques. The rise of bacteriology as a science in the 1880s was a prime example of the effects of this change. As physicians and people in general began to appreciate the role of microorganisms in infectious diseases, it began to make sense to focus on the essential relationship between the well body and the legion of germs. Reformers argued that an effective application of bacteriology to medicine made it imperative to study people in aggregate, collecting data about what was statistically normal for a population, not what was deemed typical or natural to individual patients. Medical careers in basic sciences and in laboratory work thus began to open up by the 1890s.

The clinical promise of basic science and its popular appeal continued to grow, as did a second key change in mainstream health care: the rise of hospitals combining general caregiving with the practical education of physicians. Faculties in larger medical schools had been taking their students on hospital rounds since the mid-1830s, but such experience varied greatly in quality from mentor to mentor, not to mention the fact that hospitals before the 1870s were more custodial than therapeutic institutions. As formal schooling became more complex, however, both students and faculty sought a more regular way to integrate actual patient care into the new configuration of medical learning.

As urban populations of the poor grew substantially in the decades following the Civil War, hospitals and medical schools combined their efforts to standardize charity care by giving over the bodies of poor patients to medical study. This, too, had a long tradition, but the difference by 1880 was the number of new general care hospitals that not only attracted funding from benefactors impressed by medical science but also gathered support from urban political leaders eager for institutions that so loudly announced progress. The medical profession, too (though not without resistance from the older generation of doctors), came to embrace an ideal of practice centered not in the patient’s home but amid the growing array of complicated instruments (in the new operating room, for example) as well as baseline requirements (antisepsis, professional nurses) that by the 1890s testified to orthodox power. It followed that by the end of the century, increasing numbers of middle-class patients as well as the poor began to receive care in hospitals.

Finally, these transformations in knowledge and institutions, affecting the very definition of health and disease, were joined to a change in orthodox medicine’s claim to unique legal privileges. Physicians were able to argue for the first time that the therapeutic promise in orthodox medicine, along with the ability of its institutions to attract well-funded support, made it by far the best repository for the collective interest in good health. Orthodox professional organizations like the American Medical Association grew rapidly in membership and lobbying force. Opponents wary of orthodox monopoly power continued to object to physicians’ drive for privilege, but, increasingly, lawmaking bodies were inclined to agree with the physicians, passing licensure laws and other regulations that restricted and marginalized alternative forms of medicine by 1900.

Thus, physicians’ rise in status to become a dominant profession with unparalleled authority to practice medicine and define disease was the sharpest single change in the organization of health care during the century. And yet there remained a gap between organization and effective cure. Although after 1880 the new science boosted successful efforts in public health and sanitation, and physicians’ institutions and professional power to some extent stabilized a risky commercial world of drugs and healers, these changes did not lead immediately to effective new drugs; specific medicines for most infections still were thirty to fifty years in the future. Indeed, at the end of the century many Americans (including some physicians) worried that despite gains in public health measures, orthodoxy’s new emphasis on laboratory findings and the ideal of the doctor-scientist might actually harm patient care. They feared physicians would become more remote from communities, and thus less sensitive to the social root of caregiving traditionally nourished by bedside relationships based on personal knowledge and trust.

FURTHER READINGS

Leavitt, Judith Walzer. Brought to Bed: Childbearing in America, 1750 to 1950. New York: Oxford University Press, 1986.

Leavitt, Judith Walzer. Typhoid Mary: Captive to the Public’s Health. Boston: Beacon Press, 1996.

Leavitt, Judith Walzer, and Ronald L. Numbers, eds. Sickness and Health in America: Readings in the History of Medicine and Public Health. 3d ed. Madison: University of Wisconsin Press, 1997.

Ludmerer, Kenneth M. Learning to Heal: The Development of American Medical Education. New York: Basic Books, 1985.

Morantz-Sanchez, Regina Markell. Sympathy and Science: Women Physicians in American Medicine. New York: Oxford University Press, 1985.

Pernick, Martin S. A Calculus of Suffering: Pain, Professionalism, and Anesthesia in 19th Century America. New York: Columbia University Press, 1985.

Rosenberg, Charles E. The Care of Strangers: The Rise of America’s Hospital System. New York: Basic Books, 1987.

Rosenberg, Charles E. Explaining Epidemics and Other Studies in the History of Medicine. Cambridge, U.K.: Cambridge University Press, 1992.

Rothman, Sheila M. Living in the Shadow of Death: Tuberculosis and the Social Experience of Illness in American History. New York: Basic Books, 1994.

Savitt, Todd L. Medicine and Slavery: The Diseases and Health Care of Blacks in Antebellum Virginia. Urbana: University of Illinois Press, 1978.

Starr, Paul. The Social Transformation of American Medicine. New York: Basic Books, 1982.

Tomes, Nancy. The Gospel of Germs: Men, Women, and the Microbe in American Life, 1870-1930. Cambridge, Mass.: Harvard University Press, 1998.

Vogel, Virgil. American Indian Medicine. Norman: University of Oklahoma Press, 1970.

Warner, John Harley. Against the Spirit of System: The French Impulse in Nineteenth-Century American Medicine. Princeton, N.J.: Princeton University Press, 1998.

Warner, John Harley. The Therapeutic Perspective: Medical Practice, Knowledge, and Professional Identity in America, 1820-1885. Cambridge, Mass.: Harvard University Press, 1986.

Source: Encyclopedia of the United States in the Nineteenth Century. 3 vols. Charles Scribner’s Sons, 2001.

CITATION: Stowe, Steven M.: "Health and Disease." Nineteenth Century U.S. Newspapers, Cengage Learning, 2008

       
