The effects of the shift to maize on Native American health are difficult to evaluate

Assessing the central nutritional role of indigenous agriculture alongside that of hunting and gathering might well suggest that malnutrition, rather than simply the spread of diseases through concentrated settlement, made communities less likely to recover from infectious epidemics. According to Snow, had “European expansion been less rapid, and had lethal epidemics not swept the landscape clear of Indian resistance as effectively as they did, the dynamics of historic cultural adaption” on the Great Plains and at the earlier sites of European contact might have been different. But students and researchers might ask a slightly different set of questions: Did the inability to reproduce horticultural and hunter-gathering methods actively contribute to Native American demographic decline following epidemics, rather than simply demonstrating another unfortunate result of the Biological Exchange? Is it possible to draw stronger conclusions about direct causation, rather than general association, in assessing the dual disruption to hunter-gathering and horticultural practices? Did the change in Native American diets following European contact directly contribute to the attendant increase in mortality rates – in contrast to epidemics elsewhere in the world, where demography was able to re-stabilize after around a century?

In order to consider such a possibility, students and researchers should turn to the literature on medical anthropology, paleo-archaeology, and modern experimental data on the link between health, immunity, and the consumption of vitamins and important minerals. Disrupted access to macro-nutrients and micro-nutrients – whether derived from hunted and gathered animals and plants, from indigenous agricultural practices, or a combination of both – might be defined as a co-factor alongside specific genetic loci and/or the biological exchange of diseases. Infectious diseases began to ravage Native American populations from the moment of first contact. But an inability to recover immunity and/or fertility, according to a working hypothesis, might have been exacerbated by declining access to ancestral sources of food. To assess these possibilities in greater detail, and how they might be understood by researchers and students, let us turn to more specific examples relating to the hunting of animal products and the gathering and/or managed cultivation of vegetables, fruits, tubers, and seeds. They allow a potentially fruitful synthesis between early-contact history and modern research in evolutionary medicine and nutrition. By emphasizing the role of colonial human intervention rather than an amorphous biological exchange, modern historical scholarship provides further evidence of the problematic impact of European domestic agriculture on Native American health and fertility after contact.

Aside from its greater propensity to facilitate the spread of infectious diseases, we now know that the proliferation of small enclosures of cattle and grain physically disrupted hunting and gathering practices, as well as pre-contact forms of plant horticulture, and reduced their attendant nutritional gains. According to biological anthropologist Clark Spencer Larsen, the emphasis on disease in the biological exchange thesis “has overshadowed a host of other important consequences of contact such as population relocation, forced labor, dietary change, and other areas.” Meister similarly notes that “later population decline resulting from disease was made possible because Indians had been driven from their land and robbed of their other resources [including hunted animals and cultivated crops].” According to Anderson, “before long, the expansion of livestock-based agriculture ceased being a model for Indian improvement and instead served almost exclusively as a pretext for conquest, a very different expression of the cultural impact of distinct farming practices” among Europeans and Native Americans in eastern North America from the 1600s. As Kunitz has pointed out in a discussion of the paleoepidemiology of southwestern Native American communities, and of their malnutrition following European colonization, “one does not need to invoke large-scale dramatic epidemics; prosaic entities like malnutrition… are more than sufficient to do the job [in demographic collapse].” Though it remains difficult to identify the direct triggers of final mortality, Thornton has assessed much of the evidence on the history of Plains Native Americans in the two centuries after contact and concluded that their “mortality and fertility” were severely impacted “when the great herds of buffalo were destroyed” by European agricultural patterns, by Native American over-hunting in response to curtailed nutritional sources elsewhere, and by open warfare.

Other scholars concur that an association can be drawn between worsening health, the declining ability to hunt animals and/or cultivate plants, and a new reliance on European agricultural production as animals on the Great Plains came to be over-hunted.

Before turning to cultivated plants, let us now consider how students and future researchers might understand the ways in which declining access to hunted animal products reduced the consumption of important minerals and fat-soluble vitamins, and affected overall health, immunity, and fertility. We would begin with a historical assessment of the role of hunted animals in Native American history. Particularly during the winter, many Native American communities relied on hunting and gathering fatty cuts of meat for optimal nutritional health, long after Neolithic-era Europeans had moved towards domesticated animal husbandry and grain production. Frison’s classic work on the pre-historic practices of the High Plains, Great Plains, and Rocky Mountain regions, for example, suggests that nutritious organs and fats from buffalo meat were favored over other portions. Having considered the importance of hunting in ancient populations, students might then turn to seventeenth- and eighteenth-century European observations of indigenous communities in the northernmost parts of continental North America. Hunting and gathering patterns in those regions were less altered by colonization, and so provide a more accurate picture of their ancestral nutritional profile within their regional context. These populations relied on fats from fish and land animals for a greater proportion of their diet because of their climate and ecology.

Students could then move on to observational studies from pioneering ancestral health theorists such as Weston A. Price, as well as those from more recent anthropologists who have worked in regions such as British Columbia and sub-Arctic Canada. As Hearne noted elsewhere, community members tended to select only the fattiest parts of the animal, or nutrient-dense organ meats, throwing the rest away: “On the twenty-second of July we met several strangers, whom we joined in pursuit of the caribou, which were at this time so plentiful that we got every day a sufficient number for our support, and indeed too frequently killed several merely for the tongues, marrow and fat.” Students might then examine oral histories, archival collections, archeological records, interviews, and participant observation of contemporary practices, using methods and materials from ethnographers such as Richard Daly, who has noted the historical preference for fish fats among Delgamuukw indigenous peoples in British Columbia, going back several centuries in communal memory. According to Daly, fat “rendered from salmon heads was prepared in summer, hung in bladder pouches in the rodent-resistant family meat caches, and saved for winter use.” Oils were “prepared from fatty fish and meat such as oolichan, salmon and beaver. Special processes were involved in preparing the heads – drying or boiling them for oil – as well as the eyes, bellies and eggs.” Moreover, the “arrival of the oolichan. . . was traditionally announced with the cry, ‘Hlaa aat’ixshi halimootxw!’ or, ‘Our Saviour has just arrived!’” Oolichan grease was thus a prized gift in feasts and between neighbors. Surveying modern communities of Native Americans, Weston A. Price’s 1939 Nutrition and Physical Degeneration noted a similar preference for animal and fish fats, and organ meats, and suggested its provenance in ancestral food patterns that dated back centuries and even millennia.

The indigenous communities Price encountered were seen to prize the fattiest parts of meat and fish, including organ meats, rather than muscle cuts. As Fallon Morell and Enig have summarized, Price linked a diet high in fats from mammals and fish to “an almost complete absence of tooth decay and dental deformities among native Americans who lived as their ancestors did… [including among] the nomadic tribes living in the far northern territories of British Columbia and the Yukon, as well as the wary inhabitants of the Florida Everglades, who were finally coaxed into allowing him to take photographs… Skeletal remains of the Indians of Vancouver that Price studied were similar, showing a virtual absence of tooth decay, arthritis and any other kind of bone deformity…” Ironically, in order to move beyond conjecture when discussing declining health markers after European contact, paleo-anthropological and paleo-archaeological evidence for the pre-contact Native American consumption of maize provides students and researchers with a helpful framework. Evidence of diminishing health following the introduction of maize raises the hypothesis that consumption of the grain came at the expense of more nutrient-dense calorie sources from animals and other plant species. Familiarity with such a hypothesis should help students and researchers to examine a similar – albeit much stronger – association between the intervention of colonial European agriculture and the failure of Native American populations to recover demographically from infectious diseases in the three centuries after contact. In eastern North America from A.D. 800 to 1100, a shift towards maize-centered agriculture took place in regions as far afield as the Southwest and the Eastern seaboard. Hunted and gathered meats, as well as nutrient-dense crops such as gourds, seeds, and tubers, declined as a proportion of overall calorific consumption. Across the South Atlantic and Gulf Coastal plains, the shift towards maize cultivation as a vital source of calories was associated with the development of “socially ranked societies” and “fortified civic ceremonial centers” placed near maize storage centers. Corn came to be central to the activities of Iroquois confederations stretching from the east coast to the Ohio River valley, as well as in the Mississippian chiefdoms that grew along the river-ways of the Southeast and Midwest.

On the one hand, the cost-to-yield ratio of pre-contact maize cultivation may have become more attractive because of increased demographic pressure on both wild and domesticated resources. That is to say, increased mortality due to food scarcity from hunted and gathered sources might have been diminished by the calories provided by maize. Yet on the other hand, there is a growing scholarly consensus that health and immunity declined following the indigenous production of maize among North American communities around 2,000 years ago, particularly in the American Southwest. Maize may well have provided a source of energy that kept Native American communities alive. But those communities, when compared to the hunter-gathering populations that had preceded this demographic pressure, seemed to register declining health markers. That which saved them from death through famine was not necessarily nutritionally optimal as a dominant calorie source.

To be sure, we can find historical records from the contact and pre-contact eras that suggest several indigenous methods of increasing the nutritional profile of corn, some of which may have been introduced in pre-historic communities as they shifted towards the grain. These include soaking it in lye made from wood ashes, in order to make it more easily ground. Doing so also likely made the protein and niacin in corn more bio-available, due to the alkalinizing effect of the wood-ash/slaked-lime solution. Greater absorption of niacin is associated with greater bio-availability of important minerals such as calcium and potassium. Nonetheless, the production of “hominy” from corn in this way may not always have been enough to counteract the potential nutritional deficiencies associated with the shift toward the grain.

In order to understand the distinction between famine-prevention and optimal nutritional health, students and researchers could consider a number of different case studies. From the Great Lakes region to the southern plains, and from eastern coastal to Pacific populations, for example, the regular consumption of bison, deer, and fish likely prevented the protein and iron deficiencies that later accompanied reliance on maize for a greater proportion of the daily calorie intake of Native American communities. Analysis of bones from prehistoric North America reveals that, long before European contact, potentially problematic indicators followed the move towards maize consumption. Several studies have focused on growth retardation and suggested its negative association with health markers more generally. Declining stature has been correlated with the onset of agricultural production of maize during the first centuries of the Common Era. Archaeological evidence from late prehistoric Dickson Mounds populations in west-central Illinois shows that agricultural intensification led to a decline in skeletal weight and height. In the Lower Illinois Valley, children under six in nascent maize-producing societies have been found to have suffered growth retardation compared with hunter-gathering communities from nearby excavations.