
Population boom preceded early farming in North America

University of Utah anthropologists counted the number of carbon-dated artifacts at archaeological sites and concluded that a population boom and scarce food explain why people in eastern North America domesticated plants for the first time on the continent about 5,000 years ago.

New research backs eastern North America plant domestication theory [Credit: WikiCommons]
"Domesticated plants and animals are part of our everyday lives, so much so that we take them for granted," says Brian Codding, senior author of the study published online by the British journal Royal Society Open Science. "But they represent a very unique thing in human history. They allowed for large numbers of people to live in one place. That ultimately set the stage for the emergence of civilization."

Graduate student Elic Weitzel, the study's first author, adds: "For most of human history, people lived off wild foods -- whatever they could hunt or gather. It's only relatively recently that people made this switch to a very different method of acquiring their food. It's important to understand why that transition happened."

The study dealt not with a full-fledged agricultural economy, but with the earlier step of domestication, when early people in eastern North America first started growing plants they had harvested in the wild, namely, squash, sunflower, marshelder and a chenopod named pitseed goosefoot, a pseudocereal grain closely related to quinoa.

Codding, an assistant professor of anthropology, says at least 11 plant domestication events have been identified in world history, starting with wheat about 11,500 years ago in the Middle East. The eastern North American plant domestication event, which began around 5,000 years ago, was the ninth of those 11 events and came after a population boom 6,900 to 5,200 years ago, he adds.

For many years, two competing theories have sought to explain the cause of plant domestication in eastern North America. The first holds that population growth and resulting food scarcity prompted people to grow foods they already foraged. The second, called "niche construction" or "ecosystem engineering," holds that intentional experimentation and management during times of plenty -- not immediate necessity -- led people to manage and manipulate wild plants to increase their food supply.

"We argue that human populations significantly increased prior to plant domestication in eastern North America, suggesting that people are driven to domestication when populations outstrip the supply of wild foods," Weitzel says.

"The transition to domesticating food allowed human populations to increase drastically around the world and made our modern way of life possible," he adds. "People start living near the fields. Whenever you've got sedentary communities, they start to expand. Villages expand into cities. Once you have that, you have all sorts of social changes. We really don't see state-level society until domestication occurs."

When early North Americans first domesticated crops

The region of eastern North America covered by the study includes most of Missouri, Illinois, Indiana, Ohio, West Virginia, Kentucky, Tennessee and Arkansas, and portions of Oklahoma, Kansas, Iowa, Virginia, North Carolina, South Carolina, Georgia, Mississippi and Louisiana.

This map shows the area covered by a new University of Utah study that concludes a population boom and resulting scarcity of wild foods are what caused early people in eastern North America to domesticate wild food plants for the first time on the continent starting about 5,000 years ago. The triangles and names represent archaeological sites previously identified as locations where one or more of these plants were first domesticated: squash, sunflower, marshelder and pitseed goosefoot, a relative of quinoa. The small circles are sites where radiocarbon-dated artifacts have been found, with a single circle often representing many dated artifacts. The study area includes much of eastern North America inland from the Atlantic and Gulf coasts [Credit: Elic Weitzel, University of Utah]
"This is the region where these plant foods were domesticated from their wild variants," Weitzel says. "Everywhere else in North America, crops were imported from elsewhere," particularly Mexico and Central America.

Four indigenous plant species constitute what scientists call the Eastern Agricultural Complex, which people began to domesticate about 5,000 years ago.

Previous research shows specific domestication dates were 5,025 years ago for squash at an archaeological site named Phillips Spring in Missouri, 4,840 years ago for sunflower seeds domesticated at Hayes in Tennessee, 4,400 years ago for marshelder at the Napoleon Hollow site in Illinois, and 3,800 years ago for pitseed goosefoot found in large quantities at Riverton, Illinois, along with squash, sunflower and marshelder.

Three more recent sites also have been found to contain evidence of domestication of all four species: Kentucky's Cloudsplitter and Newt Kash rockshelters, dated to 3,700 and 3,640 years ago, respectively, and the 3,400-year-old Marble Bluff site in Arkansas.

Sunflower and squash -- including acorn and green and yellow summer squashes -- remain important crops today, while marshelder and pitseed goosefoot are not (although the related quinoa is popular).

Deducing population swings from radiocarbon dates

"It's really difficult to arrive at measures of prehistoric populations. So archaeologists have struggled for a long time coming up with some way of quantifying population levels when we don't have historical records," Weitzel says.

"People have looked at the number of sites through time, the number of artifacts through time and some of the best work has looked at the effects of population growth," such as in the switch from a diet of tortoises to rabbits as population grew in the eastern Mediterranean during the past 50,000 years, he adds.

Codding says that in the past decade, archaeologists have expanded the use of radiocarbon dates for artifacts to reconstruct prehistoric population histories. Weitzel says radiocarbon dates in the new study came from artifacts such as charcoal, nutshells and animal bones -- all recorded in a database maintained by Canadian scientists.

The University of Utah anthropologists used these "summed radiocarbon dates" for 3,750 dated artifacts from eastern North America during the past 15,000 years.

"The assumption is that if you had more people, they left more stuff around that could be dated," Weitzel says. "So if you have more people, you conceivably should have more radiocarbon dates."

"We plotted the dates through time," namely, the number of radiocarbon dates from artifacts in every 100-year period for the past 15,000 years, he adds.

The analysis indicated six periods of significant population increase or decrease during that time, including one during which population nearly doubled in eastern North America starting about 6,900 years ago and continuing apace until 5,200 years ago -- not long before plant domestication began, Codding says.

Codding notes that even though plant domestication meant "these people were producing food to feed themselves and their families, they're still hunting and foraging," eating turtles, fish, waterfowl and deer, among other animals.

The other theory

Weitzel says the concept of niche construction is that people were harvesting wild plants, and "were able to get more food from certain plants." By manipulating the environment -- such as transplanting wild plants or setting fires to create areas favorable for growth of wild food plants -- they began "experimenting with these plants to see if they could grow them to be bigger or easier to collect and consume," he adds. "That kind of experimentation then leads to domestication."

Codding says: "The idea is that when times are good and people have plenty of food then they will experiment with plants. We say that doesn't provide an explanation for plant domestication in eastern North America." He favors the behavioral ecology explanation: increasing population and/or decreasing wild food resources led to plant domestication.

Source: University of Utah [August 02, 2016]



Where there's smoke and a mutation there may be an evolutionary edge for humans

A genetic mutation may have helped modern humans adapt to smoke exposure from fires and perhaps sparked an evolutionary advantage over their archaic competitors, including Neanderthals, according to a team of researchers.

A genetic mutation that is now ubiquitous in humans may have increased our tolerance to smoke, leading to an evolutionary advantage over other hominins, such as Neanderthals [Credit: Web]
Modern humans are the only primates that carry this genetic mutation that potentially increased tolerance to toxic materials produced by fires for cooking, protection and heating, said Gary Perdew, the John T. and Paige S. Smith Professor in Agricultural Sciences, Penn State. At high concentrations, smoke-derived toxins can increase the risk of respiratory infections. For expectant mothers, exposure to these toxins can increase the chance of low birth weight and infant mortality.

The mutation may have offered ancient humans a sweet spot in effectively processing some of these toxins -- such as dioxins and polycyclic aromatic hydrocarbons -- compared to other hominins.

"If you're breathing in smoke, you want to metabolize these hydrophobic compounds and get rid of them, however, you don't want to metabolize them so rapidly that it overloads your system and causes overt cellular toxicity," said Perdew.

The researchers, who released their findings in the current issue of Molecular Biology and Evolution, suggest that a difference in the aryl hydrocarbon receptor -- which regulates the body's response to polycyclic aromatic hydrocarbons -- between humans, Neanderthals and other non-human primates may have made humans more desensitized to certain smoke toxins. The mutation in the receptor is located in the middle of the ligand-binding domain and is found in all present-day humans, Perdew added.

Ligands are small molecules that attach to receptor proteins in certain areas in much the same way that keys fit into locks.

Troy Hubbard, Ph.D. candidate in molecular biology, left, reviews a chart of proteins collected from a sample with Dr. Gary Perdew in their Life Sciences Lab at Penn State [Credit: Patrick Mansell]
"For Neanderthals, inhaling smoke and eating charcoal-broiled meat, they would be exposed to multiple sources of polycyclic aromatic hydrocarbons, which are known to be carcinogens and lead to cell death at high concentrations," said Perdew. "The evolutionary hypothesis is, if Neanderthals were exposed to large amounts of these smoke-derived toxins, it could lead to respiratory problems, decreased reproductive capacity for women and increased susceptibility to respiratory viruses among preadolescents, while humans would exhibit decreased toxicity because they are more slowly metabolizing these compounds."

There is evidence that both humans and Neanderthals used fire, according to George Perry, assistant professor of anthropology and biology, Penn State, who worked with Perdew.

"Our hominin ancestors -- they would technically not be called humans at that time -- were likely using fire at least a million years ago, and some infer an earlier control and use of fire approximately 2 million years ago," said Perry.

Fire would have played an important role for both humans and Neanderthals.

"Cooking with fire could have allowed our ancestors to incorporate a broader range of foods in our diets, for example, by softening roots and tubers that might otherwise have been hard to chew," Perry said. "Cooking could also help increase the digestibility of other foods, both in chewing time and reduced energetic investment in digestion."

Adding cell culture media to cells in Gary Perdew's Life Sciences Laboratory [Credit: Patrick Mansell]
Fire also provided warmth, particularly in the higher latitudes, according to Perry.

"Besides heating and cooking, humans used -- and still use -- fire for landscape burning and as part of hunting and gathering, and now as part of agriculture," said Perry.

The study may also lend support to a recent theory that the invention of cooking may have helped humans thrive, according to Perdew.

He also suggested that the receptor might give humans a better tolerance for cigarette smoke, allowing people to smoke, but also putting them at risk of cancer and other chronic diseases.

"Our tolerance has allowed us to pick up bad habits," Perdew said.

The researchers used computational and molecular techniques to examine the difference in the genetics of polycyclic aromatic hydrocarbon tolerance between humans and Neanderthals. They examined a genomic database of humans, Neanderthals and a Denisovan, a hominin more closely related to Neanderthals than to humans.

"We thought the differences in aryl hydrocarbon receptor ligand sensitivity would be about ten-fold, but when we looked at it closely, the differences turned out to be huge," said Perdew. "Having this mutation made a dramatic difference. It was a hundred-fold to as much of a thousand-fold difference."

In contrast, the sensitivity of the aryl hydrocarbon receptor for some endogenous -- produced in the body -- ligands is the same between human and Neanderthal, which further illustrates that modern humans may have adapted to specific environmental toxin exposures through this critical mutation in the aryl hydrocarbon receptor.

Author: Matt Swayne| Source: Penn State University [August 02, 2016]



Tracking down the first chefs

Archaeological sites speak about the everyday lives of people in other times. Yet knowing how to interpret this reality is not always straightforward. We know that Palaeolithic societies lived by hunting and gathering, but the bones found in prehistoric settlements are not always, or not exclusively, the food leftovers of the societies that lived in them. Peoples of this type were nomads, constantly on the move across the territory, so it would have been common for other predators, such as hyenas or wolves, to lurk around in search of the food remains left by humans. At other times, carnivores could have sheltered in a cave abandoned by prehistoric peoples, raised their young there and brought in the bones of the animals they caught to feed them. These predators used to bite the bones, leaving their teeth marks on them.

A piece of experimental research has shown that human bites on bones have distinctive features allowing them to be differentiated from the bites made by other animals, and that cooking the meat in advance influences the appearance of these marks. This study provides valuable conclusions for analyzing food remains found on sites [Credit: Antonio J. Romero/UPV/EHU]
So it is very difficult to identify, for example, a roasted shoulder of mouflon eaten several thousand years ago from the few bone fragments that remain of it today. A novel way to identify cases like this is to analyse the marks that we humans leave on bones when eating meat today. Human beings not only alter bones by using stone knives on them and exposing them to fire during cooking; like other animals, we also leave bite marks on the bone surface when we strip off the meat to feed ourselves.

In this respect, Antonio J. Romero, a researcher in the UPV/EHU's Department of Geography, Prehistory and Archaeology at the Faculty of Arts, has led a piece of experimental research in which ninety lamb bones -- phalanges, radii and scapulae -- were studied after their meat had been consumed by ten volunteers using only their hands and teeth. To control for variables resulting from the processing of the food beforehand, a third of the sample was eaten raw, another third roasted and the rest boiled.

What did they eat and how?

The results, published in the Journal of Archaeological Science: Reports, show that over half of the bones bore the marks of human bites, both teeth marks and fractures caused by chewing. These marks, analysed under a binocular magnifying glass, have a set of characteristics (size and morphology) that allow them to be differentiated from those produced by other animals. Furthermore, as the researcher explained, "although the men produced more marks than the women, according to these data, it is not possible as yet to differentiate between them." On the other hand, cooking the meat beforehand affects the appearance of the marks: "the teeth marks tend to appear more regularly in the roasted or boiled specimens," pointed out the researcher, "while damage to the tips and edges and crushing tend to be more usual in the bones eaten raw."
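
To make the idea concrete, here is a minimal, heavily simplified sketch of how such diagnostic criteria might be applied in practice. The feature names and thresholds are hypothetical placeholders, not the study's published measurements, which came from size and morphology observed under a binocular magnifying glass.

```python
# Hypothetical sketch: score whether a tooth mark on bone looks more like
# human chewing or carnivore gnawing. All thresholds are invented for
# illustration; real criteria would come from the study's measurements.
from dataclasses import dataclass

@dataclass
class ToothMark:
    length_mm: float   # maximum length of the mark
    width_mm: float    # maximum width of the mark
    pit_shaped: bool   # rounded pit rather than an elongated score

def looks_human(mark: ToothMark) -> bool:
    # Invented rule of thumb: human marks tend to be smaller than those
    # left by large carnivores such as hyenas or wolves.
    return mark.length_mm < 3.0 and mark.width_mm < 1.5 and mark.pit_shaped

for mark in [ToothMark(2.1, 0.8, True), ToothMark(6.4, 3.2, False)]:
    print(mark, "-> human?", looks_human(mark))
```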

"There are various similar studies that have explored in depth the damage caused by animals on bones when feeding, but not dealing with the marks that we humans leave behind," explained Antonio Romero. Studies of this type have a clear application in the analysis of archaeological remains, in particular for historical eras. So in each case a whole set of characteristics is studied, such as the location of the damage left on the bones, its morphology and dimensions, which is not always easy to apply to the archaeological record, but "together with other prints of human activity that are more reliable, such as the marks of stone knives, etc., it is possible to complete the interpretation," he insisted. This research constitutes a real breakthrough in the possibility of finding out what kind of meat foods hominids consumed and in what circumstances (whether or not they cooked the meat before they ate it). "It allows us to find out more about human beings in the past and the origin of our modern behaviour, about the way we process foods (cooking them or not) and about our way of eating," he concluded.

Source: University of the Basque Country [August 02, 2016]



Belief in a deity helps humans cooperate and live in large groups, studies say

There are plenty of things that make it possible for humans to live in large groups and pack into cities. New building techniques and materials, for instance, allow the construction of high-rise buildings; plumbing delivers clean, fresh water; and sewage systems help to prevent disease. One factor, however, is rarely included on the list: having one or more gods.

Zeus with his thunderbolt [Credit: WikiCommons/Marie-Lan Nguyen]
In two studies published earlier this year in the journal Behavioral and Brain Sciences, Professor of Human Evolutionary Biology Joseph Henrich examined the notion that, by helping enforce ethical and cultural norms, belief in a powerful, omniscient God helped human societies grow quickly. Complementing that premise, in a further study published in Nature, Henrich showed that people who believe in God are more likely to treat others fairly.

"What we want to understand is how humans were able to scale up from being relatively small societies to larger groups very quickly," Henrich said. "One answer is that religion can act as a kind of social technology that helped humans scale up and build large, complex societies.

"If you look at the religion of very small-scale societies, like hunter-gatherers, there's no intertwining between religion and ethics or morality," he said. "There are supernatural agents, but they tend to be weak, they can be tricked, and they don't have any power over the afterlife. It's only over time that gods become increasingly concerned with human affairs. Gods that have control over the afterlife don't appear until relatively late in human history."

Where did the concepts of those more powerful gods come from?

"We have evolved some basic cognitive abilities that allow us to represent and understand these supernatural beings," Henrich said. "Cultural evolution can then shape the details of what those gods care about and how powerful they are."

As gods grew more powerful, Henrich said, they gradually were represented as being more interested in the day-to-day affairs of mankind and more willing to punish those who did not conform to social norms.

And as the gods changed, so too did the rituals that played a key part in binding people to their faiths.

Joseph Henrich, Professor of Human Evolutionary Biology, is the author of a recent study on how belief in an omniscient, punishing God helped people to cooperate and form larger societies [Credit: Kris Snibbe]
Where small hunter-gatherer societies often used dance and moving in synch to help bind groups together, the shared belief in omniscient, powerful gods and rituals like Sunday services and prayer help unite larger communities of faith, Henrich said.

Such tight-knit societies eventually either outcompeted their neighbors, by sharing resources, growing faster or fielding ever-larger armies, or served as an example of their gods' powers, attracting converts.

"The key element is that there is an in-group pro-sociality," Henrich said. "You are willing to support other members of your group � but that circle expands over time."

To understand whether and how belief in God might influence people's behavior, Henrich and colleagues conducted experiments that took them to more than half a dozen locations around the world.

"We went to eight societies and tested Hindus, Christians, Buddhists, and others," Henrich said. "What we did was give people a chance to essentially cheat at a game or to be biased toward themselves and their local communities over strangers from the same religious group."

At the beginning of the experiment, participants were given a number of coins, along with two cups, one for a distant person of the same religion, and the other for either themselves or a member of their community. Participants then selected one of the cups in their minds, and rolled a black and white die. If the die came up black, they placed a coin in the cup they'd mentally selected. If it was white, a coin went in the other cup.

Importantly, Henrich said, the experiment is designed in a way that only the participants know whether or not they followed the rules. Using statistics, however, researchers were able to calculate how likely any outcome might be, and to understand how biased participants were toward themselves, a close member of their group, or a distant member of the same religion.
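
A small simulation makes the statistical logic concrete. The sketch below is an assumed reconstruction of the setup, not the researchers' code: under honest play each coin lands in the mentally chosen cup with probability 1/2, so a group's aggregate allocation can be tested against a fair binomial baseline even though no individual roll is observable. The participant and coin counts are assumptions for illustration.

```python
# Assumed reconstruction of the die-roll allocation game, not the
# researchers' code. Under honest play a coin goes to the mentally chosen
# cup with probability 1/2; a bias above 1/2 models favoring that cup.
import random
from math import comb

def play(total_coins: int, bias: float) -> int:
    """Number of coins placed in the self/local cup."""
    return sum(random.random() < bias for _ in range(total_coins))

def binom_tail(n: int, k: int) -> float:
    """P(X >= k) for X ~ Binomial(n, 1/2): how surprising the outcome is."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

random.seed(1)
n = 100 * 30            # e.g. 100 participants with 30 coins each (assumed)
k = play(n, bias=0.55)  # mild favoritism toward the self/local cup
print(f"{k}/{n} coins to self/local cup; "
      f"P(at least this many if fair) = {binom_tail(n, k):.3g}")
```

No single player can be called a cheater, but an aggregate allocation far above half the coins is vanishingly unlikely under fair play, which is exactly the bias signal the researchers measured.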

"Based on interviews done later, what we found was how omniscient and punishing people believe their god to be�predicted how much they would cheat, essentially," Henrich said. "Those who believed in more-punishing and more-knowing deities cheated less in favor of themselves and their local groups over distance co-religionists, although everyone cheated in favor of themselves or in favor of their local group a little bit. What this means is they were allocating more coins to a distant co-religionist, and expanding the social sphere."

While religion, and the cooperation it engendered, was likely a key factor in helping society reach the heights of modernity, its role is now gradually being supplanted by secular institutions.

The job of enforcing ethical behavior, once the purview of a punishing god, now falls to the justice system, where crimes are punished not with damnation, but with prison sentences, Henrich said.

Author: Peter Reuell | Source: Harvard University [August 01, 2016]
