The human brain stands apart in the animal kingdom, not just in sheer size but in its remarkable cognitive abilities. For decades, researchers have sought to understand what genetic changes fueled the expansion of our neocortex, the region responsible for higher-order thinking, reasoning, and language. A recent study published in Science Advances [1] by Nesil Eşiyok and colleagues sheds light on this question, identifying a crucial genetic duo—NBPF14 and NOTCH2NLB—that orchestrates the abundance of neural progenitor cells, a key factor in the evolutionary enlargement of the human brain.

[Figure: Microscopic image of a section of an electroporated, genetically modified chimpanzee brain organoid. Cell nuclei in blue, precursor cells in magenta, electroporated, genetically modified cells in green, and dividing cells in orange.]

The Role of NBPF14 and NOTCH2NLB

At the heart of this discovery is the interplay between two genes that are uniquely human. The study demonstrates that NBPF14 and NOTCH2NLB function in tandem to regulate the proliferation and transformation of neural progenitor cells, the building blocks of the developing neocortex. NBPF14 stimulates the multiplication of these progenitor cells, ensuring an ample supply during brain development. NOTCH2NLB directs these progenitor cells toward a more specialized type, which eventually matures into the neurons that compose our brain's intricate networks. This genetic interaction sets humans apart from other primates, providing a biological explanation for why our neocortex is uniquely large and complex.

Experimental Evidence: A Multi-Faceted Approach

To uncover the role of these genes, researchers combined multiple experimental approaches, including:

- Mouse models: Introducing human versions of these genes into mice resulted in an increase in neural progenitor cells, suggesting a direct link between gene function and brain expansion.
- Chimpanzee brain organoids: By cultivating miniature brain-like structures from chimpanzee stem cells, the researchers observed how the absence of these human-specific genes limited neural progenitor proliferation.
- Human brain tissue analysis: Further supporting their findings, the study examined human fetal brain tissue to confirm the expression patterns of NBPF14 and NOTCH2NLB in early brain development.

"The remarkable feature of our study is that results from both animal models and alternative methods complement each other well and mutually confirm their findings," explains Michael Heide, the study's lead researcher. "This not only underscores the significance of our work but also highlights the potential for reducing reliance on animal models by refining alternative methods."

Evolutionary and Medical Implications

Understanding the genetic underpinnings of human brain expansion has profound implications beyond evolutionary biology. These insights may help illuminate the origins of neurodevelopmental disorders such as microcephaly, in which brain size is significantly reduced due to disruptions in progenitor cell proliferation.

"Our findings deepen the fundamental understanding of brain development and provide new insights into the evolutionary origins of our large brain," says Nesil Eşiyok, first author of the study. "In the long term, they could contribute to the development of therapeutic approaches for malformations of the brain."

The Future of Human Brain Evolution Research

This study adds another piece to the puzzle of human brain evolution, but many questions remain.
How did these genes originate and become unique to humans? What other genetic or environmental factors contributed to the increase in brain size over millions of years? Future research will continue to probe these mysteries, combining genetic analysis with fossil and archaeological evidence to paint a more complete picture of our evolutionary past.

With each discovery, the story of human brain evolution becomes clearer, offering a deeper appreciation for the genetic symphony that shaped our species' most defining trait—our intelligence.

[1] Eşiyok, N., Liutikaite, N., Haffner, C., Peters, J., Heide, S., Oegema, C. E., Huttner, W. B., & Heide, M. (2025). A dyad of human-specific NBPF14 and NOTCH2NLB orchestrates cortical progenitor abundance crucial for human neocortex expansion. Science Advances, 11(13). https://doi.org/10.1126/sciadv.ads7543
The human pelvis is a structural keystone—shaping movement, birth, and even evolutionary trajectories. Now, a fossilized pelvis found in South Africa's Drimolen Main Quarry (DMQ) is providing rare insights [1] into the anatomy and biomechanics of early hominins. Dating to approximately two million years ago, the partial pelvis, designated DNH 43, raises new questions about how early relatives of Homo sapiens walked, gave birth, and adapted to their environments.

[Figure: Three-dimensional polygon models derived from surface scanning of DNH 43: (A) sacrum (DNH 43A), with an arrow indicating cranially directed deformation of the left side of the sacral plateau; (B) bisection and reflection of the relatively undistorted right side to reconstruct the left side; (C) medial view of the two refit pieces of the os coxae (DNH 43B); (D) anterior view of the articulated pelvis with the reconstructed sacrum and the right os coxae reflected to reproduce the left side; (E) superior view of the articulated pelvis; and (F) lateral view of the articulated pelvis.]

The Significance of DNH 43

Fossilized hominin pelvises are exceptionally rare, making DNH 43 a crucial find. Recovered from DMQ—a site known for yielding both Paranthropus robustus and some of the earliest remains of Homo erectus—this specimen includes the sacrum and portions of the right os coxae. While past studies linked the fossil's morphology to Australopithecus and Paranthropus, a new in-depth analysis offers a more comprehensive comparison across hominin species.

The research team reconstructed the pelvis using digital modeling and analyzed its shape relative to an expanded dataset of fossil specimens. Their findings highlight a mix of primitive and derived traits, suggesting that Paranthropus robustus—a robust, small-brained hominin—may have had a more complex locomotor and obstetric anatomy than previously assumed.

A Unique Mix of Features

DNH 43 exhibits several characteristics linking it to Australopithecus and Paranthropus, including a small overall size, relatively narrow sacroiliac articulation, and a moderately wide tuberoacetabular sulcus. However, one of the most surprising findings was its pelvic incidence—the angle at which the sacrum orients the upper body over the pelvis.

"The orientation of the sacrum in DNH 43 is strikingly similar to that of modern humans," the researchers note. "This challenges some long-standing assumptions about early hominin posture and locomotion."

The study also found that while the overall structure of the pelvis retains a primitive, gracile morphology, it features a broad birth canal, suggesting that Paranthropus robustus may have had a different childbirth process than previously hypothesized.

What This Means for Hominin Evolution

The findings have significant implications for understanding locomotion and reproduction in early hominins. While Paranthropus robustus has long been thought to have relied on a more ape-like, rigid gait, the morphology of DNH 43 suggests a more efficient bipedal posture.

The breadth of the pelvic inlet and biacetabular dimensions of DNH 43 are also noteworthy. Compared to other hominin specimens, these proportions suggest that its species might have had a more adaptable approach to childbirth.
Unlike modern humans, who experience a complex rotational birth mechanism due to the conflicting demands of a large brain and a narrow pelvis, Paranthropus robustus may have given birth through a more straightforward, non-rotational process, similar to what has been inferred for Australopithecus.

Future Directions

Despite these insights, much remains uncertain. While DNH 43 is currently attributed to Paranthropus robustus, its similarities with Australopithecus and even early Homo specimens underscore the need for further research on postcranial variation in early hominins. Future discoveries may help clarify whether traits observed in DNH 43 were unique to Paranthropus robustus or shared across other hominin lineages.

"Our findings emphasize the importance of postcranial remains in reconstructing the evolutionary history of early hominins," the researchers explain. "The more we learn about their anatomy, the better we can understand how different species moved, survived, and ultimately gave rise to our own lineage."

As paleoanthropologists continue to uncover fossils in the Cradle of Humankind, specimens like DNH 43 will remain crucial in refining our understanding of early human evolution. This pelvis may not answer all questions, but it brings us one step closer to piecing together the intricate puzzle of hominin adaptation.

[1] Berg, E., Hammond, A. S., Warrener, A. G., Shirley Mitchell, M., Tocheri, M. W., Baker, S. E., Herries, A. I. R., Strait, D. S., & Orr, C. M. (2025). Further assessment of a ~2-million-year-old hominin pelvis (DNH 43) from Drimolen Main Quarry, South Africa. South African Journal of Science, 121(3/4). https://doi.org/10.17159/sajs.2025/17908
The narrative of human technological advancement has long positioned metallurgy as a hallmark of settled agricultural societies. However, recent findings from the Gre Fılla site in southeastern Turkey suggest that the roots of metalworking may extend deeper into our hunter-gatherer past than previously understood.

[Figure: (a) Location of early metallurgical activities in Anatolia and the Gre Fılla archaeological site. (b) The context where the vitrified material (GRE-VRF) was found. (c-d) Structure K45 and the in-situ mortar and furnace installation. Credit: Gre Fılla Excavation / Özlem Ekinbaş Can]

The Gre Fılla Site: A Window into Prehistoric Innovation

Nestled in the upper Tigris Valley, Gre Fılla has been under excavation since 2018. The site, spanning from the Pre-Pottery Neolithic (PPN) to the Pottery Neolithic periods, offers a rare glimpse into the lives of Anatolia's last hunter-gatherers. Among the architectural remnants and everyday artifacts, researchers have uncovered compelling evidence of early copper use and production.

Evidence of Early Metallurgical Practices

In the layers corresponding to the Pre-Pottery Neolithic B (PPNB) period, several notable discoveries have been made:

- Copper artifacts: Items such as a bar-shaped copper object and various unworked lumps indicate familiarity with the metal.
- Vitrified materials: Fragments exhibiting signs of exposure to high temperatures, some containing embedded copper droplets, suggest experimental smelting or annealing processes.

These findings challenge the traditional timeline, which places the advent of copper metallurgy in the Chalcolithic period, around 4000 BCE. The evidence from Gre Fılla pushes this date back by several millennia, indicating that hunter-gatherer communities were engaging with complex pyrotechnology as early as 7000 BCE.

[Figure: (a) The front and back sides of the vitrified material. (b) Composite tool, a chisel axe with a bone handle resembling lithic axes. (c) Chisel axe. (d) The cross-section of the copper object (GRE-C-002). Credit: Üftade Muşkara et al.]

Analyzing the Artifacts

To understand the nature of these early metallurgical activities, a suite of analytical techniques was employed:

- Portable X-ray fluorescence spectroscopy (pXRF): Used to determine the elemental composition of the artifacts.
- Flame atomic absorption spectroscopy (FAAS): Applied for precise quantification of metal concentrations.
- X-ray diffraction (XRD): Utilized to identify the mineralogical phases present in the vitrified materials.

One particularly intriguing artifact, a copper bar-shaped object, underwent lead isotope analysis. The results indicated that the metal's source was not the nearby Ergani mines but rather distant regions near the Black Sea, such as Trabzon or Artvin. This suggests the existence of extensive exchange networks during this period, highlighting a level of socio-economic complexity not typically attributed to hunter-gatherer groups.

Implications for the History of Metallurgy

The discoveries at Gre Fılla necessitate a reevaluation of the origins of metallurgy. The transition from the Neolithic to the Chalcolithic may have been more gradual and regionally variable than previously thought, involving phases of experimentation and innovation within hunter-gatherer communities.

Furthermore, these findings underscore the importance of localized developments in technological advancement. Rather than a singular origin point, metallurgy may have emerged independently in various regions, influenced by local resources, knowledge systems, and cultural practices.
Conclusion

The evidence from Gre Fılla paints a more nuanced picture of our prehistoric ancestors, showcasing their ingenuity and adaptability. As excavations continue and analytical techniques advance, it is likely that our understanding of early technological innovations will continue to evolve, shedding new light on the complex tapestry of human history.

Related Research

For those interested in further exploring early metallurgical practices and their implications, the following studies provide valuable insights:

- "Elemental Analysis of Pre-Pottery Neolithic B Copper Finds from Gre Fılla": This study offers a detailed chemical analysis of copper artifacts from Gre Fılla, providing insights into early metallurgical techniques.
- "Early Copper Production by the Last Hunter-Gatherers": This research presents evidence of advanced copper production techniques during the PPNB period at Gre Fılla, contributing to our understanding of early human engagement with metallurgy.
Neanderthals are often recognized for their distinct facial features—large, forward-projecting midfaces, prominent brow ridges, and wide nasal openings. In contrast, modern humans have relatively smaller, flatter faces with retracted midfaces and more delicate bone structures. For decades, researchers have debated the evolutionary forces behind these differences. Was Neanderthal facial anatomy an adaptation to cold climates? A byproduct of powerful biting forces? Or was it simply the result of genetic drift?

A recent study published in the Journal of Human Evolution [1] examines these questions from a new angle—by tracking how facial growth unfolds from infancy to adulthood in Neanderthals, modern humans (Homo sapiens), and chimpanzees (Pan troglodytes verus). Using a combination of 3D geometric morphometrics and microscopic bone analysis, the study explores how these species differ not just in their final adult form, but in the developmental pathways that get them there.

How Faces Grow: A Comparative Approach

At birth, Neanderthals already have larger midfaces than modern humans. That gap widens over time, but not in the way many might expect. Instead of growing at a steady rate like humans, Neanderthal midfaces expand dramatically during childhood and adolescence. By contrast, modern human faces reach their adult size much earlier—often by adolescence—before slowing or stopping entirely.

To track these changes, researchers compared skulls from 128 modern humans, 13 Neanderthals, and 33 chimpanzees. Using 3D surface scans and microscopic bone modeling, they analyzed how each species' maxilla—the central bone of the midface—changed shape and size throughout life.

[Figure: Density plots showing the distributions of the distances between the mean of non-simulated adults (AG 5) and both non-simulated and simulated adults. Top: distances between the non-simulated adult chimpanzees and their mean (gold); blue bars show the distances for adult chimpanzees simulated along the present-day human trajectory, red bars for those simulated along the Neanderthal trajectory. Middle: Procrustes distances between the non-simulated adult Neanderthals and their mean (red); blue bars show the distances for adult Neanderthals simulated along the present-day human trajectory, gold bars for those simulated along the chimpanzee trajectory. Bottom: distances between the non-simulated adult present-day humans and their mean (blue); the gold distribution shows the distances for present-day humans simulated along the chimpanzee trajectory, the red distribution for those simulated along the Neanderthal trajectory. Both chimpanzees and Neanderthals, when simulated along the present-day human trajectory, plot outside the distribution of the non-simulated adults. AG = age group.]

The results show that Neanderthals exhibit prolonged growth compared to modern humans, which contributes to their robust, projecting facial structure. Modern humans, on the other hand, experience earlier cessation of facial growth, contributing to their smaller, flatter midfaces.
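The density plots described in that figure summarize Procrustes distances between individual landmark configurations and a group mean shape. As a rough illustration of the distance computation alone (using made-up landmark coordinates, not the study's scan data), a minimal Python sketch might look like this:

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)

# Hypothetical landmark data: 20 individuals x 10 maxillary landmarks in 3D.
# Real studies use far denser (semi)landmark sets captured from surface scans.
individuals = rng.normal(size=(20, 10, 3)) + np.linspace(0, 1, 10)[None, :, None]

# Stand-in for the group mean shape (a full analysis would derive this from
# a generalized Procrustes superimposition of all configurations).
mean_shape = individuals.mean(axis=0)

# Procrustes distance of each individual to the group mean: scipy's
# `procrustes` translates, scales, and rotates the two configurations
# onto each other and returns the residual disparity.
distances = [procrustes(mean_shape, ind)[2] for ind in individuals]

print(f"mean disparity: {np.mean(distances):.4f}")
```

The actual study computes such distances for adults simulated along each species' growth trajectory versus real adults; the sketch shows only the core distance logic that those comparisons rest on.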
Chimpanzees, though distinct from both groups, exhibit a growth trajectory more similar to Neanderthals, further supporting the idea that modern human facial development is an evolutionary outlier.

"The midfaces of Neanderthals are on average already larger at birth than those of modern humans and continue growing for a longer period, contributing to their distinctive facial projection," explains lead author Alexandra Schuh of the Max Planck Institute for Evolutionary Anthropology.

Bone Remodeling and Facial Shape

What causes these growth differences? At a microscopic level, bones grow and change shape through a process called remodeling, which involves both bone formation and resorption (the breakdown of bone tissue). The study found that modern humans have higher levels of bone resorption in the midface than Neanderthals, leading to a retracted, flatter facial structure. Neanderthals, by contrast, show greater bone formation in the nasal and infraorbital regions, which helps maintain their large, projecting midfaces throughout development.

In other words, the way bone is added or removed during growth plays a major role in determining the final shape of the face. "Our findings highlight how the balance between bone formation and resorption shapes the distinct facial morphologies of Neanderthals and modern humans," says co-author Philipp Gunz.

Interestingly, these patterns differ from those seen in chimpanzees. While Neanderthals and modern humans share some similarities in their facial remodeling processes, chimpanzees exhibit distinct bone growth patterns, particularly in the development of their prominent canine region. This suggests that the unique growth trajectory of modern humans is not just an extension of our evolutionary past but a departure from it.

Why Do These Differences Matter?

The distinct facial growth patterns observed in modern humans may be linked to broader evolutionary changes, including shifts in diet, social structure, and even cognition. Some researchers argue that the reduction in facial size and projection in Homo sapiens is connected to changes in dietary habits—particularly the advent of cooking and tool use, which reduced the need for powerful biting and chewing. Others propose that these facial changes are part of a broader trend toward "self-domestication" in humans—a process in which social selection favors individuals with more juvenile, less aggressive facial traits.

"Facial gracilization in modern humans may be linked to behavioral changes, such as increased social cooperation and reduced aggression," says co-author Sarah Freidline.

Additionally, differences in facial growth could have played a role in speech and communication. Some researchers suggest that the retraction of the modern human midface may have altered vocal tract anatomy, contributing to our ability to produce a wider range of speech sounds.

Neanderthals: A Different Developmental Path

While Neanderthals and modern humans share a common ancestor, their distinct growth trajectories suggest that they evolved along different developmental paths. The prolonged facial growth in Neanderthals may have been an adaptation to high-energy lifestyles, cold climates, or unique social behaviors. By contrast, the early cessation of facial growth in modern humans may have set the stage for the unique physical and cognitive traits that define our species today.

Final Thoughts

This study sheds light on the fundamental differences in how Neanderthal and modern human faces grow and develop.
Rather than focusing solely on their adult differences, it highlights the underlying developmental processes that shaped their evolutionary paths. Understanding these growth patterns not only helps explain why Neanderthals looked the way they did but also offers insight into the evolutionary forces that shaped our own species. And while Neanderthals may be gone, the echoes of their biology live on in the DNA of modern humans—reminding us that our evolutionary story is still being written.

Related Research

- Neanderthal facial growth and adaptation: Bastir, M., & Rosas, A. (2016). Craniofacial levels and the morphological maturation of the human skull. Journal of Anatomy, 228(5), 784-797. https://doi.org/10.1111/joa.12455
- The role of diet in human facial evolution: Lieberman, D. E. (2008). Speculations about the selective basis for modern human cranial form. Evolutionary Anthropology, 17(1), 55-68. https://doi.org/10.1002/evan.20169
- Self-domestication and facial gracilization: Cieri, R. L., et al. (2014). Craniofacial feminization in modern humans: Self-domestication or sexual selection? Current Anthropology, 55(4), 419-443. https://doi.org/10.1086/677209
- The impact of growth on craniofacial form: Ponce de León, M. S., & Zollikofer, C. P. (2001). Neanderthal cranial ontogeny and its implications for late hominid diversity. Nature, 412(6846), 534-538. https://doi.org/10.1038/35087578

[1] Schuh, A., Gunz, P., Villa, C., Maureille, B., Toussaint, M., Abrams, G., Hublin, J.-J., & Freidline, S. E. (2025). Human midfacial growth pattern differs from that of Neanderthals and chimpanzees. Journal of Human Evolution, 202, 103667. https://doi.org/10.1016/j.jhevol.2025.103667
The early human settlement of South America stands as one of the last great migrations in human history, yet the environmental conditions that shaped this journey remain debated. New research by Lorena Becerra-Valdivia, published in Nature Communications [1], suggests that humans did not simply follow stable climates but adapted to fluctuating conditions, sometimes settling in areas experiencing severe cold. Using Bayesian chronological modeling and data from over 150 archaeological sites, the study examines how two major climatic events—the Antarctic Cold Reversal (ACR) and the Younger Dryas (YD)—influenced early human dispersal across the continent.

[Figure: Spatial distribution of archaeological sites included in the Bayesian chronological modelling, coded by geographic province (Amazonian Lowlands, Brazilian Highlands, Central Andes, Northern Andes, Orinoco Lowlands, Paraguay-Paraná Lowlands, Patagonian Plateau, Southern Andes), lithic technology (Abriense, El Jobo, Fishtail, Huentelauquén, Itaparica, Paiján, projectile point, Pay Paso, Tigre, Tequendamiense, uniface/biface), altitude (above or below 2,500 masl), and evidence for megafauna killing/scavenging by humans. Although a single lithic tradition/category is assigned to each site, some contain more than one (e.g., Pay Paso 1, Uruguay; see Supplementary Note 2), so the map is illustrative; the modelling work takes different cultural components and specific lithic traditions/categories into account. Source data in Supplementary Data 1.]

The Antarctic Cold Reversal and the First Settlements

When early humans arrived in South America, they encountered a continent undergoing dramatic climate shifts. Around 14,700 to 13,000 years ago, the Southern Hemisphere experienced the Antarctic Cold Reversal, a period of cooling that reversed the warming trend seen in the Northern Hemisphere. Instead of deterring settlement, this cold phase appears to coincide with some of the earliest human activity in the region.

According to Becerra-Valdivia's study, sites in the southern Andes and Patagonia show evidence of human occupation before or during this cold phase. This suggests that early settlers may have developed cultural adaptations that allowed them to survive in colder conditions.

"The cultural timeline does not indicate that cold conditions were a barrier to human expansion," the study notes. "Rather, the earliest known human activity aligns with regions most affected by cooling, particularly in high-altitude and southernmost areas."

The findings challenge earlier assumptions that humans would have avoided these harsh environments. Instead, their ability to endure may have been aided by cumulative knowledge, allowing them to exploit diverse ecological niches despite temperature fluctuations.

The Younger Dryas and Expansion Across the Continent

Following the Antarctic Cold Reversal, the Younger Dryas—a sharp cooling event that lasted from approximately 12,900 to 11,700 years ago—brought further environmental instability. This event, however, appears to have played a different role in human settlement.
The study suggests that widespread occupation of South America became more pronounced after the Younger Dryas, as conditions stabilized.

[Figure: Start estimates for ACR/YD-aged cultural components, arranged by (a) province, (b) lithic tradition/category, (c) high-altitude (≥2,500 m.a.s.l.) occupation, and (d) evidence of megafauna killing/scavenging. The sample size next to each distribution gives the number of cultural components included in each model. Categories marked with an asterisk (*) represent provinces and lithic technologies with limited cultural evidence, both in sample size and scholarly agreement, that predate the ACR and do not extend temporally into this period. Brackets beneath each distribution denote 68.2% and 95.4% CI; the timing of the ACR and YD is shown as dark and light grey bands. Source data in the Source Data file; OxCal code in Supplementary Note 3.]

A key pattern observed in the data is a west-to-east dispersal trend. Early occupation appears concentrated in the western Andes, with later settlements spreading toward the lowlands and eastern regions.

"The Andes and the Pacific Coast likely served as primary routes for early human dispersal," Becerra-Valdivia explains. "Evidence suggests a later expansion toward interior regions such as the Paraguay-Paraná Lowlands."

This supports genetic studies indicating that the Andes played a crucial role in early South American population movements. While lowland rainforests and humid environments likely presented challenges for early settlers, the western highlands may have offered a more viable pathway for migration.

Rethinking Climate and Human Adaptation

The study also touches on a long-standing debate: were humans responsible for the extinction of South America's megafauna, or did climate play the dominant role? The archaeological evidence does not establish a clear causal link between human activity and megafaunal decline. Instead, the data suggest that hunting and scavenging of large animals, such as giant sloths and mastodons, began well before their extinction.

"The timing of megafaunal exploitation by humans does not directly align with their extinction patterns," the study states. "While human hunting may have contributed to population declines, the role of climate shifts cannot be overlooked."

This contrasts with theories that place humans as the primary drivers of megafaunal extinctions. Instead, the study emphasizes the need for more refined models that integrate archaeological, genetic, and environmental data.

Gaps in the Archaeological Record

Despite these insights, the research also highlights major gaps in the archaeological and chronometric record. Some regions, such as the Amazon and Orinoco Lowlands, are significantly underrepresented in radiocarbon dating efforts. The study calls for more systematic dating of stratified sites and improved documentation of cultural sequences.

"A more comprehensive dataset is needed to refine the cultural timeline," the study suggests. "Current gaps limit our ability to fully understand the pace and nature of early human settlement in South America."

Addressing these issues could provide a clearer picture of how humans adapted to diverse environments, from glaciated mountain ranges to humid tropical forests.
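The start estimates above come from Bayesian chronological models built in OxCal. The sketch below is a deliberately simplified, grid-based version of the same idea, a uniform-phase model with unknown start and end boundaries, using invented ages and plain normal measurement errors in place of full radiocarbon calibration:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical "calibrated" ages (years BP) with 1-sigma errors for one
# region; a real model would work with full calibration curves instead.
ages = np.array([14600.0, 14200.0, 13900.0, 13500.0, 13100.0])
sigmas = np.array([200.0, 150.0, 180.0, 160.0, 170.0])

# Uniform-phase model: each dated event is uniform between the end and
# start boundaries (BP ages run backwards), and each measurement is normal
# around its event. Integrating out the event time gives, per date:
#   P(y | start, end) = (Phi((start-y)/s) - Phi((end-y)/s)) / (start - end)
grid = np.arange(12500.0, 16000.0, 10.0)
start, end = np.meshgrid(grid, grid, indexing="ij")
valid = start > end
span = np.where(valid, start - end, 1.0)

loglik = np.zeros(start.shape)
for y, s in zip(ages, sigmas):
    p = (norm.cdf((start - y) / s) - norm.cdf((end - y) / s)) / span
    loglik += np.log(np.clip(p, 1e-300, None))
log_post = np.where(valid, loglik, -np.inf)

# Marginal posterior for the start boundary (earliest occupation).
post = np.exp(log_post - log_post[valid].max())
start_marginal = post.sum(axis=1)
start_marginal /= start_marginal.sum()
print(f"posterior mean start: {(grid * start_marginal).sum():.0f} BP")
```

OxCal adds calibration curves, outlier handling, and MCMC sampling on top of this; the sketch only conveys how a "start boundary" posterior arises from a set of dated events within a phase.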
Conclusion

Rather than being driven solely by climate, early human settlement in South America appears to have been shaped by a combination of environmental shifts and cultural adaptation. The ability to inhabit regions experiencing severe cold suggests a level of resilience that may have been underestimated. As more data emerge, the story of the first South Americans continues to evolve, offering new perspectives on how humans navigated one of the most complex migrations in history.

Related Research and Citations

If you're interested in further exploring the intersection of climate and early human migration, the following studies may be of interest:

- Prates, L., & Pérez, S. I. (2021). Late Pleistocene South American megafaunal extinctions associated with rise of Fishtail points and human population. Nature Communications, 12, 2175. https://doi.org/10.1038/s41467-021-22454-2
- Waters, M. R., & Stafford, T. W. (2007). Redefining the age of Clovis: implications for the peopling of the Americas. Science, 315(5815), 1122-1126. https://doi.org/10.1126/science.1137166
- Moreno-Mayar, J. V., et al. (2018). Early human dispersals within the Americas. Science, 362(6419), eaav2621. https://doi.org/10.1126/science.aav2621

These studies provide additional insights into the timing of human migration into the Americas and the ecological challenges early settlers faced.

[1] Becerra-Valdivia, L. (2025). Climate influence on the early human occupation of South America during the late Pleistocene. Nature Communications, 16(1), 2780. https://doi.org/10.1038/s41467-025-58134-5
Archaeology often deals with what remains—the bones, the stone tools, the charred remnants of ancient hearths. But in the upland regions of Warner Valley, Oregon, a different kind of evidence is telling the story of early human diets: microscopic starch granules trapped in the cracks of bedrock metates. These stone grinding surfaces, found alongside rock art panels and other cultural features, are yielding the first direct evidence of plant processing in this landscape.

[Figure: Anthropologist Lisbeth Louderback extracting plant residues from a metate at an archaeological site on public land in south-central Oregon. Credit: Stefania Wilks, University of Utah]

A study published in American Antiquity [1] analyzed starch residues from 58 bedrock metates and identified granules from biscuitroot (Lomatium spp.), a geophyte that has been an important food source for Indigenous communities in the region for thousands of years.

A Closer Look at the Bedrock Metates

Unlike portable grinding stones, bedrock metates are stationary features—shallow depressions worn into rock surfaces through repeated use. While their presence at archaeological sites suggests food processing activities, confirming what they were used for has been difficult. The study authors, led by archaeobotanist Stefania Wilks, hypothesized that plant starches might have been preserved deep in the stone's crevices, protected from erosion and weathering.

[Figure: Ancient Native Americans used depressions in rock, called metates, like this one in Oregon's Warner Valley, to grind food. Credit: Stefania Wilks, University of Utah]

To test this, researchers sampled metates from three rock art sites in Warner Valley: Corral Lake, Barry Spring, and Long Lake. They used a two-step extraction process—first scrubbing the surface of the metates and then applying a chemical deflocculant to dislodge starch granules trapped in microscopic crevices.

"It was critical to distinguish between surface contamination and deeply embedded residues," Wilks explained. "By comparing the results, we could confirm that these starches were not recent additions but rather remnants of ancient plant processing."

Biscuitroot and the Role of Geophytes in Early Diets

The analysis identified starch granules from a range of plants, but Lomatium spp. stood out. The plant, commonly known as biscuitroot, is a member of the carrot family (Apiaceae), and its starchy taproots have long been valued for their nutritional properties.

"Geophytes like biscuitroot are an incredibly resilient food source," Wilks said. "They store carbohydrates underground, making them available even in dry or cold seasons. This would have been crucial for past hunter-gatherer groups in the northern Great Basin."

Ethnographic accounts describe Indigenous groups harvesting Lomatium with digging sticks in the spring, pounding the roots into cakes for long-term storage. The starch granules extracted from the metates closely matched those of modern Lomatium, providing direct evidence that these plants were ground and processed on-site.

Ancient Kitchens in the Landscape

The study also suggests that these rock art sites were not just places of symbolic expression but were directly linked to subsistence activities. The presence of grinding surfaces near large stands of edible plants supports the idea that people returned to these locations seasonally, taking advantage of the natural abundance.

Interestingly, the researchers found evidence that some metates had been reused over long periods.
Patina analysis showed that some surfaces bore signs of older grinding activity, with newer use-wear overlaid on top. "This pattern indicates that these places held significance over generations," Wilks noted. "People were returning to the same spots, likely because of their proximity to reliable food sources."

Revising Assumptions About Early Foraging Strategies

The discovery challenges long-standing assumptions that upland environments were primarily used for hunting rather than plant processing. Archaeologists have long associated high-elevation sites with seasonal game tracking, but the presence of grinding tools and starch granules suggests a more complex picture.

"As archaeologists, we've tended to focus on hunting as the primary activity at these sites," Wilks said. "But this evidence forces us to think more broadly about what people were doing in these landscapes. They weren't just chasing game; they were processing plant foods in ways that left lasting traces in the stone."

The findings also add to a growing body of research demonstrating that geophytes were a staple in early North American diets. Previous studies have identified geophyte starch in sites dating back to the Late Pleistocene and Early Holocene, reinforcing the idea that plant foods played a central role in subsistence strategies.

A New Tool for Studying Ancient Diets

The success of starch analysis in this study highlights the potential of the method for investigating ancient diets in other open-air settings. While charred seeds and pollen can sometimes provide clues about plant use, starch granules offer a more direct link to food processing.

"This approach is especially valuable in places where organic remains don't typically preserve well," Wilks said. "By looking at the microscopic residues left behind, we can reconstruct aspects of daily life that would otherwise be invisible in the archaeological record."

The research team hopes to expand their work to other regions, testing whether similar grinding features elsewhere in the Great Basin contain preserved plant residues. "What we're seeing here may be part of a much broader pattern," Wilks said. "These upland sites may have been essential food processing locations for thousands of years, and we're just beginning to understand their role in the larger subsistence landscape."

Related Research

For those interested in further reading on starch analysis and geophyte use in early diets, here are some key studies:

- Louderback, L. A., & Pavlik, B. M. (2017). Starch granule evidence for the earliest potato use in North America. PNAS, 114(29), 7606-7610. https://doi.org/10.1073/pnas.1707710114
- Herzog, N. M., & Lawlor, A. T. (2016). Reevaluating diet and technology in the Archaic Great Basin using starch grain assemblages from Hogup Cave, Utah. American Antiquity, 81(4), 664-681. https://doi.org/10.1017/S0002731600101027
- Henry, A. G. (2020). Starch granules as markers of diet and behavior. In Handbook for the Analysis of Micro-Particles in Archaeological Samples. Springer. https://doi.org/10.1007/978-3-030-51103-2_5

This study provides a powerful reminder that even the smallest traces—microscopic starch granules—can reshape our understanding of human history.

[1] Wilks, S. L., Louderback, L. A., Simper, H. M., & Cannon, W. J. (2025). Starch granule evidence for biscuitroot (Lomatium spp.) processing at upland rock art sites in Warner Valley, Oregon. American Antiquity, 1-17. https://doi.org/10.1017/aaq.2024.42
For decades, archaeologists have debated the nuances of Levallois technology—a stone tool production method used by Homo sapiens, Neanderthals, and other ancient hominins. These tools, characterized by a prepared-core technique that allowed for precise flake removal, have long been studied using traditional measurements. But a new study introduces a more sophisticated approach—three-dimensional geometric morphometrics (3D GM)—to examine the shape and variability of Levallois cores, particularly those associated with the Nubian Levallois method.

[Figure: Landmark configurations for cores and end-products during processing steps. (A) Core outline landmarks after manual placement; (B) preferential scar landmarks after manual placement; (C) core outline semilandmarks after resampling and sliding; (D) preferential scar outline semilandmarks after resampling and sliding; (E) upper and lower surface patch placement on a template core; (F) platform, dorsal, and ventral surface patch placement on a template product; (G) platform and outline landmarks after manual placement (different colours reflect curves); (H) product outline semilandmarks after resampling and sliding; (I) extracted ventral outline after resampling and sliding.]

This research, led by Emily Hallinan and João Cascalheira and published in Archaeological and Anthropological Sciences, applies 3D scanning and advanced statistical modeling to Levallois cores from the Nile Valley and Dhofar, Oman. The results challenge long-held assumptions about how early humans controlled tool shape and suggest that the differences in Levallois core designs may be more influenced by cultural traditions than previously thought.

Why Levallois Technology Matters

Levallois technology represents a milestone in human cognitive and technological evolution. Unlike simpler stone tool techniques, Levallois required a high degree of planning—preparing a core in a way that controlled the shape and size of the resulting flake. This method provided a reliable way to produce standardized tools and reflects a level of foresight that archaeologists associate with complex cognition.

But despite its importance, Levallois technology has remained difficult to define with precision. Early typological classifications focused on the shape of the flake removed from the core, while later technological perspectives emphasized the step-by-step process of preparing the core itself. This new study offers a different lens: analyzing the entire three-dimensional structure of the core to assess how shape is controlled across different regions and traditions.

A New Approach: 3D Shape Analysis

Hallinan and Cascalheira applied 3D GM to capture the full complexity of Nubian Levallois cores—tools that were long thought to have been shaped using a distinct method primarily in Northeast Africa. This technique allowed them to measure not just the overall shape but the subtle variations in surface convexity, elongation, and ridge structure that traditional methods might miss.

By analyzing cores from Egypt's Nazlet Khater region and Dhofar, Oman, the researchers tested several hypotheses about Levallois toolmaking (a toy illustration of the semilandmark resampling step follows this list):

- Was core shape independent of size, as suggested by the principle of "autocorrelation," meaning the shape remained consistent despite reduction in size through use?
- Did different preparation strategies—such as distal versus lateral flake removals—result in distinct core shapes?
- Did regional variation in Levallois core shape reflect broader cultural or environmental influences?
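As the figure caption above indicates, outline semilandmarks are placed "after resampling and sliding": resampling converts a hand-digitized outline with arbitrarily spaced points into a fixed number of equally spaced points so that specimens become comparable, and sliding then adjusts those points to minimize shape differences. A minimal 2D sketch of the resampling step alone (the study works with 3D outlines and surface patches; the outline here is invented) could be:

```python
import numpy as np

def resample_outline(points: np.ndarray, n: int) -> np.ndarray:
    """Resample a digitized outline to n points equally spaced along its
    arc length, so every specimen carries comparable semilandmarks."""
    # Cumulative arc length along the outline.
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg)])
    # Target stations: n equally spaced positions along the total length.
    targets = np.linspace(0.0, arc[-1], n)
    # Linear interpolation of each coordinate against arc length.
    return np.column_stack([
        np.interp(targets, arc, points[:, dim])
        for dim in range(points.shape[1])
    ])

# A hypothetical, unevenly digitized core outline (closed by repeating the
# first point), resampled to 50 equidistant semilandmarks.
theta = np.sort(np.random.default_rng(1).uniform(0, 2 * np.pi, 120))
outline = np.column_stack([np.cos(theta), 0.6 * np.sin(theta)])
outline = np.vstack([outline, outline[:1]])
semilandmarks = resample_outline(outline, 50)
print(semilandmarks.shape)  # (50, 2)
```

The same arc-length logic extends directly to 3D curves; the sliding step and the surface patches in the actual study require dedicated geometric morphometrics tooling beyond this sketch.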
Findings: A Mix of Tradition and Adaptation

The study revealed that while core shape was largely maintained throughout reduction, there were significant regional differences. The cores from Dhofar were more elongated and had steeper distal ridges, while those from Nazlet Khater were flatter with a more convex lower surface. This suggests that while all these cores followed the same general technological principles, regional toolmakers may have had different priorities or traditions guiding their work.

"The fact that we see such distinct differences between regions suggests that shape was not purely a functional byproduct of Levallois reduction," the authors note. "Rather, there may have been cultural traditions at play in how these tools were made."

Implications for Human Evolution

One of the most intriguing findings was the high degree of standardization in Nubian Levallois core shape. While past studies have argued that Nubian cores were more uniform than other Levallois methods, this study quantified that standardization more rigorously.

"We tend to think about stone tools in terms of their function, but this level of consistency suggests an element of learned tradition," the researchers explain. "That raises new questions about how these techniques were transmitted across generations and how they may have influenced early human migration and adaptation."

These insights add to a growing body of evidence that early humans were not just adapting to their environments but were also shaping their technological world in ways that reflect social learning and cultural identity.

Future Research: Expanding the Scope

This study is one of the first to apply 3D GM techniques to Levallois cores, but its authors see potential for much broader applications. Future research could compare Levallois core shapes across different regions and time periods to map technological evolution more precisely. Additionally, applying these methods to other stone tool traditions could help clarify long-standing debates about how early humans developed their toolmaking skills.

By using cutting-edge digital tools to analyze some of the oldest human technologies, researchers are shedding new light on the cognitive and cultural abilities of our ancestors—showing that even a seemingly simple stone flake holds clues to the minds that made it.

Additional Related Research

- Lycett, S. J., & von Cramon-Taubadel, N. (2013). A quantitative 3D geometric morphometric analysis of Levallois core shape. Journal of Human Evolution, 64(6), 487-498. https://doi.org/10.1016/j.jhevol.2013.03.009
- Eren, M. I., Lycett, S. J., & Roos, C. I. (2016). The efficiency of stone tool production: Quantifying the effects of Levallois preparation. PLOS ONE, 11(6), e0158002. https://doi.org/10.1371/journal.pone.0158002
- Blinkhorn, J., & Grove, M. (2021). Cultural transmission and Levallois variability: A model-based assessment. Evolutionary Anthropology, 30(4), 208-218. https://doi.org/10.1002/evan.21855
- Samawi, O., & Hallinan, E. (2024). Levallois shape variability in Nubian assemblages: A comparative study of the Nile Valley and Arabia. Quaternary Science Reviews, 319, 108195. https://doi.org/10.1016/j.quascirev.2024.108195

This research marks an important shift in how archaeologists study ancient technology.
By moving beyond traditional typologies and embracing 3D analytical techniques, it becomes possible to see ancient toolmakers not just as survival-driven engineers, but as participants in rich technological traditions passed down across generations.

Hallinan, E., & Cascalheira, J. (2025). Quantifying Levallois: a 3D geometric morphometric approach to Nubian technology. Archaeological and Anthropological Sciences, 17(4). https://doi.org/10.1007/s12520-025-02199-2
For decades, the story of modern human origins seemed relatively straightforward: Homo sapiens emerged in Africa roughly 300,000 years ago, evolving as a single, continuous lineage before expanding across the globe. But new research suggests that this narrative is missing an entire chapter. Modern humans descended from not one, but at least two ancestral populations that drifted apart and later reconnected, long before modern humans spread across the globe.

A study published in Nature Genetics [1] by Trevor Cousins, Aylwyn Scally, and Richard Durbin of the University of Cambridge proposes that modern humans did not descend from a single population, but rather from two deeply divergent ancestral groups. These two populations split approximately 1.5 million years ago and remained separate for over a million years before mixing again around 300,000 years ago. One of these groups contributed about 80% of modern human DNA, while the other—now vanished—left a genetic imprint making up the remaining 20%.

"Rather than a single lineage evolving smoothly over time, the evidence suggests a history of separation and recombination," says Cousins. "These groups were apart for a million years—longer than modern humans have been on the planet."

This discovery complicates the long-held assumption that our species emerged from a single, unbroken line of ancestors. It also raises new questions about which fossil hominins—such as Homo erectus and Homo heidelbergensis—might represent these lost populations.

A Hidden Population, a Vanished Legacy

What makes this finding particularly striking is that this ancient genetic mixing event is not just a curiosity of the distant past. It is present in every modern human population, across Africa, Europe, Asia, and the Americas. Unlike the more familiar interbreeding episodes with Neanderthals and Denisovans—events that contributed only about 2% of the DNA in non-African populations—this deeper ancestral mixture accounts for ten times that amount. And yet, until now, it had remained invisible in our understanding of human evolution.

The researchers made this discovery not by analyzing ancient bones but by studying the DNA of living people. Using data from the 1000 Genomes Project, they applied a computational model called cobraa (Coalescence-Based Reconstruction of Ancestral Admixture), which allowed them to detect subtle genetic signals left by ancient populations. This approach circumvents the need for physical fossils, offering a way to reconstruct population history even when no bones or artifacts remain.

A Genetic Bottleneck and the Fate of Our Ancestors

One of the study's most intriguing findings is that, immediately after the initial split 1.5 million years ago, one of the two populations went through a severe genetic bottleneck. This group shrank to a tiny size before slowly recovering. Over time, this population eventually gave rise to the majority of Homo sapiens ancestry, as well as to Neanderthals and Denisovans.

"This population bottleneck could be the result of an ecological crisis, a migration event, or simply chance," says Scally. "What's remarkable is that despite its small size, this group ultimately shaped most of our genetic heritage."

The other group—whose genes now make up the remaining 20% of modern human DNA—appears to have remained larger but was eventually absorbed into the expanding Homo sapiens population.
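The demographic history proposed by the study can be written down and simulated with standard coalescent tools. The sketch below uses the msprime simulator, not the authors' cobraa model, to encode the headline numbers: a split roughly 1.5 million years ago and an 80/20 remixing around 300,000 years ago. The population sizes and generation time are placeholder assumptions for illustration, not estimates from the paper.

```python
import msprime

GEN_TIME = 29  # years per generation (placeholder assumption)

demography = msprime.Demography()
demography.add_population(name="modern", initial_size=10_000)  # post-merger lineage
demography.add_population(name="major", initial_size=10_000)   # ~80% contributor
demography.add_population(name="minor", initial_size=20_000)   # ~20% contributor
demography.add_population(name="ancestral", initial_size=10_000)

# ~300 kya: the two long-separated groups remix to found the modern lineage.
demography.add_admixture(
    time=300_000 / GEN_TIME,
    derived="modern",
    ancestral=["major", "minor"],
    proportions=[0.8, 0.2],
)
# ~1.5 Mya: going further back, both groups trace to one ancestral population.
demography.add_population_split(
    time=1_500_000 / GEN_TIME,
    derived=["major", "minor"],
    ancestral="ancestral",
)

# Simulate genealogies for five present-day diploid samples over 1 Mb.
ts = msprime.sim_ancestry(
    samples={"modern": 5},
    demography=demography,
    sequence_length=1_000_000,
    recombination_rate=1e-8,
    random_seed=7,
)
print(ts.num_trees, "marginal genealogies simulated")
```

Inference in the study runs in the opposite direction: cobraa starts from observed genomes and asks which structured history best explains their patterns of coalescence, whereas a forward sketch like this simply generates data under a stated history.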
Interestingly, the researchers found that genetic material from this secondary group tended to be located away from functionally important regions of the genome, suggesting that some of its DNA may not have been fully compatible with the majority genetic background. This pattern hints at a process called purifying selection, in which harmful mutations are gradually removed from a population over time.

Who Were These Lost Ancestors?

This discovery invites a crucial question: If these two populations remained separate for over a million years, what did they look like? Were they physically distinct, akin to separate species? And do they correspond to any known hominin fossils?

Fossil evidence suggests that Homo erectus was widespread across Africa and Eurasia throughout this period. Other species, such as Homo heidelbergensis, also inhabited Africa and Europe. Either of these groups—or even an as-yet-undiscovered lineage—could be the long-lost ancestors detected in this study.

"These populations were likely distinct in ways we don't yet understand," says Durbin. "We may find fossils that match their genetic legacy, or we may already have them but lack the tools to identify their role in our ancestry."

If this ancient mixing event is confirmed through further research, it could reshape how we classify and think about human evolutionary history. Rather than a single lineage, our origins may have been more like a braided stream—separate currents that merged over time.

Rethinking Species Boundaries in Human Evolution

The idea that modern humans emerged from multiple ancestral populations aligns with a broader shift in how biologists think about evolution. Increasingly, researchers are finding that species do not always evolve in neat, separate branches but often exchange genes across long periods.

"What's becoming clear is that species don't evolve in isolation," Cousins explains. "Interbreeding and genetic exchange have likely played a role in the emergence of new species across the animal kingdom, not just in humans."

This study also suggests that similar events may have occurred in other species. The researchers applied their method to genetic data from bats, dolphins, chimpanzees, and gorillas, and found evidence of deep ancestral structure in some of these species as well.

What Comes Next?

With this new model of human origins, researchers now have a fresh set of questions to explore. What conditions allowed these populations to remain separate for so long? What finally brought them back together? Did their reunion contribute to the development of cognitive traits that define modern humans?

To refine these findings, scientists may turn to ancient DNA—if they can find it. Fossils from the crucial period around 300,000 years ago could hold traces of these lost ancestors, offering a direct genetic link to the populations detected in modern DNA.

"The fact that we can reconstruct events from hundreds of thousands or even millions of years ago, just by looking at DNA today, is astonishing," says Scally. "It tells us that our history is far richer and more complex than we ever imagined."

Conclusion

The story of human origins is still being written. As researchers develop new genetic tools and unearth new fossils, the narrative continues to shift. What this study makes clear is that Homo sapiens did not emerge from a single lineage but from the reunion of two deeply ancient populations—distant relatives who came together to form the species that would eventually inhabit every corner of the planet.
If this discovery holds, it could reshape how we define what it means to be human—not as the product of a linear march of progress, but as the outcome of ancient migrations, genetic exchanges, and lost populations that shaped our species in ways we are only beginning to understand.

Additional Related Research

- Skoglund, P., & Reich, D. (2020). Ancient DNA and the new science of the human past. Nature, 577, 645-656. https://doi.org/10.1038/s41586-019-1863-2
- Hublin, J.-J., et al. (2017). New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens. Nature, 546(7657), 289-292. https://doi.org/10.1038/nature22336. This study presents evidence from Moroccan fossils suggesting that H. sapiens had a more widespread African origin than previously believed.
- Ragsdale, A. P., et al. (2023). A weakly structured stem for human origins in Africa. Nature, 617(7962), 755-763. https://doi.org/10.1038/s41586-023-06184-w. Supports deep population structure within Africa, indicating H. sapiens arose from multiple interconnected groups rather than a single lineage.
- Reich, D., et al. (2010). Genetic history of an archaic hominin group from Denisova Cave in Siberia. Nature, 468(7327), 1053-1060. https://doi.org/10.1038/nature09710. First genomic analysis of Denisovans, showing they contributed DNA to modern human populations.
- Terhorst, J., et al. (2017). Robust and scalable inference of population history from hundreds of unphased whole genomes. Nature Genetics, 49(2), 303-309. https://doi.org/10.1038/ng.3748. Develops computational tools to analyze population splits and admixture in human evolution.

[1] Cousins, T., Scally, A., & Durbin, R. (2025). A structured coalescent model reveals deep ancestral structure shared by all modern humans. Nature Genetics. https://doi.org/10.1038/s41588-025-02117-1
For decades, archaeologists have puzzled over one of humanity's most crucial technological leaps—when and how early humans began making sharp stone tools. A new study proposes an unexpected answer: before hominins ever struck two rocks together, they may have been using naturally occurring sharp stones to butcher meat and process plants.

[Figure: Examples of naturally produced sharp-edged stone specimens (top row, left to right: specimens #1, #5, #4, #7), or 'cores' from which sharp-edged stone specimens likely manifested (bottom row, left to right: specimens #13, #20, #15, #24), from the Antarctic Peninsula. Details about these specimens are available in the supplementary online materials (Data S1); these and additional specimens can also be seen in figures S1-S28. Image by Michelle R. Bebber, Metin I. Eren, and Alastair Key. Credit: Archaeometry (2025). DOI: 10.1111/arcm.13075]

The research, published in Archaeometry [1], suggests that before the first intentional toolmakers, hominins may have relied on "naturaliths"—sharp rock fragments created by natural geological or biological processes. These early humans may have used these naturally occurring cutting tools long before they figured out how to produce them deliberately.

"The idea that early hominins just suddenly 'invented' knapping as a technological breakthrough doesn't quite fit," said Metin I. Eren, co-author of the study and an archaeologist at Kent State University. "There had to be an existing demand for sharp edges before anyone bothered to make them."

Nature as the First Toolmaker

Stone tool production, or knapping, has long been considered one of the defining skills that set early humans apart from other primates. But Eren and his colleagues suggest that for hundreds of thousands of years before hominins learned to manufacture stone flakes, they simply collected and used ones already lying around. These naturaliths may have been produced in a variety of ways—rockfalls, erosion, wave action, glacial activity, and even trampling by large animals like elephants.

"In some places, nature makes hundreds or even thousands of these naturally sharp stones," said co-author Michelle R. Bebber, an archaeologist at Kent State University. "Early hominins probably had access to an abundant supply of cutting tools without needing to make them."

[Figure: Examples of naturally produced sharp-edged basalt specimens (bottom row) found near Giant's Causeway, Northern Ireland. These specimens appear to have been produced via downward rolling processes as well as coastal action. Image by Michelle R. Bebber and Metin I. Eren. Credit: Archaeometry (2025). DOI: 10.1111/arcm.13075]

The researchers conducted fieldwork at multiple locations—including Kenya, South Africa, and Antarctica—to document how natural processes produce sharp-edged stones. They found that in many environments, these fragments were not rare at all. Some sites had thousands of sharp stone pieces scattered across the landscape, indistinguishable from early hominin-made tools.

The Missing Link Between Stone and Bone

The study argues that before the first knappers, hominins were already using sharp objects to process food. This likely started with bone flakes created during marrow extraction. Experiments with modern tools suggest that bone fragments can be used effectively to cut meat, but stone flakes are significantly sharper. If hominins were already using bone flakes, it's not hard to imagine them picking up a naturalith and recognizing its advantage.
“Once you start using sharp-edged materials, you eventually start looking for ways to get more of them,” Eren said. “If you’re in an area with no naturaliths, you have a problem. That’s where knapping becomes useful.”

The researchers suggest that naturaliths served as a bridge between using bone flakes and intentionally striking rocks together to make sharp stone tools. This gradual shift—from opportunistic use to controlled production—may have set the stage for the technological explosion of the Oldowan industry around 2.6 million years ago.

Not a “Eureka!” Moment, But a Slow Discovery

The traditional view of early toolmaking suggests that one particularly clever hominin, perhaps while cracking a nut or smashing a bone, accidentally broke a rock and discovered the sharp edges it produced. But the new hypothesis suggests that knapping wasn’t an accident—it was an imitation of nature.

“Rather than being a sudden breakthrough, early knapping was likely an attempt to copy something that already existed,” said Eren. “Hominins weren’t inventing something new; they were figuring out how to make more of what they already used.”

This changes how archaeologists might interpret the earliest evidence of stone tool use. If early hominins were using naturaliths, then traces of tool use—cut marks on bones, for example—might predate the oldest known knapped tools. Future research will need to look for signs that early hominins collected and carried naturaliths before they mastered making their own.

Implications for Early Human Evolution

If early humans relied on naturaliths before learning to knap, it raises new questions about how our ancestors thought about technology. The ability to recognize and select useful objects in the environment is a key cognitive step in human evolution.

The findings also suggest that archaeologists should reconsider the definition of tool use in the fossil record. If a sharp stone was used by a hominin but not modified, does it still count as a tool?

“If naturaliths were being used extensively, then the history of tool use is likely much older than we think,” Bebber said. “And that means the origins of human technology are more complex than a single moment of invention.”

The Next Steps

To test the hypothesis, archaeologists will need to analyze ancient sites for patterns suggesting naturalith use. This could involve examining sites older than 3.3 million years—the age of the earliest known stone tools—to look for cut marks on bones that may have been made with unmodified rocks. Experimental archaeology could also help determine whether naturally occurring flakes are sharp enough to leave distinct traces that can be identified in the fossil record.

For now, the study challenges long-standing assumptions about the origins of tool use. Rather than a single discovery that launched the Stone Age, early hominins may have been cutting with sharp rocks for millennia before they ever struck two stones together with purpose.

Related Research:

Harmand, S., Lewis, J. E., Feibel, C. S., et al. (2015). "3.3-million-year-old stone tools from Lomekwi 3, West Turkana, Kenya." Nature, 521(7552), 310–315. https://doi.org/10.1038/nature14464

Plummer, T. W., Finestone, E. M., et al. (2023). "Expanded geographic and taxonomic evidence for early butchery in the Oldowan." Science, 379(6628), 561–566. https://doi.org/10.1126/science.adf4234

Key, A. J. M., & Lycett, S. J. (2023). "The evolution of hominin tool use: A functional perspective on early stone tools." Quaternary Science Reviews, 314, 107963. https://doi.org/10.1016/j.quascirev.2023.107963

1 Eren, M. I., Lycett, S. J., Bebber, M. R., Key, A., Buchanan, B., Finestone, E., Benson, J., Gürbüz, R. B., Cebeiro, A., Garba, R., Grunow, A., Lovejoy, C. O., MacDonald, D., Maletic, E., Miller, G. L., Ortiz, J. D., Paige, J., Pargeter, J., Proffitt, T., … Walker, R. S. (2025). "What can lithics tell us about hominin technology’s ‘primordial soup’? An origin of stone knapping via the emulation of Mother Nature." Archaeometry. https://doi.org/10.1111/arcm.13075
The human skeleton has long been a resource for science, offering insights into disease, migration, and evolution. But behind every collection of bones stored in laboratories and museums lies a deeper story—one of power, consent, and ethics. A recent paper in the American Journal of Biological Anthropology urges anthropologists and anatomists to confront the legacy of human skeletal collections and calls for a new ethical framework that prioritizes transparency, community collaboration, and respect for the deceased.

Credit: Boris Hamer from Pexels

A Legacy of Exploitation

For centuries, human remains have been collected, often without consent, to serve scientific and medical purposes. During the 19th and 20th centuries, anthropologists, medical schools, and museums amassed vast collections of human bones, frequently taken from marginalized communities—Indigenous groups, enslaved individuals, and the poor. Many of these remains were acquired through colonial grave robbing, unethical medical research, or outright theft.

As biological anthropologist Gwen Robbins Schug and her colleagues point out in their new study, skeletal remains were often treated as mere specimens, detached from their human identities. "These were once living people, with families and histories. The ways in which their remains have been used, stored, and studied have not always honored that fact," the authors write.

The Call for Ethical Reform

The study is part of a broader movement within anthropology to rethink the way human remains are curated, studied, and displayed. The authors argue for several key reforms:

Transparency in Collection Histories: Museums and universities should fully document and disclose the origins of their skeletal collections.

Collaboration with Descendant Communities: Whenever possible, institutions should work with communities whose ancestors' remains are housed in collections, allowing them a say in how those remains are treated.

Prioritization of Repatriation and Reburial: When remains were taken without consent, institutions should make efforts to return them to their communities or facilitate respectful reburial.

Non-Destructive Research Methods: Advances in technology allow researchers to gather data from skeletal remains without permanently altering them. These methods should be prioritized to minimize harm.

"Ethics needs to be at the forefront of all the work that we do," Schug and her colleagues argue, emphasizing that the field must reckon with its colonial and racist foundations.

Recent Ethical Controversies

The paper comes at a time of increasing scrutiny of how human remains are handled in scientific and academic settings. Several high-profile cases have highlighted the issue. In 2021, it was revealed that Princeton University had been storing and using the remains of two Black children who were killed in the 1985 Philadelphia MOVE bombing for forensic anthropology classes—without the knowledge or consent of their families. The revelation sparked widespread outrage and renewed calls for accountability in anthropology and forensic science.

Similarly, many Indigenous communities have long fought for the return of their ancestors' remains. The Native American Graves Protection and Repatriation Act (NAGPRA), passed in 1990, was intended to facilitate the return of remains to tribal communities, but many institutions have been slow to comply.

The Future of Ethical Research

The study's authors stress that this is a turning point for the field.
Ethical research is not about halting scientific inquiry but about conducting it responsibly. As Siân Halcrow, one of the co-authors, explains, "We have the tools and the knowledge to study the past without repeating the mistakes of the past. Now, we need the will to do so."

Anthropologists and anatomists must shift from a mindset of ownership to one of stewardship—recognizing that human remains are not mere objects of study but the physical legacy of real people. By embracing ethical research practices, the field can move forward in a way that respects both the dead and the living.

Additional Related Research:

Blakey, M. L. (2022). "The Skeletal Biographies of Enslaved Africans in the Americas: Ethics and Research." Annual Review of Anthropology, 51, 125–142. https://doi.org/10.1146/annurev-anthro-092421-061013

Colwell, C. (2017). Plundered Skulls and Stolen Spirits: Inside the Fight to Reclaim Native America’s Culture. University of Chicago Press.

de la Cova, C. (2022). "The Ethics of Teaching with Human Remains: A Bioarchaeological Perspective." American Anthropologist, 124(4), 765–779. https://doi.org/10.1111/aman.13648

Fine-Dare, K. S. (2002). Grave Injustice: The American Indian Repatriation Movement and NAGPRA. University of Nebraska Press.
Few traits define humanity as clearly as language. Yet, despite its central role in human evolution, determining when and how language first emerged remains a challenge. Fossils do not speak, and ancient DNA does not carry recordings of conversations. Traditionally, scholars have debated linguistic origins based on indirect clues—symbolic artifacts, brain size, or the complexity of tool-making.

Credit: CC0 Public Domain

A new genomic study, published in Frontiers in Psychology 1, approaches the problem differently. By analyzing genetic divergences in early Homo sapiens populations, researchers argue that the biological capacity for language must have been present at least 135,000 years ago. If all modern human populations possess language and descend from a common ancestor, then linguistic ability must have existed before the first major human population split.

"Every human society on Earth has language, and all human languages share core structural features. The question is not whether early Homo sapiens had language, but when," explains Shigeru Miyagawa, a linguist at MIT and co-author of the study.

Genomic Clues: Tracing Language Through Population Splits

Unlike previous studies that relied on archaeology or comparative anatomy, this research examines how human populations began to branch off from one another. Using genomic data from 15 different studies—including analyses of Y-chromosomes, mitochondrial DNA, and whole-genome comparisons—the team identified the earliest detectable split in Homo sapiens populations. This division, which eventually led to the Khoisan-speaking groups of southern Africa, occurred around 135,000 years ago.

"If language had emerged after this split, we would expect to find some modern human populations without language or with a radically different form of communication," says Miyagawa. "But we don’t. Every population has fully developed linguistic abilities, suggesting that language was already in place before this first major division."
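The underlying inference is a simple bounding argument: because every descendant population of that first split has language, the capacity must be at least as old as the split itself. A minimal sketch of that logic in Python, using hypothetical split-time values standing in for the estimates the authors aggregated across the 15 genomic studies (the real figures are in the paper), might look like this:

```python
# Hedged sketch of the paper's bounding logic, not the authors' actual pipeline.
# Each value is a hypothetical split-time estimate in thousands of years (ka),
# standing in for one of the 15 genomic datasets the team surveyed.
split_estimates_ka = [
    110,  # hypothetical Y-chromosome-based estimate
    135,  # hypothetical whole-genome-based estimate
    120,  # hypothetical mitochondrial-DNA-based estimate
]

# Language is universal among the descendants of the earliest split, so
# linguistic capacity must predate the oldest detectable divergence.
lower_bound_ka = max(split_estimates_ka)
print(f"Linguistic capacity must be at least ~{lower_bound_ka},000 years old")
```

Swapping in the paper's actual estimates would reproduce the 135,000-year lower bound the authors report; the figure is a floor, not an origin date.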
What Came First: Language or Symbolic Thought?

The genetic evidence suggests that Homo sapiens had the capacity for language long before the first clear signs of symbolic behavior appear in the archaeological record. For years, many researchers tied language origins to the emergence of art, ritual burials, and complex social behaviors, which became widespread around 100,000 years ago.

"We see a lag between when the genetic evidence tells us language capacity was present and when symbolic artifacts appear in the record," notes Ian Tattersall, a paleoanthropologist at the American Museum of Natural History and co-author of the study. "This suggests that early Homo sapiens could speak long before they regularly expressed symbolic ideas in durable materials."

This challenges the long-held view that language and symbolism arose in tandem. Instead, it suggests that the brain's ability to process language may have developed first as an internal cognitive tool, later spilling into outward communication and cultural expression.

The Role of Language in Human Expansion

One of the biggest questions in human evolution is how Homo sapiens successfully spread across the globe while other hominin species, like the Neanderthals and Denisovans, eventually disappeared. The researchers propose that language might have played a key role.

"The ability to use complex language would have given early Homo sapiens a crucial advantage, enabling cooperation, teaching, and planning in ways that no other species could match," Miyagawa explains.

This advantage may have allowed small groups of Homo sapiens to outcompete other hominins despite being similar in physical abilities. Language could have been the tool that enabled larger social networks, more sophisticated hunting strategies, and better survival in harsh environments.

How This Fits with Other Evidence

This genomic analysis aligns with findings from other fields:

Fossil Evidence: The brain size and anatomy of early Homo sapiens suggest they had the neural capacity for language. While Neanderthals may have had some linguistic ability, Homo sapiens likely possessed more advanced syntax and vocal control.

Archaeological Evidence: The earliest symbolic artifacts—engraved ochre, beads, and ritual burials—appear around 100,000 years ago, suggesting that spoken language preceded visible cultural expressions.

Comparative Linguistics: Despite their diversity, all human languages share deep structural similarities, hinting at a common origin.

Language as a Defining Human Trait

This study does not pinpoint the exact moment when words were first spoken, but it sets a lower boundary for when language must have been present. The ability to use language—arguably the most powerful cognitive tool ever evolved—likely shaped every aspect of early human life, from cooperation to migration.

"Language is not just a communication system; it is the foundation of human thought, culture, and innovation," Tattersall emphasizes. "Understanding when it emerged is key to understanding what makes us human."

As new genetic and archaeological discoveries continue to refine the story, one thing remains clear: the origins of language are deeply entwined with the origins of Homo sapiens itself.

Further Reading & Related Research

Tattersall, I. (2017). Masters of the Planet: The Search for Our Human Origins. Palgrave Macmillan. Explores the cognitive evolution of early Homo sapiens and the role of language.

Berwick, R. C., & Chomsky, N. (2016). Why Only Us: Language and Evolution. MIT Press. Argues that language is a unique cognitive leap in human evolution.

Dediu, D., & Levinson, S. C. (2018). "Neanderthal language revisited: not only us." Current Opinion in Behavioral Sciences, 21, 49–55. https://doi.org/10.1016/j.cobeha.2018.01.001. Examines whether Neanderthals had some form of language.

This research raises new questions about how language evolved, how it shaped human societies, and why Homo sapiens emerged as the dominant species. The story of human language is still being written, but its origins stretch far deeper into the past than previously thought.

1 Miyagawa, S., DeSalle, R., Nóbrega, V. A., Nitschke, R., Okumura, M., & Tattersall, I. (2025). "Linguistic capacity was present in the Homo sapiens population 135 thousand years ago." Frontiers in Psychology, 16. https://doi.org/10.3389/fpsyg.2025.1503900
Three million years ago on the East African plains, a tense scene might have played out. A group of Australopithecus afarensis—small, upright-walking hominins—gathers around a carcass, quickly slicing off scraps of meat with sharpened stones. Their actions do not go unnoticed. From the tall grass, a Homotherium—a scimitar-toothed cat—lurks, poised to charge. The hominins scatter. But could they actually outrun the predator?

Reconstruction of the fossil skeleton of Lucy the Australopithecus afarensis. Credit: Wikimedia/Author 120, CC BY-SA

A new study, published in Current Biology 1, has modeled how Australopithecus afarensis ran, offering the most detailed analysis yet of their locomotor abilities. The findings suggest that, while they were competent bipeds, their running capabilities lagged far behind those of later human ancestors.

"Our simulations revealed that Lucy wasn’t as efficient or as fast at running as modern humans," the researchers explain. While humans today can sprint at speeds exceeding 20 mph, the best A. afarensis could likely manage was around 11 mph—barely enough to escape a determined predator.

The Running Experiment: Reconstructing Lucy in 3D

To test Australopithecus afarensis’ running ability, researchers digitally reconstructed the skeleton of Lucy, the most famous representative of the species. Where bones were missing, they used scaled versions of other Australopithecus fossils. They also incorporated comparative data from chimpanzees and modern humans to estimate muscle placement and function.

Using 3D modeling software, they then "fleshed out" the skeleton with muscles, simulating different configurations—from a more human-like anatomy to a more ape-like one. Finally, they tested multiple versions of A. afarensis with and without an Achilles tendon, a key structure that aids running efficiency in modern humans but is absent in chimpanzees. The digital models were then placed into a physics-based simulation program called GaitSym, which allowed researchers to simulate running motion, measure energy efficiency, and determine top running speeds.

Slow and Inefficient: The Limits of Early Hominin Locomotion

The results painted a clear picture: while A. afarensis was capable of running, it was neither fast nor efficient. The fastest simulated hominin reached just 11 mph—comparable to a slow jog in modern humans. Even untrained human sprinters can hit speeds of around 17.6 mph.

"The metabolic cost of running in Australopithecus afarensis was up to three times higher than in modern humans," the researchers report. This means that every step took considerably more energy, making sustained running over long distances impractical. The presence or absence of an Achilles tendon also made a significant difference: when removed from the simulation, running efficiency dropped dramatically, reinforcing its importance in later hominin evolution.
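To put those figures side by side, here is a back-of-envelope comparison in Python using only the speeds quoted above; it is illustrative arithmetic, not output from the study's GaitSym simulations:

```python
# Illustrative arithmetic based on the speeds quoted in the article,
# not results from the study's musculoskeletal simulations.
MPH_TO_MS = 0.44704  # metres per second per mile per hour

speeds_mph = {
    "A. afarensis, fastest simulated model": 11.0,
    "Untrained modern human sprinter": 17.6,
    "Modern human sprinting, exceeding": 20.0,
}

for runner, mph in speeds_mph.items():
    print(f"{runner}: {mph:4.1f} mph ({mph * MPH_TO_MS:.1f} m/s)")

# The simulations also put the cost of running at up to ~3x the modern human
# value, so each stride was far more expensive in energetic terms.
print("Relative metabolic cost of running: up to ~3x that of modern humans")
```

Even this crude comparison makes the gap plain: the fastest simulated A. afarensis falls well short of an untrained modern sprinter, before energy costs are even considered.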
Could Australopithecus afarensis Hunt or Escape?

These findings challenge the idea that A. afarensis could have engaged in endurance hunting, a strategy where early humans chased prey until it collapsed from exhaustion. This technique, seen in some modern hunter-gatherer groups, relies on a combination of efficient cooling (via sweating), a long Achilles tendon, and an energy-efficient gait—traits that appear to have developed later in hominin evolution. If A. afarensis wasn’t built for endurance running, how did they obtain meat?

Cut-marked bones dating to 3.4 million years ago suggest they had access to animal flesh, but whether they hunted or scavenged remains unclear. Given their limited running ability, scavenging—perhaps snatching scraps from a predator’s kill—seems the more likely scenario.

As for escaping predators, the data suggests they wouldn’t have been able to outrun a fast-moving threat. Instead, they may have relied on group coordination, vigilance, and possibly taking refuge in trees. "If faced with a predator like a Homotherium, running away probably wasn’t an option," the researchers note.

A Step Toward Human Endurance

The study highlights a crucial transition in human evolution. While Australopithecus afarensis had already shifted to full-time bipedalism, they had not yet acquired the long-distance running abilities that characterize later members of the genus Homo. This shift likely occurred around 2 million years ago with Homo erectus, whose longer legs, shorter arms, and well-developed Achilles tendon suggest a body built for efficient locomotion. The ability to run farther and faster would have given Homo erectus a major advantage in acquiring food and avoiding predators.

Rewriting the Story of Early Human Locomotion

For decades, Lucy has been a symbol of the evolutionary leap toward bipedalism. But this new research reminds us that walking on two legs didn’t immediately translate to efficient running. Instead, human endurance running was likely a gradual adaptation, appearing only with later hominins.

So, if an Australopithecus afarensis had been caught in the middle of an ancient savanna chase, the odds were not in their favor. Their best bet? Spot danger early, and climb fast.

Related Research

For those interested in further reading, here are additional studies on early human locomotion:

Bramble, D. M., & Lieberman, D. E. (2004). "Endurance running and the evolution of Homo." Nature, 432(7015), 345–352. https://doi.org/10.1038/nature03052

Pontzer, H., Raichlen, D. A., & Wood, B. M. (2014). "Hunter-gatherers as models in public health." Obesity Reviews, 15(Supplement 1), 12–23. https://doi.org/10.1111/obr.12174

Sockol, M. D., Raichlen, D. A., & Pontzer, H. (2007). "Chimpanzee locomotor energetics and the origin of human bipedalism." Proceedings of the National Academy of Sciences, 104(30), 12265–12269. https://doi.org/10.1073/pnas.0703267104

These studies, along with the latest findings on Australopithecus afarensis, continue to refine our understanding of how and when humans became the long-distance runners they are today.

1 Bates, K. T., McCormack, S., Donald, E., Coatham, S., Brassey, C. A., Charles, J., O’Mahoney, T., van Bijlert, P. A., & Sellers, W. I. (2025). "Running performance in Australopithecus afarensis." Current Biology, 35(1), 224–230.e4. https://doi.org/10.1016/j.cub.2024.11.025
Before the soft-footed, domesticated Felis catus found its way into Chinese homes, another feline species occupied human settlements for thousands of years. A new genetic and archaeological study 1 has revealed that leopard cats (Prionailurus bengalensis), small wild felines native to East Asia, lived alongside people in China’s early agrarian societies for at least 3,500 years—only to disappear from human settlements centuries before the arrival of domestic cats via the Silk Road.

A Tang Dynasty mural from A.D. 829 is one of the earliest depictions of domestic cats in China. Two black-and-white cats are visible at the center. (Image credit: Zheng H, Liu Y, Chi M. (2013). Chinese Archaeology.)

Researchers analyzing 22 ancient feline remains from 14 sites across China have reshaped the timeline of cat domestication in the region. Their findings suggest that leopard cats filled the niche of rodent control in human settlements long before domesticated cats arrived. But unlike societies in the Middle East and Europe, early Chinese societies never domesticated these wild felines. Instead, leopard cats vanished from human settlements roughly 1,800 years ago, leaving a curious gap before the first domesticated cats appeared around 1,400 years ago during the Tang Dynasty.

"Leopard cats were part of human settlements for thousands of years, but their relationship with people was very different from what we see with domestic cats today," said Shu-Jin Luo, a geneticist at Peking University and co-author of the study.

A Feline Mystery: The Gap Between Wild and Domestic Cats

The prevailing theory once suggested that domesticated cats were present in China as early as 5,400 years ago, based on felid remains found at the Neolithic site of Quanhucun. However, subsequent DNA analysis overturned this assumption, revealing that the remains actually belonged to leopard cats, not domesticated felines.

Leopard cats, roughly the size of modern house cats, are adaptable predators with a long history of interaction with human societies. They likely entered settlements in search of prey, much like how wild cats in the Near East started their commensal relationship with early farmers. However, despite their presence in ancient Chinese communities, leopard cats never made the leap to full domestication.

This painting from the bottom of a bowl is one of the earliest depictions of a cat from China, dating to 168 B.C. Markings on the cat's fur suggest it's a leopard cat, not a domestic cat. (Image credit: Hunan Museum Collection Database.)

By around 150 CE, leopard cats seemingly vanished from archaeological records. The reasons remain unclear, but researchers speculate that a combination of climate shifts, societal upheaval following the fall of the Han Dynasty, and changes in agricultural practices may have led to their decline.

"Unlike in the Near East, where wildcats transitioned into domesticated companions, leopard cats in China never developed that close bond with humans," said co-author Joel Alves, a bioarchaeologist at the University of Oxford.

The Arrival of Domestic Cats via the Silk Road

For centuries after leopard cats disappeared from human settlements, no feline remains appeared in Chinese archaeological sites. Then, in the Tang Dynasty (618–907 CE), domesticated cats—descendants of Felis lybica, the African wildcat—arrived in China. DNA evidence suggests these cats were brought from Central Asia, likely via traders and diplomats along the Silk Road.
The earliest known domestic cat in China, excavated from Tongwan City in Shaanxi, dates to between 706 and 883 CE. Genetic analysis shows that this cat was closely related to domestic cats found in medieval Kazakhstan, reinforcing the idea that these animals were transported across Eurasian trade routes.

Interestingly, early Chinese domestic cats likely looked different from their Middle Eastern and European relatives. Genetic markers indicate they had short hair, long tails, and white or partially white coats—traits that remain more common in East Asian domestic cats today.

The Cultural Shift: How China Embraced the Domestic Cat

The arrival of Felis catus was not merely a biological event; it marked a cultural shift in how Chinese society interacted with felines. In contrast to the leopard cats that coexisted in settlements but remained wild, domestic cats quickly became valued members of elite households. Historical records from the Tang Dynasty depict these cats as treasured pets, sometimes associated with religious rituals. By the Song Dynasty (960–1279 CE), they were widely accepted as companions and protectors of food stores, much like their European counterparts.

"Cats were introduced as exotic pets, but they eventually became widespread, replacing leopard cats as the primary felines in human settlements," said Luo.

Why Didn’t Leopard Cats Become Domesticated?

The question remains: why did leopard cats never make the transition to domesticated status? One possibility is their temperament. Unlike Felis lybica, which has a relatively social nature and was able to adapt to human environments, leopard cats are solitary, elusive, and notoriously difficult to tame. Even today, attempts to domesticate them have been largely unsuccessful.

Another factor may have been competition. When domestic cats arrived, they occupied the same ecological niche that leopard cats once filled—only with a temperament better suited for cohabiting with humans. This may have prevented leopard cats from re-establishing themselves in settlements.

The Legacy of Ancient China’s Felines

The study of ancient feline remains in China adds an important layer to the story of cat domestication. While the domestication of Felis catus is a well-documented phenomenon in the Near East and Europe, China’s feline history followed a different trajectory—one where a native wild cat briefly occupied human spaces before being replaced by an imported species.

Today, leopard cats remain widespread across Asia, though they are rarely found in close association with humans. The story of their early presence in Chinese settlements serves as a reminder that the path to domestication is not always straightforward.

"Leopard cats and domestic cats tell two very different stories of human-animal interaction in China," said Alves. "One coexisted but remained wild, while the other became a companion. The transition was anything but inevitable."

Related Research

For those interested in further reading on the topic, here are some additional studies exploring cat domestication and human-feline interactions:

Ottoni, C., et al. (2017). "The palaeogenetics of cat dispersal in the ancient world." Nature Ecology & Evolution, 1, 0139. https://doi.org/10.1038/s41559-017-0139

Haruda, A. F., et al. (2020). "The earliest domestic cat on the Silk Road." Scientific Reports, 10, 67798. https://doi.org/10.1038/s41598-020-67798-6

Yu, H., et al. (2021). "Genomic evidence for the Chinese mountain cat as a wildcat conspecific (Felis silvestris bieti) and its introgression to domestic cats." Science Advances, 7, eabg0221. https://doi.org/10.1126/sciadv.abg0221

These studies, along with Han et al. (2025), continue to refine our understanding of how humans and felines have interacted across time and geography.

1 Han, Y., Hu, S., Liu, K., Xu, X., Doherty, S., Jamieson, A. E., Manin, A., Martins, S. G., Yang, M., Yu, C., Wang, J., Wu, Z., Chen, C., Han, S., Lu, D., Peng, L., Wu, X., Li, Z., Fan, W., … Luo, S.-J. (2025). "Leopard cats occupied human settlements in China for 3,500 years before the arrival of domestic cats in 600–900 CE around the Tang Dynasty." bioRxiv. https://doi.org/10.1101/2025.01.31.635809
For decades, the story of agriculture in the Mediterranean has been told as a wave of migration—Neolithic farmers from the Near East expanding across Europe, replacing or mixing with hunter-gatherer populations along the way. North Africa, however, has always been an outlier in this narrative. Some scholars have suggested that local foraging groups resisted farming altogether, surviving on land snails, wild plants, and game long after their European and Levantine neighbors had adopted agricultural practices. Others proposed that migration from Europe brought domesticated animals and crops but left little genetic trace.

An archaeological dig site at Doukanet el Khoutifa, Tunisia, in the eastern Maghreb region. Credit: Giulio Lucarini

Now, a study published in Nature 1 by a team of researchers led by geneticist David Reich at Harvard Medical School provides the first genetic evidence to clarify what actually happened in the eastern Maghreb (modern Tunisia and northeastern Algeria). By sequencing the genomes of individuals who lived in the region between 15,000 and 6,000 years ago, the researchers discovered a story of continuity rather than replacement.

“There’s not been much of a North African story,” Reich explains. “It was a huge hole.”

Rather than a sweeping demographic shift, the DNA suggests that local foragers maintained a strong genetic presence even as farming slowly made its way into the region. The results challenge long-held assumptions about how agriculture spread, showing that North African societies adapted new food-producing strategies while preserving their ancestral genetic identity.