





I haven't found any thread mentioning these studies.
Abstract
This study examines the temporal and geographical evolution of polygenic scores (PGSs) across cognitive measures (Educational Attainment [EA], Intelligence Quotient [IQ]), Socioeconomic Status (SES), and psychiatric conditions (Autism Spectrum Disorder [ASD], schizophrenia [SCZ]) in various populations. Our findings indicate positive directional selection for EA, IQ, and SES traits over the past 12,000 years. Schizophrenia and autism, while similar, showed different temporal patterns, aligning with theories suggesting they are psychological opposites. We observed a decline in PGS for neuroticism and depression, likely due to their genetic correlations and pleiotropic effects on intelligence. Significant PGS shifts from the Upper Paleolithic to the Neolithic periods suggest lifestyle and cognitive demand changes, particularly during the Neolithic Revolution. The study supports a mild hypothesis of Gregory Clark’s model, showing a noticeable rise in genetic propensities for intelligence, academic achievement and professional status across Europe from the Middle Ages to the present. While latitude strongly influenced height, its impact on schizophrenia and autism was smaller and varied. Contrary to the cold winters theory, the study found no significant correlation between latitude and intelligence.
[...]
Discussion
This study’s examination of PGSs across cognitive, psychiatric and physical traits reveals distinct temporal and ancestral influences.
In cognitive variables, a robust negative correlation with age was noted for EA3, IQ and SES, suggesting an evolutionary trend in these traits (Figure 3). The effect size for EA3 and EA4 was reduced by approximately 50% after accounting for ancestry but continued to be significant (β = −0.127 and −0.141, p < .001). The impact of genome age on IQ and SES was slightly less (−0.111) compared to EA after ancestry control, yet remained noteworthy.
The increase in cognitive PGSs was about 0.12 SDs per 1000 years, implying that 10,000 years ago the PGSs for cognitive abilities were about 1.2 SDs lower than they are today.
The scatterplot shows that the increase in EA and IQ PGS started around 20 Kya, whereas between 20 and 33 Kya there was no clear trend (Figure 4). The period between 20 and 33 Kya witnessed a decrease in human population density in Europe due to the expansion of the ice sheets, which reached their maximum extent about 20 Kya during the Last Glacial Maximum (LGM; Tallavaara et al., 2015). Subsequently, population density started increasing, and this would have initiated a cycle of gene-culture coevolution due to increased intergroup conflict and social complexity. Indeed, intergroup conflict in Europe during the Holocene has been shown to be density dependent, and to be the main cause of population booms and busts (Kondor et al., 2023).
This is reflected by the sharp increase in EA3, EA4 and IQ PGSs between the Upper Paleolithic and the Neolithic (Supplementary Figures S1, S2, S3). Hunter-gatherer ancestry (particularly V2 or WHG) was negatively associated with EA3, EA4 and IQ in the regression models (Supplementary Tables S1−S3) even after accounting for Years BP (β = −0.314, −0.4, −0.249), suggesting that the increase in cognitive capacity was not solely driven by the Neolithic revolution but was partly mediated by admixture with the immigrants that accompanied it. Anatolian Neolithic farmers who intermixed with native HGs contributed between approximately 40% and 98% of Neolithic European ancestry (Chintalapati et al., 2022). Near-Eastern ancestry (V4) had a similar negative effect on the three phenotypes (−0.31, −0.88 and −0.313).
In the regression model (Supplementary Tables S1−S3), Anatolian/European farmer ancestry (V5) positively predicted EA3, EA4 and IQ PGS. As V5 was omitted from the regression to deal with multicollinearity, its effect was absorbed into the intercept, and all the other components had negative betas. The positive effect on EA and IQ was confirmed by re-running the regression analysis including V5 and omitting each of the other four components in turn.
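The omitted-category workaround described above (dropping one ancestry component so its effect moves into the intercept, with the remaining betas expressed relative to it) can be sketched with simulated data. The component labels, effect sizes and Dirichlet-generated ancestry proportions below are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated ancestry proportions for five components (V1..V5) that sum to 1,
# mimicking compositional predictors like those in the paper's regressions.
anc = rng.dirichlet(alpha=[2, 2, 2, 2, 2], size=n)

# Hypothetical "true" effects of each component on a polygenic score (V5 positive).
true_beta = np.array([-0.3, -0.4, -0.2, -0.3, 0.5])
pgs = anc @ true_beta + rng.normal(0, 0.1, size=n)

# Because the five columns sum to 1 they are perfectly collinear with the
# intercept, so one component (here V5) must be omitted from the design matrix.
X = np.column_stack([np.ones(n), anc[:, :4]])
coef, *_ = np.linalg.lstsq(X, pgs, rcond=None)

# The intercept absorbs V5's effect, and each remaining coefficient becomes
# (beta_i - beta_V5): all negative when V5 has the largest effect.
intercept, betas = coef[0], coef[1:]
print(intercept)  # ~0.5 (V5's effect)
print(betas)      # ~[-0.8, -0.9, -0.7, -0.8]
```

This illustrates why, after omitting V5, every other component shows a negative beta even though the components' absolute effects differ: the coefficients are contrasts against the reference category.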
Psychiatric phenotypes displayed complex relationships with time, heavily influenced by ancestry. For example, the temporal changes in ASD and SCZ were not solely attributable to time but were significantly mediated by ancestral components. This implies that the temporal alterations in these PGSs are significantly influenced by authentic ancestry transitions or by the uneven representation of ancestries and geographical diversity in our sample. For instance, there was a higher representation of Northern European samples during the Middle Ages and Southern European samples during the Neolithic period.
Shifts in ancestry can underlie authentic alterations in phenotype risk if the new genetic variants introduced into a population alter the inherent risk or manifestation of certain traits or conditions. For example, populations from regions with varying sunlight exposure (e.g., closer to the equator vs. farther away) have different skin pigmentation due to the differential selective pressures of UV light exposure. If these populations mix, the resulting generations might exhibit a wider range of skin pigmentation due to the blending of genes influencing this trait and a different average skin darkness.
Conversely, if the GWAS did not fully account for potential population stratification (especially that mediated by environmental factors), the observed associations between certain genetic markers and traits or disease might be misleading or confounded. This is because the genetic differences might be coincidentally associated with the trait or disease due to the unaccounted-for environmental factors rather than having a direct biological connection.
Theoretically, if a PGS for a trait increases over time, and the ancestry is related to the trait independently of environmental biases in the GWAS, this indicates a genuine change in genetic predisposition even if the association with time is entirely mediated by ancestry. However, selection still cannot be ruled out, because ancestry shifts might indicate that individuals belonging to a particular ancestry were favored in certain environments because of different genetic predisposition to some advantageous trait.
Middle Eastern ancestry (V4) positively predicted the schizophrenia and neuroticism PGS (Supplementary Tables S5 and S7) in the regression models but negatively predicted autism PGS (Supplementary Table S4).
Intriguingly, schizophrenia and autism demonstrated divergent temporal trends, aligning with the theory that these disorders embody opposite ends of the psychological spectrum (Crespi et al., 2010). Furthermore, their correlation with Years BP aligned with pleiotropic impacts on cognitive abilities, given that schizophrenia and autism present reciprocal genetic correlations to IQ.
Lastly, inclinations towards psychotic tendencies may have been advantageous in traditional societies where mystical beliefs and hallucinations were often considered normative, in contrast to contemporary societies that demand heightened concentration and specialization, thereby prioritizing autistic traits.
The presence of contrasting selection pressures on schizophrenia and autism is also hinted at by the inverse effects of latitude (β = −0.187 and 0.05 respectively). This suggests that a colder climate might have favored a higher genetic predisposition towards autism while simultaneously selecting against a risk for schizophrenia.
Both neuroticism and depression PGSs have demonstrated a decline over time, consistent with their substantial genetic and phenotypic correlation. The observed trend may also be attributed to pleiotropic effects on intelligence, which bears a weak, yet negative, relationship with both neuroticism and depression.
The correlations observed at individual (Supplementary Figure S1) and temporal levels aligned with expectations from existing literature. Group-level correlations for PGSs had a similar direction but were more pronounced (Figure 12). This perspective sheds light on the potential shared or contrasting selective pressures on various phenotypes.
Autism exhibited a significant positive correlation with EA3 and EA4 (.59 and .68 respectively), contrasting with a substantial negative correlation with schizophrenia (−.83). This supports the hypothesis that schizophrenia and autism are at different extremes of the psychological spectrum (Crespi & Badcock, 2008; Crespi et al., 2010), possibly representing divergent cognitive-behavioral adaptation strategies in socio-cultural environments. The positive link between autism and cognition corroborates previous findings of a genetic correlation at the individual level (Grove et al., 2019).
In contrast, depression was negatively correlated with EA3, EA4 and height, but showed a positive correlation with schizophrenia (r = .42). This is consistent with the commonly observed negative genetic and phenotypic correlation between depression and cognition (Xu et al., 2015). EA3 and EA4 not only had a strong correlation with each other (r = .86) but also with IQ (r ∼ .8) and SES (r ∼ .75), while showing negative correlations with neuroticism (r ∼ −.65) and schizophrenia (r = −.55 and −.77 respectively). These findings reflect the positive association within populations between SES and intelligence (Trzaskowski et al., 2014) but the negative genetic correlations of cognition with neuroticism (Anglim et al., 2022) and schizophrenia (Ohi et al., 2018). Interestingly, although EA shows a slight positive correlation with schizophrenia within populations, schizophrenia correlates negatively with intelligence. This discrepancy indicates that EA3 and EA4 are more indicative of selective pressures on intelligence than on noncognitive skills related to EA.
An unexpected finding was the strong positive correlation between height and intracranial volume (.7), yet no significant correlation with IQ. However, this correlation between height and intracranial volume was anticipated, considering the established positive relationship between brain and body size across and within species (Grabowski, 2016).
In general, we identified the most substantial discrepancies across phenotypes between the Upper Paleolithic and Neolithic periods, likely attributable to the profound cultural and lifestyle shifts occurring between these two epochs. According to the model outlined in the introduction, the Neolithic Revolution, marking the transition from hunter-gatherer ways of life to agriculture-centered settlements, initiated a surge in population density, escalated social complexity, and necessitated enhanced planning and organization. Consequently, cognition underwent selection to navigate the intricacies of labor division, specialization, and the hurdles of social competition and collaboration. A significant ‘leap’ in the PGSs for EA and IQ was also observed between the Bronze Age and the Iron Age, hinting at a period of selection favoring sophisticated cognitive abilities.
The research findings support the weak hypothesis of Clark’s model, highlighting a distinct rise in the genetic predispositions for traits associated with intelligence and EA across Europe after the Middle Ages. When PGSs for cognitive phenotypes were analyzed within the context of Great Britain — by contrasting English medieval genomes with their modern British counterparts — the effect sizes (denoted through Cohen’s d) were either comparable or slightly reduced relative to the overall sample, ranging from d = 0.15 to 0.5 (Figure 11). Hence, Clark’s strong hypothesis — that the increase was more pronounced in Britain than elsewhere — is not supported by the analyzed dataset.
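Cohen's d, the effect-size measure used in the medieval-versus-modern comparison above, is simply the standardized mean difference between two groups. A minimal sketch with simulated scores; the 0.3 SD shift and group sizes are invented for illustration, not taken from the paper:

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference between two samples, using the pooled SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
medieval = rng.normal(0.0, 1.0, size=300)  # simulated medieval PGS, in SD units
modern = rng.normal(0.3, 1.0, size=300)    # simulated modern PGS, shifted by 0.3 SD

d = cohens_d(modern, medieval)
print(round(d, 2))  # close to the simulated 0.3 SD shift
```

Because d is expressed in pooled standard-deviation units, values between groups of genomes can be compared directly across traits and time periods regardless of each PGS's raw scale.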
Our study included an analysis of a subsample consisting of 767 individuals from Southeastern Europe and Anatolia, specifically concentrating on two primary cognitive PGSs: EA3 and EA4. We observed a significant negative correlation between EA3 and Years BP, with a correlation coefficient of −.185 (p = 2.644e-07), as illustrated in Supplementary Figure S11. This suggests an increase in EA3 over time. Regression analysis, considering the first 10 PCs and sequence coverage, revealed a consistent effect of Years BP (β = −0.11), although the impact of sequence coverage alone was not significant (β = −0.04, p = .41).
Similarly, for EA4, a negative correlation with Years BP was noted (r = −.22), as depicted in Supplementary Figure S12. The regression analysis, adjusted for the first 10 PCs and sequence coverage, demonstrated a diminished yet significant effect (β = −0.11). This finding adds further support to the notion that polygenic selection influencing cognitive abilities extended beyond Northwestern Europe, encompassing regions of Western Asia and Southeastern Europe, thereby challenging the more stringent interpretation of Clark’s theory. While the impact of Years BP was slightly diminished in comparison to the total sample, a more robust inference about the differential in selection strength between these two regions would necessitate a larger sample size.
Our examination of key cognitive phenotypes (EA3, EA4, IQ) across historical cultures aligns with the patterns observed in ancestry and time-based analyses. The earliest samples, consisting of hunter-gatherers and those from the Middle East, had the lowest scores (Supplementary Figures S14, S15, S16). In contrast, samples from Iron Age and Medieval Italy showed the highest scores. This reinforces the conclusions of our recent study (Piffer et al., 2023), which indicated that Europe’s PGSs reached their zenith in central Italy during the Republican era. Additionally, incorporating a new Etruscan sample (N = 48), genetically akin to the Republican Romans as per Posth et al. (2021), into our Iron Age Italian dataset further corroborates these results.
A significant divergence was noted in the sample of ancient Greeks from the Bronze Age in the comparison between IQ and EA. The Bronze Age Greeks, while displaying average scores on EA, manifested the highest scores on the IQ PGS. This result was predicted based on their renowned cultural accomplishments and is in agreement with the historical estimates by Galton (1869). The marked disparity between the EA and IQ PGS among ancient Greeks merits additional research for a deeper understanding.
The analysis of physical traits, such as height, highlighted the influence of environmental factors, as shown by the positive effect of latitude on the height PGS (β = 0.5) in the regression model (Supplementary Table S9). This aligns with Bergmann’s rule, suggesting adaptations to varying climatic conditions. However, as shown in Supplementary Table S9, the impact of Years BP was reduced when controlling for ancestry (from β = −0.306 to −0.095).
Notably, a decline in height PGS was observed from the Paleolithic to the Neolithic periods, implying that the post-agricultural revolution reduction in body size could be attributed not only to dietary limitations and the heightened prevalence of diseases (Mummert et al., 2011), but also to genetic factors. We found a positive impact of EHG ancestry on the height PGS (β = 0.93). This result replicates the finding that differences in height between contemporary northern and southern Europeans are due to different amounts of Steppe ancestry (which is used as a proxy for EHG) rather than selection (Irving-Pease et al., 2024).
Indeed, polygenic height scores achieved their highest levels in medieval Dutch, English and Viking populations (Supplementary Figure S13). In contrast, contemporary British samples exhibited lower polygenic height scores compared to their medieval English counterparts. This observation raises two plausible interpretations: either there has been negative selection against height since the Middle Ages, or the ancestral composition of the medieval sample leaned more towards continental ancestry, resembling the Dutch and Scandinavian populations.
The latter scenario gains further support from the context in which the samples were collected — a research initiative primarily centered around Anglo-Saxon cemeteries aimed at investigating the extent of continental migration to England (Gretzinger et al., 2022). Their findings indicate that continental (‘Anglo-Saxon’) ancestry started spreading during the early Middle Ages but later blended with the pre-existing ‘Celtic’ or native genetic pool, resulting in a dilution of its genetic influence (Gretzinger et al., 2022). Hence, the decrease in continental ancestry inferred by the study is likely due to a mix of genuine change in ancestry within England and the oversampling of cemeteries with continental ancestry. In turn, the decrease in continental ancestry is likely the cause of the observed decline in the height PGS for England.
Notably, in contemporary times, the Dutch population is the tallest globally, or at the very least within Europe. This observation lends credence to the notion that their genetic inclination towards greater height was already pronounced during medieval periods. As an intriguing historical anecdote, Dante, in the Divina Commedia (canto XXXI), poetically illustrates this height differential by noting that even three Friesians (‘frisoni’, from a region in the northern Netherlands) stacked on top of one another could not reach the towering stature of the giant Nembrot.
Consequently, the significant increase (around 20 cm) in height among the Dutch over the last 200 years is likely attributable to environmental factors rather than natural selection (Stulp et al., 2023).
Intriguingly, we observed a reduction in the intracranial volume PGS from the Middle Ages to the contemporary period.
There was also a decrease (Cohen’s d = −0.42) from the Upper Paleolithic to the Neolithic period (Supplementary Figure S10). This fits the archaeological record of a reduction in cranial capacity during the transition from the Pleistocene to the Holocene (Stibel, 2021). This effect is likely driven by the decrease in body size during the Holocene transition, as Height and ICV were strongly correlated at the group level (Figure 12).
A significant decrease in ICV PGS was observed between the medieval and the contemporary sample (Cohen’s d = −0.17). This is in line with the brain size reduction over the last 3000 years reconstructed from fossil data (Stibel, 2021). Importantly, these findings show that the reduction in cranial capacity was not due to a reduction in genotypic intelligence, as the IQ and EA PGS had the largest increases at the Pleistocene-Holocene and Medieval-Contemporary transitions. Regression models showed that the trajectory of ICV was largely determined by pleiotropic effects, primarily from EA4 and Height (see Supplementary information).
Consequently, these findings underscore the limitations of using cranial size in archaeological studies as a sole measure for tracking changes in complex cognition throughout recent evolutionary history.
Indeed, the temporal changes observed in several traits appear to be influenced more by pleiotropic effects resulting from selection on other traits than by direct selection (Supplementary information). Specifically, changes in ASD over time were shaped by common selective pressures on EA4 and SCZ. Evolutionary changes in Neuroticism are also best explained by pleiotropic effects from multiple traits, such as Depression, EA4 and SCZ. The changes in SCZ over time are attributed to pleiotropic effects from traits like ASD, Depression, EA3, EA4, Height and Neuroticism. Conversely, traits like EA3, EA4 and Height showed evidence of direct selection over time.
Admittedly, the persistence of an effect after accounting for the other traits depends on the amount of signal of each PGS. In fact, EA3, EA4 and Height were based on the largest GWAS and likely contain more signal than the other traits.
Collectively, these findings emphasize the intricate nature of evolutionary patterns, suggesting that trait changes over time can be shaped not just by direct selection effects but also by pleiotropic effects.
However, the data do not support the cold winters theory of intelligence (Lynn, 1991, 2006), as the effect of latitude on EA3 and IQ in the regression models (Supplementary Tables S1 and S3) tended to be slightly negative (β = −0.055 and −0.097, p = .022 and p < .001 respectively), contrary to the theory’s prediction. This implies that, at least within the temperate zone from which our samples were derived (30.64 to 69.65°), and within the last 12,000 years, climate did not exert a discernible selective pressure. Indeed, Eastern Hunter-Gatherer ancestry, which derived up to 70% of its ancestry from Ancient North Eurasians living in the extremely cold climates of Siberia (Posth et al., 2023), was a negative predictor of EA and IQ in the regressions. It is possible that any selective pressure on intelligence from challenging natural environments was offset by the difficulties in forming complex, highly populated societies in ancient times, thereby disabling the gene-culture coevolutionary mechanism that appears to have accompanied humans over the last 12,000 years.
Our results underscore the complex interplay between genetic predispositions and environmental factors. For instance, the shift in height PGS from the Paleolithic to the Neolithic period suggests genetic adaptations alongside dietary and lifestyle changes.
The observed shifts in PGSs during key historical periods, such as the Neolithic Revolution and before the Industrial Revolution, align with major cultural and societal transformations. These findings provide empirical support for theories of gene-culture coevolution, particularly in cognitive traits.
While our study provides significant insights, it has limitations, such as potential biases in sample representation. Future research could focus on more diverse populations to further elucidate the genetic underpinnings of these traits.
Ascertainment bias, especially regarding traits tied to social status or intelligence, stems from the nonrandom selection of samples available for analysis. This bias can significantly distort our understanding of ancient populations and the prevalence or distribution of certain genetic traits, such as height, EA and IQ.
Burial practices often varied significantly across different cultures and historical periods, with certain individuals more likely to be preserved and discovered based on their social status, wealth or other societal factors. High-status individuals might have been buried in ways that better preserve their remains (e.g., in tombs or with goods that deter grave robbers), making their genomes more accessible to researchers. This can lead to an overrepresentation of the genetics of the elite in ancient DNA studies.
There is also a tendency for research to focus on sites and samples that are more accessible or better preserved, which might not be representative of the broader ancient population. This can lead to a form of ascertainment bias where the conclusions drawn from such studies disproportionately reflect the genetics of specific subgroups.
Another potential source of significant biases is the practice of cremation. In cultures where cremation was preferred, the high temperatures effectively destroy organic material, including DNA, making it impossible to recover genetic information from cremated remains. This practice contrasts with burial, where DNA can be preserved for thousands of years under the right conditions.
This variation can lead to a form of bias in ancient DNA studies, as the genetic information accessible to researchers may disproportionately represent cultures or subpopulations that favored burial. Moreover, if cremation practices varied by social status or religion within each society, this would represent another source of bias.
Among ancient Greeks, lavish cremations and ceremonies were common, with famous examples like Achilles and Hector from the Iliad illustrating the elaborate nature of these rites. Following the Greeks, the Romans adopted cremation, especially for their military heroes. Cremation became a status symbol in Rome, where urns containing ashes were placed in columbaria (Borbonus, 2019; Hope, 2007; Pearce et al., 2000).
However, around 100 CE, cremation practices declined, partly due to the spread of Christianity. Cremation could also have biased the representativeness of our large Viking sample, as the practice was particularly common during the early Viking Age.
Furthermore, although our total sample size was relatively large (N = 2625), the sample size for some historical periods (e.g., Paleolithic, Imperial) lacked the power to detect significant effects.
Additionally, our analysis did not incorporate gene-environment interactions. Notably, allele effects on traits may vary temporally, such that certain alleles contributing to a trait’s risk in ancestral environments may not have the same impact in contemporary contexts. While this phenomenon is well documented in Mendelian diseases (e.g., the interaction of PKU with diet, the MAOA gene with childhood maltreatment, and sickle cell anemia with malaria exposure), detecting similar effects in polygenic traits is more challenging due to the involvement of numerous genetic loci, each exerting minor influences. A reviewer hypothesized that such gene-environment interactions might falsely appear as an increase in the PGS for EA over time, resulting from the loss of alleles that were advantageous in historical environments but are no longer beneficial. It is also important to consider that alleles associated with enhanced cognitive abilities in ancient times may still be advantageous in modern environments. This is because the underlying structure of general intelligence has been shown to predict success in a wide range of activities, from physical fitness to academic achievement (Gil-Espinosa et al., 2020; Plomin & Deary, 2015).
Furthermore, extending this hypothesis to other traits we analyzed, one might expect a uniform rise in PGS across all traits. However, our results indicated a decrease in PGS for traits such as neuroticism, depression and schizophrenia, pointing to a more complex evolutionary narrative than the one suggested. For instance, assuming the GWAS identified schizophrenia risk alleles relevant predominantly in contemporary settings, one would predict lower schizophrenia PGS in ancient samples, reflecting the loss of these risk alleles in modern populations. Contrary to this expectation, we observed an elevated schizophrenia PGS in ancient samples, a trend paralleled in neuroticism and depression.
It is crucial to recognize that GWAS identify both positive and negative genetic correlations with traits. These findings are integrated into the calculation of PGS, which aggregates the varied impacts of genetic variants, both augmenting and diminishing the trait, weighted by their respective effect sizes. If alleles that increase a trait’s likelihood diminish in influence in ancient populations, the same principle should apply to alleles that decrease the trait’s likelihood. Ideally, these contrasting effects should neutralize each other, maintaining a balance in the PGS across different time periods.
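The aggregation described here can be made concrete: a PGS is just the dosage-weighted sum of per-variant effect sizes, so trait-increasing and trait-decreasing alleles enter the score symmetrically. A toy sketch with invented effect sizes and genotypes (real scores aggregate thousands of variants):

```python
import numpy as np

# Toy effect sizes from a hypothetical GWAS: positive weights increase the
# trait, negative weights decrease it.
betas = np.array([0.04, -0.03, 0.02, -0.05, 0.01])

# Genotype dosages (0, 1 or 2 copies of the effect allele) for three individuals.
dosages = np.array([
    [0, 1, 2, 0, 1],
    [2, 2, 0, 1, 0],
    [1, 0, 1, 2, 2],
])

# The PGS weights both trait-raising and trait-lowering alleles by effect size,
# so drift or selection acting symmetrically on both classes would leave the
# expected score unchanged.
pgs = dosages @ betas
print(pgs)  # [ 0.02 -0.03 -0.02]
```

This is why a uniform loss of "environment-specific" alleles would be expected to shift the score only if it affected risk-increasing and risk-decreasing variants asymmetrically.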
In conclusion, our study highlights the intricate nature of genetic evolution, influenced by cultural, societal, and environmental factors. It affirms the weak hypothesis of Clark’s model, showing a notable increase in genetic predispositions for intelligence and EA across Europe post-Middle Ages. These insights contribute to our understanding of human evolutionary dynamics over the last several millennia.
https://www.cambridge.org/core/journ...B0BC8E37D9273D






Many findings seem to be in accord with this paper, which has yet to be peer-reviewed.
Discussion
Figure 3: Gallery of notable single-locus selection trajectories.
Each panel displays the derived allele frequency trajectory over time for a variant (uncorrected for structure), along with selection coefficient (s), selection statistic (X), and posterior probability of selection (π). Circles represent frequencies in Western Hunter-Gatherers (orange), Early European Farmers (green), and Steppe Pastoralists (blue). The highlighted loci are not necessarily those with the strongest signals, and even include negative results. We highlight them here because of their biological interest and because they speak to long-standing debates. For Panels 4, 5, 6, 33, 35, and 36 separate analyses are shown for transects before and after a manually selected peak (marked by a black line), with 200-year overlap. In cases where π>90%, the confidence interval is shaded blue (or blue for before and red for after the split); otherwise, the shading is gray. Variants reported in other ancient DNA studies are marked with an asterisk.
Figure 4: Coordinated selection on alleles affecting same traits (polygenic adaptation).
The polygenic score of Western Eurasians over 14,000 years in black, with 95% confidence interval in gray. Red represents the linear mixed model regression, adjusted for population structure, with slope γ. Three tests of polygenic selection—γ, γsign, and rs—are all significant for each of these twelve traits, with the relevant statistics at the top of each panel.
Previous work has shown that classic selective sweeps driving alleles to fixation have been rare over the broad span of human evolution67,68. Thus, we were surprised that over the last 14,000 years in West Eurasia there have been many hundreds of instances of directional selection with coefficients on the order of 0.5% or more (Figure 2b). This is large enough that if a similarly dense landscape of directionally selected variants had existed tens of thousands of years ago, and if the selection coefficients had been constant since then, we would expect many fixed differences across populations, despite the fact that previous studies have shown there are only a handful—hardly more than would be expected based on random drift68.
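The fixation argument above can be checked with the standard deterministic recursion for an allele under genic selection (per-copy fitness 1 + s). The starting frequency and the 28-year generation time below are illustrative assumptions, not values from the preprint:

```python
def trajectory(p0, s, generations):
    """Deterministic frequency of an allele with per-copy fitness 1 + s."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + s * p)  # p' = p(1+s) / mean fitness
    return p

s = 0.005       # selection coefficient of the order reported above
gen_time = 28   # assumed years per generation

# Over the ~14,000-year transect: a large but incomplete frequency shift.
print(trajectory(0.05, s, 14_000 // gen_time))  # ~0.39

# Sustained for 50,000 years, the same coefficient drives near-fixation
# (~0.997), which is why constant selection over such spans would predict
# many fixed differences between populations.
print(trajectory(0.05, s, 50_000 // gen_time))
```

The contrast between the two horizons illustrates the paradox: coefficients of this size are compatible with the Holocene data yet could not have been constant over tens of millennia without producing far more fixed differences than are observed.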
An alternative explanation for this paradox is to hypothesize that West Eurasians have been experiencing qualitatively more and different natural selection in the Holocene than in earlier periods because of rapidly changing lifestyles and economies. Without a comparable time transect before the advent of food production and of societies with high population densities, it is impossible to test this directly. However, this hypothesis is consistent with our evidence of particularly intense selection for blood-immune-inflammatory traits, and our evidence that selection for these traits became even stronger in the Bronze Age than it was in earlier periods (Figure 1c, Extended Data Figure 4b).
We project that there are at least 5000 independent signals of directional selection (half of the 10,361 non-HLA loci found at the FDR=50% threshold) that are in linkage disequilibrium with the overwhelming majority of variants in the genome (Extended Data Figure 2b). This seems to be at odds with findings that there has been relatively little contribution from directional selection to allele frequency changes in the genome compared to the much larger forces of gene flow, genetic drift, and purifying or stabilizing selection69. In fact, there is no conflict. Our method allows us to partition the effects of selection at each SNP into the effects of directional selection (s) and the combined effects of fluctuating selection and drift (σ2). We estimate that only 2.35 ± 0.13% (jackknife standard deviation) of allele frequency changes are due to directional selection. These results suggest that selection is so pervasive that even though only a tiny fraction of allele-frequency change is due to directional selection, this fraction still corresponds to many hundreds of loci. A corollary is that recent studies finding that stabilizing selection is relatively more important than directional selection in shaping the human allele frequency spectrum70 are fully reconcilable with our analyses.
It is important to apply similar approaches to ancient DNA time series over longer times and to other world regions. Comparison of ancient DNA time transects would allow more generalizable insights by identifying which patterns of selection are shared and which are distinctive to the human population history of Holocene West Eurasia.
https://www.biorxiv.org/content/10.1....613021v1.full






Despite using the search bar to look for threads on these studies, I could only find one, based on the first paper, in the "similar threads" panel. Anyway, at least it seems the one by David Reich hasn't had any thread on it until now.
I wonder why they didn't plot a graph for autism, TA deserved it.