Poor sleep may accelerate brain aging

People who sleep poorly are more likely than others to have brains that appear older than they actually are. This is according to a comprehensive brain imaging study from Karolinska Institutet, published in the journal eBioMedicine. The paper is titled “Poor sleep health is associated with older brain age: the role of systemic inflammation.”

Increased inflammation in the body may partly explain the association.

Poor sleep has been linked to dementia, but it is unclear whether unhealthy sleep habits contribute to the development of dementia or whether they are rather early symptoms of the disease.

In a new study, researchers at Karolinska Institutet have investigated the link between sleep characteristics and how old the brain appears in relation to its chronological age.

The study includes 27,500 middle-aged and older people from the UK Biobank who underwent magnetic resonance imaging (MRI) of the brain. Using machine learning, the researchers estimated the biological age of the brain based on over a thousand brain MRI phenotypes.
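For readers curious what such a "brain age" model looks like in practice, here is a minimal, illustrative sketch, not the authors' actual pipeline: a regression model is trained to predict chronological age from MRI-derived features, and the difference between predicted and actual age is treated as the brain age gap. All data, feature counts, and model choices below are simulated assumptions.

```python
# Minimal sketch of the general "brain age" approach described above, NOT the
# authors' exact pipeline: fit a regression model that predicts chronological
# age from MRI-derived features, then treat (predicted - actual) age as the
# brain age gap. Features and data here are simulated placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_subjects, n_phenotypes = 5000, 1000          # ~1,000 MRI phenotypes, as in the study
age = rng.uniform(45, 80, n_subjects)          # chronological age
# Simulated MRI phenotypes that partly track age, plus noise
X = age[:, None] * rng.normal(0.02, 0.01, (1, n_phenotypes)) + rng.normal(size=(n_subjects, n_phenotypes))

X_train, X_test, age_train, age_test = train_test_split(X, age, random_state=0)
model = Ridge(alpha=10.0).fit(X_train, age_train)

predicted_age = model.predict(X_test)          # estimated "biological" brain age
brain_age_gap = predicted_age - age_test       # positive gap = brain looks older
print(f"Mean brain age gap: {brain_age_gap.mean():.2f} years")
```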

Low-grade inflammation

The participants’ sleep quality was scored based on five self-reported factors: chronotype (being a morning/evening person), sleep duration, insomnia, snoring, and daytime sleepiness. They were then divided into three groups: healthy (≥4 points), intermediate (2–3 points), or poor (≤1 point) sleep.

“The gap between brain age and chronological age widened by about six months for every 1-point decrease in healthy sleep score,” explains Abigail Dove, researcher at the Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, who led the study. “People with poor sleep had brains that appeared on average one year older than their actual age.”

To understand how poor sleep can affect the brain, the researchers also examined levels of low-grade inflammation in the body. They found that inflammation could explain just over ten percent of the link between poor sleep and older brain age.

“Our findings provide evidence that poor sleep may contribute to accelerated brain aging and point to inflammation as one of the underlying mechanisms,” says Abigail Dove. “Since sleep is modifiable, it may be possible to prevent accelerated brain aging and perhaps even cognitive decline through healthier sleep.”

Several possible explanations

Other mechanisms that could explain the association include negative effects on the brain’s waste clearance system, which is active mainly during sleep, and poorer cardiovascular health, which in turn can harm the brain.

Participants in the UK Biobank are healthier than the general UK population, which could limit the generalizability of the findings. Another limitation of the study is that the results are based on self-reported sleep.

The study was conducted in collaboration with researchers from the Swedish School of Sport and Health Sciences, and Tianjin Medical University and Sichuan University in China, among others.

Meet your worm avatar: How microscopic worms are helping find new drugs for rare diseases

New research from the MRC Laboratory of Medical Sciences (LMS) provides a powerful, scalable method for finding treatments for rare genetic diseases using tiny, transparent worms.

The study, led by Dr. André Brown and the Behavioral Phenomics group at the LMS, is published in BMC Biology. It represents a step toward solving a major challenge in medicine: how to develop treatments for the thousands of rare genetic diseases that currently have none. The work builds on a previous study published earlier this year in eLife, and together they mark a shift in how we can model these diseases and test potential treatments at scale.

The rare disease paradox

Rare diseases may be individually uncommon, but together they affect millions of people worldwide. Of the more than 7,000 known rare genetic disorders, fewer than 10% have approved treatments. One of the main obstacles? Economics.

Developing a new drug from scratch typically takes 10 years and costs about $2.5 billion. For diseases that affect just a handful of patients, traditional pharmaceutical investment simply doesn’t add up. That’s left most rare disease families with little more than a diagnosis and no clear path forward.

A scalable alternative: Worm avatars

Enter Caenorhabditis elegans, a tiny nematode worm with surprising power. Using this microscopic organism, researchers can now rapidly create genetic “avatars” of rare diseases—worms engineered to carry the same genetic mutations as human patients.

Worm models of disease are not new, but what sets this latest work apart is the systematic, high-throughput approach the team has developed. By using advanced imaging and behavior-tracking tools, they can now quantify subtle movement differences in mutant worms—what they call “behavioral fingerprints”—and use those signatures to test the effect of hundreds of existing drugs.
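As a rough illustration of the fingerprint idea, and not the LMS group's actual tracking software or analysis, one can think of each strain or treatment as a vector of movement features and score candidate drugs by how far they shift a mutant's fingerprint back toward the wild-type profile. The feature names, values, and drug labels below are invented.

```python
# Illustrative sketch only (not the LMS pipeline): represent each worm strain or
# treatment as a "behavioral fingerprint" -- a vector of movement features --
# and rank hypothetical drugs by how far they shift the mutant fingerprint
# back toward the wild-type one. All data and drug names are made up.
import numpy as np

features  = ["speed", "curvature", "pause_fraction", "reversal_rate"]
wild_type = np.array([1.00, 0.50, 0.10, 0.05])
mutant    = np.array([0.60, 0.80, 0.30, 0.15])

drug_effects = {                       # mutant fingerprint after each (hypothetical) drug
    "drug_A": np.array([0.95, 0.55, 0.12, 0.06]),
    "drug_B": np.array([0.65, 0.78, 0.28, 0.14]),
    "drug_C": np.array([0.80, 0.60, 0.20, 0.10]),
}

def distance_to_wild_type(fingerprint):
    """Euclidean distance between a fingerprint and the wild-type profile."""
    return float(np.linalg.norm(fingerprint - wild_type))

baseline = distance_to_wild_type(mutant)
rescue = {name: baseline - distance_to_wild_type(fp) for name, fp in drug_effects.items()}
for name, score in sorted(rescue.items(), key=lambda kv: -kv[1]):
    print(f"{name}: rescue score {score:.2f}")   # higher = fingerprint moves closer to wild type
```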

This is particularly valuable because many rare diseases affect the nervous system, meaning they often produce behavioral phenotypes, not just developmental ones, and behavioral changes are much harder to spot by eye.

Why drug repurposing?

Instead of starting from scratch, the researchers are focused on repurposing existing drugs—ones already shown to be safe in humans. That dramatically speeds up the path to the clinic.

It’s not just theoretical. The drug Epalrestat made it from a worm model to a Phase III clinical trial in just five years, at a fraction of the typical cost—roughly $5 million. Another compound, Ravicti, followed a similar trajectory after being identified in an initial worm screen.

What’s new in the latest study?

While the first study, published in eLife in January, focused on gene knockouts—completely disabling certain genes—the latest paper takes things a step further. It introduces patient-specific mutations into the worms, mimicking the exact DNA changes found in individuals with ultra-rare conditions.

“These models are closer to what’s actually happening in patients,” explains André. “And we’ve shown that our behavioral phenotyping method works across a wide range of these models.”

The team’s goal now is to extend this approach to thousands of diseases.

The promise is enormous. The team believes that with sufficient investment, it would be possible to create worm avatars for every rare disease with a conserved gene and systematically screen existing drugs for therapeutic effects.

In the long term, that could mean faster, cheaper access to treatments for families who currently have no options.

“It’s a different approach to disease modeling—cheaper, faster, and more scalable. We still have a lot to learn, and not every model will lead to a treatment,” says André. “But we now know it’s possible to do this systematically. That’s a new opportunity.”

RSV vaccines are safe and effective, review finds

A new Cochrane review demonstrates that vaccines for respiratory syncytial virus (RSV) are both safe and effective in protecting vulnerable groups that are most at risk of serious illness, including older adults and infants.

RSV is a common virus that causes coughs and colds but can also lead to life-threatening lung infections like pneumonia. Children under the age of two are at the highest risk of severe RSV infection and death, with older adults also vulnerable.

An international group of researchers analyzed 14 clinical trials with over 100,000 participants, including older adults, pregnant women, women of childbearing age, and children. Trials were conducted across a wide range of countries, spanning all continents. The work is published in the Cochrane Database of Systematic Reviews.

Results showed strong evidence that RSV prefusion vaccines in older adults reduce RSV-associated lower respiratory tract disease (such as pneumonia and bronchitis) by 77% and RSV-associated acute respiratory disease (such as the common cold) by 67%. Vaccination of pregnant individuals with an RSV F protein-based vaccine reduced the risk of their children needing medical care for RSV-associated lower respiratory tract disease by 54%, reduced the babies’ chance of severe RSV-related disease by 74%, and lowered the risk of hospitalization by 54%.

“From our review of clinical trials, we found high-certainty evidence that RSV vaccines protect older adults and strong evidence they benefit infants when mothers are vaccinated during pregnancy,” said Dr. KM Saif-Ur-Rahman, lead author and Senior Research Methodologist at Evidence Synthesis Ireland and Cochrane Ireland, University of Galway, Ireland. “That’s encouraging news for two of the groups most at risk.”

The review found little to no difference in serious side effects between vaccinated and unvaccinated groups across all age groups.

The findings of this review are based on clinical trial data, as real-world evidence on effectiveness and safety was not yet available at the time of publication.

“It’s important to be clear that our review is based on evidence from randomized trials, the strongest evidence available,” said Kate Olsson, author and vaccine expert from the European Centre for Disease Prevention and Control (ECDC). “Post authorization real-world studies are ongoing and data from those studies will continue to add to what we know about the safety and effectiveness of these RSV vaccines.”

The systematic review is due to be complemented by two additional analyses of the efficacy, effectiveness, and safety of different RSV vaccines following search updates. ECDC plans to publish the first update with new data in the coming weeks.

Genetic adaptation helps Turkana people conserve water in harsh desert climate

Cornell researchers have contributed to a multi-institutional study of how the nomadic Turkana people of northern Kenya—who have lived for thousands of years in extreme desert conditions—evolved to survive, showing humans’ resilience in even the harshest environments.

In the study, published in Science on Sept. 18, a team of researchers from Kenya and the U.S., working with Turkana communities, identified eight regions of DNA in the genomes of the Turkana that have evolved through natural selection in the last 5,000 to 8,000 years. One gene in particular showed exceptionally strong evidence of recent adaptation: STC1, which helps the kidneys conserve water and may also protect against the waste products of a diet rich in red meat, like the Turkana’s.

Cornell researchers helped to identify when and how the adaptive variant of STC1 emerged and to link it to changes in the environment, finding that the Turkana’s ability to thrive with less water emerged around 5,000 to 8,000 years ago, at the same time northern Kenya went through a period of aridification.

“The project really looks at it from all these different angles and comes up with this quite coherent story which sets it apart from other studies,” said Philipp Messer, associate professor of computational biology in the College of Agriculture and Life Sciences.

Four years ago, lead researchers from the project—from the University of California, Berkeley; Vanderbilt University; the Nairobi-based Turkana Health and Genomics Project (THGP) and others—had already, after extensive discussions with Turkana elders and community members, sequenced 367 whole genomes and identified the STC1 gene. But they wanted to better understand how the adaptive variant of this gene evolved.

That’s when they connected with Messer and then-graduate student and second author Ian Caldas, Ph.D. ’22, who had developed a method using machine learning and simulations to infer how and when an adaptation emerged and how quickly it spread through a population.

“They wanted to know how this adaptation came about. Was it a new mutation? Did it already exist in the population previously and then become more widely prevalent as it became adaptive?” Messer said. “And Ian had developed this really cool new method to infer those parameters from genomic data.”

Messer and Caldas found that the STC1 adaptation had likely already been present in the population at a low frequency long before it began to increase between 5,000 and 8,000 years ago. In another East African population, the Daasanach, researchers found that the adaptation arose independently at around the same time.

“This made a lot of sense because that’s when a lot of aridification happened in the region,” Messer said. “We were also able to measure how strong selection was at this locus, and it’s very strong.”

They calculated that the selection coefficient is around 5%, which means Turkana with the adaptive variant of the gene, on average, had 5% more offspring than those without it. “It might seem like a small number, but if you have enough individuals, then it becomes statistically significant, and that adaptation is very likely to spread through the population,” Messer said. “Five percent is in line with the strongest other examples of recent adaptation in humans that we know of.”
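To give a sense of what a selection coefficient of about 5% means in practice, here is a back-of-the-envelope sketch using the textbook deterministic model for an advantageous allele. This is not the machine-learning inference method Caldas developed, and the starting frequency and generation time are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope illustration of what a ~5% selection coefficient means,
# using the standard deterministic recursion for an advantageous allele (this is
# NOT the machine-learning inference method used in the study).
s = 0.05                      # carriers leave ~5% more offspring on average
generations = 300             # roughly 7,500 years at an assumed ~25 years/generation
p = 0.01                      # assumed low starting allele frequency

trajectory = [p]
for _ in range(generations):
    # Additive/haploid approximation: frequency after one generation of selection
    p = p * (1 + s) / (p * (1 + s) + (1 - p))
    trajectory.append(p)

for g in range(0, generations + 1, 60):
    print(f"generation {g:3d}: allele frequency {trajectory[g]:.2f}")
```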

The study provides a uniquely robust link between the environment, genetic adaptation and the human phenotype and lived experience of the Turkana: across years of blood and urine sampling, the research team found that 90% of participants were technically dehydrated but otherwise healthy. The Turkana get an estimated 70 to 80% of their nutrition from animal products such as milk, blood and meat, yet gout, which can be caused by a buildup of waste products from the body’s processing of red meat, is rare in the community.

The research underlines humans’ ability to survive and adapt to harsh environments—which is particularly germane given the impending impacts of climate change, the authors write. The study also has direct implications for modern Turkana communities: as more of the population transitions to urban environments, their genetic makeup may turn from beneficial to detrimental, a phenomenon called evolutionary mismatch. The broader research team found that Turkana living in cities are more prone to chronic diseases such as hypertension and obesity.

The team is currently working on a podcast, in the native language, to reach Turkana communities and pass on the knowledge gleaned from the study.

Childhood stress strongly linked to chronic disease in adulthood, researchers report

Duke University researchers have found a strong link between higher stress in childhood and adverse health conditions later in life. Appearing in the journal Proceedings of the National Academy of Sciences, the study used measurable markers of health collected over time to create a more quantitative view of how stress early in life affects health.

“We’ve had an idea for a long time, since the ’80s at least, that when children have adversity in their lives, it affects how their bodies work, not just psychologically, but also physiologically. It gets underneath the skin, and it becomes embodied in the way your body handles stress,” said co-author Herman Pontzer, Duke professor of evolutionary anthropology and global health.

Researchers focused on allostatic load (AL), which refers to the wear and tear on the body caused by chronic stress. The researchers “tested associations between childhood AL and adult cardiometabolic health,” relying on biomarkers that included C-reactive protein, a marker of inflammation in the body; antibodies to the Epstein-Barr virus, which is common and highly contagious; body mass index; and blood pressure.
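The paper's exact scoring procedure is not detailed here, but one common way to build an allostatic load index, sketched below purely for illustration, is to flag each biomarker that falls in its highest-risk quartile and sum the flags. All values and cutoffs in the example are simulated assumptions, not the study's data.

```python
# Hedged illustration of one common allostatic load (AL) index construction
# (not necessarily the scoring used in the PNAS paper): flag each biomarker
# that falls in its highest-risk quartile and sum the flags per child.
import numpy as np

rng = np.random.default_rng(1)
n = 200
biomarkers = {                              # simulated values, arbitrary units
    "crp": rng.lognormal(0.0, 0.8, n),      # C-reactive protein
    "ebv_antibodies": rng.lognormal(1.0, 0.5, n),
    "bmi": rng.normal(18, 3, n),
    "systolic_bp": rng.normal(105, 10, n),
}

flags = []
for values in biomarkers.values():
    cutoff = np.percentile(values, 75)      # top quartile treated as "high risk"
    flags.append(values >= cutoff)

allostatic_load = np.sum(flags, axis=0)     # score of 0-4 per child
print("Mean AL score:", allostatic_load.mean())
```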

Analysis by lead author Elena Hinz, a Ph.D. student in the Pontzer Lab at Duke, showed that stress levels in children as young as 9 to 11 years old are an indicator of their cardiometabolic health in adulthood.

Usually, researchers ask adults to recall the stress they experienced as children. Hinz and her fellow researchers instead used a large study that collected quantitative—not just qualitative—data over time.

The paper’s authors reviewed data from the Great Smoky Mountains Study (GSMS), a longitudinal study of child psychiatric disorders that began in 1992—and continues today—to determine the need for mental health services.

Hinz, who grew up in a rural community in East Tennessee, said her own coming-of-age experiences spurred her interest in childhood stress.

“I’m from the rural South and kind of have this idea of what stress looks like in that environment, in terms of childhood adversity and dietary stress and the physical environment that kids are in,” said Hinz.

Hinz said humans combat acute stress through a fight or flight response: “Your body collectively reacts by increasing your heart rate and blood pressure when you are experiencing a stressful situation,” she explained. “Those and other responses help you deal with that stress, but it’s not good to always be in that state. I’m interested in what happens when that doesn’t really subside.”

Poverty is at the crux of the study, which indicates a stable, financially secure home is essential for a healthy childhood free of chronic stress.

“Children, eight, nine and ten years old—what’s happening to them seems to be predictive of blood pressure,” said Pontzer about early life stress.

“What helps is education and job training and all of the stuff that gets communities out of poverty. That gets people the help they need when they need it, as opposed to health care cost barriers. Making sure that a kid knows there’s going to be dinner and food on the table because that psychological stress isn’t just psychological,” said Pontzer. “It gets into the way your body works.”

Unique pan-cancer immunotherapy destroys tumors without attacking healthy tissue

A new, highly potent class of immunotherapeutics with unique Velcro-like binding properties can kill diverse cancer types without harming normal tissue, University of California, Irvine cancer researchers have demonstrated.

A team led by Michael Demetriou, MD, Ph.D., has reported that by targeting cancer-associated complex carbohydrate chains called glycans with binding proteins, they could penetrate the protective shields of tumor cells and trigger their death without toxicity to surrounding tissue.

Their biologically engineered immunotherapies—glycan-dependent T cell recruiter (GlyTR, pronounced ‘glitter’) compounds GlyTR1 and GlyTR2—proved safe and effective in models of a spectrum of cancers, including those of the breast, colon, lung, ovaries, pancreas and prostate, the researchers report in the journal Cell.

“It’s the holy grail—one treatment to kill virtually all cancers,” said Demetriou, a professor of neurology, microbiology and molecular genetics at the UC Irvine School of Medicine and the paper’s corresponding author. “GlyTR’s velcro-like sugar-binding technology addresses the two major issues limiting current cancer immunotherapies: distinguishing cancer from normal tissue and cancer’s ability to suppress the immune system.”

Landmark research

The study’s publication, the culmination of a decade of research, is a watershed moment and source of pride for UC Irvine and the UCI Health Chao Family Comprehensive Cancer Center.

“This landmark study is a paradigm shift with the very real potential to change how we treat cancer patients,” said Marian Waterman, Ph.D., former deputy director of research at the cancer center and champion of the project since Demetriou, a UCI Health neurologist, began working on the concept in 2015 with his then-postdoctoral fellow, Raymond W. Zhou, the study’s first author.

Added Richard A. Van Etten, MD, Ph.D., director of the cancer center and an early supporter of the GlyTR project, “This novel technology may, for the first time, allow the widespread application of targeted T-cell therapy to solid tumors, which is the ‘holy grail’ in the immuno-oncology field.”

Current treatments, such as chimeric antigen receptor (CAR) T therapy, use the body’s white blood cells to attack cancer. They have largely worked only for blood cancers, such as leukemia. The GlyTR technology also proved effective in targeting leukemia, the study shows.

Unorthodox approach

While many cancer researchers have sought protein biomarkers for specific cancers, Demetriou and Zhou aimed at a more abundant target, the unique coating of glycans that surround cancer cells but are found in very low density in normal cells.

These complex sugar chains are the most widespread cancer antigens known but were generally ignored by researchers because they are inert to the immune system.

To solve this problem, Demetriou and Zhou engineered the GlyTR compounds to attach themselves, Velcro-like, to glycan-dense cancer cells while ignoring low-glycan-density normal cells. Once attached, the GlyTR compounds identify the cancer cells as targets for killing by the body’s immune system.

In contrast, current cancer immunotherapies attack cells based on specific proteins regardless of their glycan density and thereby fail to distinguish tumor cells from healthy tissue.

A second impediment to developing broadly active cancer immunotherapies is the shield glycans form around solid tumors.

By targeting glycans and blanketing the tumor cells with the Velcro-like compounds, the GlyTR technology overcomes both obstacles.

Human trials

The next step will be testing the therapy’s safety and effectiveness in humans. Clinical grade GlyTR1 protein manufacturing is already being developed at the NCI Experimental Therapeutics program labs in Maryland, Demetriou said.

That will enable the launch of a phase 1 clinical trial, which could begin within about two years. It will test the therapy in patients with a range of metastatic solid cancers. The highest glycan density is typically seen in patients with refractory/metastatic disease, a population that also has the greatest unmet need for treatment.

“This is the revolutionary approach to cancer treatment our patients have been waiting for,” said Farshid Dayyani, MD, Ph.D., medical director of the cancer center’s Stern Center for Clinical Trials and Research. “We are committing all available resources to bring this exciting new trial to UCI Health as fast as possible.”

A new look at how the brain works reveals that wiring isn’t everything

How a brain’s anatomical structure relates to its function is one of the most important questions in neuroscience. It explores how physical components, such as neurons and their connections, give rise to complex behaviors and thoughts. A recent study of the brain of the tiny worm C. elegans provides a surprising answer: Structure alone doesn’t explain how the brain works.

C. elegans is often used in neuroscience research because, unlike the incredibly complex human brain, which has billions of connections, the worm has a very simple nervous system with only 302 neurons. A complete, detailed map of every single one of its connections, known as a brain wiring diagram or connectome, was completed several years ago, making the worm ideal for study.

In this research, scientists compared the worm’s physical wiring in the brain to its signaling network, how the signals travel from one neuron to another. First, they used an electron microscope to get a detailed map of the physical connections between its nerve cells. Then, they activated individual neurons with light to create a signaling network and used a technique called calcium imaging to observe which other neurons responded to this stimulation. Finally, they used computer programs to compare the physical wiring map and the signal flow map, identifying any differences and areas of overlap.
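The final comparison step can be pictured with a toy example, which is not the study's code or data: represent the anatomical wiring and the signal-propagation measurements as two matrices over the same neurons and quantify how much they agree, for instance by edge-wise correlation and by the fraction of functional connections backed by a direct anatomical wire. Everything below is simulated.

```python
# Toy sketch of the comparison step described above (not the study's actual
# code or data): given an anatomical wiring matrix and a functional signal-
# propagation matrix over the same neurons, quantify how similar they are.
import numpy as np

rng = np.random.default_rng(2)
n_neurons = 302                              # C. elegans neuron count

structure = (rng.random((n_neurons, n_neurons)) < 0.05).astype(float)   # simulated connectome
# Simulated functional map: partly follows the wiring, partly does not
function = 0.5 * structure + 0.5 * (rng.random((n_neurons, n_neurons)) < 0.05)

# Edge-wise correlation between the two matrices (off-diagonal entries only)
mask = ~np.eye(n_neurons, dtype=bool)
corr = np.corrcoef(structure[mask], function[mask])[0, 1]

# Fraction of functional connections that have a direct anatomical edge
functional_edges = function[mask] > 0
overlap = (structure[mask][functional_edges] > 0).mean()

print(f"Structure-function correlation: {corr:.2f}")
print(f"Functional edges with a direct wire: {overlap:.1%}")
```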

The team discovered that the brain’s functional organization differs from its anatomical structure. An analogy is that the brain’s structure is like a city map showing every street. However, the function is more akin to traffic flow, with jams, detours and shortcuts that are not visible on the map. In other words, brain activity does not always follow the predictable pathways of its physical wiring.

“Our results provide new insight into the interplay between brain structure, in the form of a complete synaptic-level connectome, and brain function, in the form of a system-wide causal signal propagation atlas,” wrote the researchers in their paper published in PRX Life. “Collectively, our findings suggest that the effective signaling network…has different network properties from the underlying connectome.”

While the physical and signal maps differed significantly, a few features were preserved. For example, the physical and signal wiring maps of the worm’s feeding organ (pharynx) look the same.

What it means for us

The research may well have been conducted in a tiny worm, but the findings have enormous implications for us. They suggest that scientists need to look beyond the brain’s wiring to fully understand how it works. This may help improve our understanding of neurological disorders like Alzheimer’s and schizophrenia, which involve an interruption in the brain’s ability to process information.

Key driver of pancreatic cancer spread identified

A Cornell-led study has revealed how a deadly form of pancreatic cancer enters the bloodstream, solving a long-standing mystery of how the disease spreads and identifying a promising target for therapy.

Pancreatic ductal adenocarcinoma is among the most lethal cancers, with fewer than 10% of patients surviving five years after diagnosis. Its microenvironment is a dense, fibrotic tissue that acts like armor around the tumor. This barrier makes drug delivery difficult and should, in theory, prevent the tumor from spreading. Yet the cancer metastasizes with striking efficiency—a paradox that has puzzled scientists.

New research published in the journal Molecular Cancer reveals that a biological receptor called ALK7 is responsible, by activating two interconnected pathways that work in tandem. One makes cancer cells more mobile through a process called epithelial-mesenchymal transition, and the other produces enzymes that physically break down the blood vessel walls.

“In other words, ALK7 gives pancreatic cancer cells both the engine to move and the tools to invade,” said Esak Lee, lead author of the study and assistant professor in the Meinig School of Biomedical Engineering in Cornell Engineering.

The research helps resolve conflicting findings about ALK7, which some studies had linked to blocking cancer spread while others had tied it to driving it. Using mouse models of pancreatic cancer and advanced organ-on-chip systems that mimic human blood vessels, the researchers showed that blocking ALK7 significantly slowed metastasis.

The organ-on-chip system, developed in Lee’s lab, simulates the tumor microenvironment and is superior to animal models for studying different stages of the cancer. Using it, the researchers studied whether ALK7 drives the initial invasion of blood vessels or the later stage, when circulating tumor cells exit the bloodstream to form new tumors in organs such as the lungs or liver.

What they found was that cancer cells couldn’t enter blood vessels when ALK7 was inhibited. But when the researchers mimicked a later stage of cancer by placing the cells directly inside the vessels, the cells spread quickly, indicating that the timing of treatment is crucial.

“Once we miss this early opportunity to block ALK7 receptors, the cancer cells can freely circulate in the bloodstream and easily seed into other organs,” Lee said. “But if we can inhibit ALK7 at the cancer’s earliest and most vulnerable stage, we might see better outcomes for patients.”

The study also highlights the potential to apply organ-on-chip platforms to study other types of cancers, or how immune cells infiltrate and exit vessels.

“Some cancers have very different microenvironments so, potentially, ALK7 might show different impacts,” Lee said. “I hope this study really opens a new avenue for cancer research.”

Popular keto diet linked to glucose intolerance and fatty liver in mice

Avocado toast with fried cheese as the bread and zucchini noodles in butter-bacon sauce are among the many recipe ideas fueling social media’s beloved high-fat, low-carbohydrate ketogenic, or keto diet. However, scientists have found that while keto can lead to limited weight gain and even weight loss, it does so at the cost of metabolic issues like glucose intolerance.

In a joint study by the University of Utah and the Utah Diabetes and Metabolism Research Center in the U.S., scientists divided mice into four dietary groups: a ketogenic diet (KD, 90% fat), a high-fat diet (HFD, 60% fat), a low-fat diet (LFD), and a low-fat, moderate-protein diet (LFMP), with varying levels of carbohydrates and protein.

The mice were allowed to eat freely for up to 36 weeks (males) or 44 weeks (females). Test results showed that while the KD supported weight control, it also raised blood cholesterol levels and led to fatty liver in males.

The findings are published in Science Advances.

The diet is named “ketogenic” because it sends the body into ketosis—a metabolic state in which the body burns fat as its primary fuel instead of the usual carbohydrates and, as a result, produces molecules called ketone bodies. The diet isn’t a new food trend; it has been around for nearly 100 years and is well established for treating drug-resistant epilepsy in children, reducing seizures in many cases.

The exact way a KD helps control seizures is still unclear, but several ideas have been proposed. Some studies have suggested that KD can stabilize blood glucose (BG), which is beneficial to brain metabolism and neurotransmitter activity, while others highlight the anticonvulsant effects of ketone bodies themselves.

As ketogenic diets have exploded in popularity in recent years, the researchers of this study wanted to explore the long-term metabolic effects of following the diet. Previous studies have explored this arena, but often reported mixed findings or did not directly compare KDs to other diets or account for both sexes.

For this study, the researchers included both male and female mice and carried out regular check-ins to assess the long-term effects of KD on health parameters, including body composition, organ health, and blood profile.

They found that the KD protected against excessive weight gain compared with the conventional high-fat diet, although KD-fed mice still gained more weight than those on the low-fat diets. Long-term KD also caused severe hyperlipidemia, meaning there were very high levels of fat in the blood.

KD-fed mice also developed severe glucose intolerance because the diet severely impaired insulin secretion from the pancreas. While female mice on the KD appeared largely unaffected, the males showed signs of liver dysfunction and fatty liver.

The findings made it quite evident that long-term KD can trigger several metabolic disturbances, raising caution against its widespread use as a health-promoting diet.

The researchers note that further studies are needed to explore how variations in fat composition and macronutrient ratios affect metabolic effects. This is especially important for making the ketogenic diet safer for people who rely on it to treat epilepsy.

Drinking any amount of alcohol likely increases dementia risk

Drinking any amount of alcohol likely increases the risk of dementia, suggests the largest combined observational and genetic study to date, published in BMJ Evidence-Based Medicine.

Even light drinking—generally thought to be protective, based on observational studies—is unlikely to lower the risk, which rises in tandem with the quantity of alcohol consumed, the research indicates.

Current thinking suggests that there might be an “optimal dose” of alcohol for brain health, but most of the studies behind this idea have focused on older people and/or didn’t differentiate between former drinkers and lifelong non-drinkers, complicating efforts to infer causality, note the researchers.

To circumvent these issues and strengthen the evidence base, the researchers drew on observational data and genetic methods (Mendelian randomization) from two large biological databanks covering the entire dose range of alcohol consumption. These were the US Million Veteran Program (MVP), which includes people of European, African, and Latin American ancestry, and the UK Biobank (UKB), which includes people of predominantly European ancestry.

Participants who were aged 56–72 at baseline were monitored from recruitment until their first dementia diagnosis, death, or the date of last follow-up (December 2019 for MVP and January 2022 for UKB), whichever came first. The average monitoring period was four years for the US group, and 12 for the UK group.

Alcohol consumption was derived from questionnaire responses—over 90% of participants said they drank alcohol—and the Alcohol Use Disorders Identification Test (AUDIT-C) clinical screening tool. This screens for hazardous drinking patterns, including the frequency of binge drinking (6 or more drinks at a time).

In all, 559,559 participants from both groups were included in observational analyses, 14,540 of whom developed dementia of any type during the monitoring period: 10,564 in the US group; and 3,976 in the UK group. In total, 48,034 died: 28,738 in the US group and 19,296 in the UK group.

Observational analyses revealed U-shaped associations between alcohol and dementia risk: Compared with light drinkers (fewer than seven drinks a week), a 41% higher risk was observed among non-drinkers and heavy drinkers consuming 40 or more drinks a week, rising to a 51% higher risk among those who were alcohol-dependent.

Mendelian randomization genetic analyses drew on key data from multiple large individual genome-wide association studies (GWAS) of dementia, involving a total of 2.4 million participants to ascertain lifetime (rather than current) genetically predicted risks.

Mendelian randomization leverages genetic data, minimizing the impact of other potentially influential factors, to estimate causal effects: genomic risk for a trait (in this case, alcohol consumption) essentially stands in for the trait itself.
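One standard way such per-variant estimates are combined, shown here only as an illustration and not necessarily the exact estimator used in the paper, is the inverse-variance-weighted (IVW) approach: each genetic variant yields a Wald ratio (its effect on dementia divided by its effect on alcohol consumption), and the ratios are averaged with inverse-variance weights. The effect sizes below are invented.

```python
# Minimal sketch of the standard inverse-variance-weighted (IVW) Mendelian
# randomization estimator. Each genetic variant gives a Wald ratio
# (variant-dementia effect / variant-alcohol effect); IVW combines them.
# All effect sizes below are invented for illustration only.
import numpy as np

# Per-variant effects on the exposure (drinks/week) and the outcome (log-odds of dementia)
beta_exposure = np.array([0.10, 0.08, 0.12, 0.05, 0.09])
beta_outcome  = np.array([0.015, 0.010, 0.020, 0.006, 0.014])
se_outcome    = np.array([0.004, 0.005, 0.006, 0.003, 0.004])

wald_ratios = beta_outcome / beta_exposure        # per-variant causal estimates
weights = (beta_exposure / se_outcome) ** 2       # inverse-variance weights

ivw_estimate = np.sum(wald_ratios * weights) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))
print(f"IVW causal estimate: {ivw_estimate:.3f} (SE {ivw_se:.3f}) log-odds per drink/week")
```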

Three genetic measures related to alcohol use were used as different exposures to study the impact of alcohol quantity, as well as problematic and dependent drinking, on dementia risk.

These exposures were self-reported weekly drinks (641 independent genetic variants); problematic “risky” drinking (80 genetic variants); and alcohol dependency (66 genetic variants).

Higher genetic risk for all three exposures was associated with an increased risk of dementia, with dementia risk increasing linearly the higher the alcohol consumption.

For example, an extra 1–3 drinks a week was associated with a 15% higher risk. A doubling in the genetic risk of alcohol dependency was associated with a 16% increase in dementia risk.

But no U-shaped association was found between alcohol intake and dementia, and no protective effects of low levels of alcohol intake were observed. Instead, dementia risk steadily increased with more genetically predicted drinking.

Additionally, those who went on to develop dementia typically drank less over time in the years preceding their diagnosis, suggesting that reverse causation—whereby early cognitive decline leads to reduced alcohol consumption—underlies the supposed protective effects of alcohol found in previous observational studies, say the researchers.

They acknowledge that a principal limitation of their findings is that the strongest statistical associations were found in people of European ancestry, because of the numbers of participants of this ethnic heritage studied. Mendelian randomization also relies on assumptions that can’t be verified, they add.

Nevertheless, they suggest that their findings “challenge the notion that low levels of alcohol are neuroprotective.”

They conclude, “Our study findings support a detrimental effect of all types of alcohol consumption on dementia risk, with no evidence supporting the previously suggested protective effect of moderate drinking.

“The pattern of reduced alcohol use before dementia diagnosis observed in our study underscores the complexity of inferring causality from observational data, especially in aging populations.

“Our findings highlight the importance of considering reverse causation and residual confounding in studies of alcohol and dementia, and they suggest that reducing alcohol consumption may be an important strategy for dementia prevention.”