Stimulating the central thalamus during anesthesia sheds light on neural basis of consciousness

The brains of mammals continuously combine signals originating from different regions to produce various sensations, emotions, thoughts and behaviors. This process, known as information integration, is what allows brain regions with different functions to collectively form unified experiences.

When mammals are unconscious, for instance when they are under general anesthesia, the brain temporarily loses its ability to integrate information. Studying the mammalian brain both when animals are awake and unconscious could help to better understand the neural processes that contribute to consciousness, potentially offering insight into comatose states and other disorders characterized by alterations in wakefulness.

Researchers at the University of Cambridge, the University of Oxford, McGill University and other institutes worldwide set out to examine the brains of four different species of mammals during anesthesia. Their observations, published in Nature Human Behaviour, offer new insight into the brain regions and gene patterns associated with both unconsciousness and the regaining of consciousness.

“The paper is part of my research program on the neural basis of consciousness,” Andrea Luppi, first author of the paper, told Medical Xpress.

“For the last 10 years I have been pursuing this question. My earlier work focused on comparing what happens to the brain during the unconsciousness induced by general anesthesia, and during coma or other disorders of consciousness (such as what used to be called vegetative state).

“Our paper asks if anesthesia works similarly in the brains of humans, and of other species that are often used as models in neuroscience and clinical research.”

Switching the brain back ‘on’ during anesthesia

Luppi and his colleagues have been investigating the neural processes involved in conscious and unconscious states for almost a decade. Their recent paper focuses on four different animal species: humans, macaques, marmosets and mice.

“Our hope is that by studying different mammals and comparing them with humans, we may be able to narrow down on the most essential mechanisms of consciousness—and learn how to restore it in patients,” said Luppi.

As part of their study, the researchers used functional magnetic resonance imaging (fMRI) to measure the brain activity of humans and of the three other species while they were under general anesthesia. fMRI is a widely used, non-invasive imaging technique that measures brain activity by detecting changes in blood flow.

“Our approach allowed us to track over time how different brain regions interact,” explained Luppi. “We found that when humans and animals are awake, their brains are like a grand orchestra: though different brain parts play different roles, they are clearly all working together to produce the symphony. We call this ‘synergy.’”
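
The paper's actual analysis quantifies synergy with an information-theoretic decomposition of fMRI signals; as a much simpler, purely illustrative stand-in (not the authors' method), the sketch below tracks how pairwise interactions between regions change over time using sliding-window correlations on synthetic data. All names and parameters are hypothetical.

```python
import numpy as np

def sliding_window_connectivity(ts, window=60, step=10):
    """Track how pairwise interactions between regions change over time.

    ts: array of shape (n_timepoints, n_regions), e.g. region-averaged fMRI signals.
    Returns one region-by-region correlation matrix per temporal window.
    """
    n_timepoints, _ = ts.shape
    mats = []
    for start in range(0, n_timepoints - window + 1, step):
        chunk = ts[start:start + window]                 # one temporal window
        mats.append(np.corrcoef(chunk, rowvar=False))    # columns = regions
    return np.stack(mats)

# Toy example: 600 timepoints, 10 regions of synthetic "fMRI" data.
rng = np.random.default_rng(0)
toy_signals = rng.standard_normal((600, 10))
dynamic_fc = sliding_window_connectivity(toy_signals)
print(dynamic_fc.shape)  # (n_windows, 10, 10)
```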

The team observed that this orchestra-like collective activity ceases when all the animals they examined lose consciousness. However, they were able to restore it by stimulating the central thalamus, a region at the center of the brain that is known to relay sensory and motor information, but that may also be acting as conductor for the brain’s orchestra.

“The anesthetized brain is like a random assortment of instruments, each playing to its own tune regardless of what the others are playing,” said Luppi. “However, if you stimulate a small region deep in the brain, called the central thalamus, the animal wakes up from anesthesia—and the brain symphony is back.”

Using computational tools, Luppi and his colleagues modeled the connectivity between different brain areas and how different genes are expressed across areas, both while animals were unconscious and when they regained consciousness. This allowed them to identify neural mechanisms that play a key role in consciousness and that appear to be evolutionarily conserved across all the species they examined.

New insight into the neural roots of consciousness

This recent study improves the present understanding of how the brain restores wakefulness. In the future, the team’s observations could help to devise new treatments for disorders of consciousness that can emerge after brain injuries, infections or tumors, such as comatose, vegetative, minimally conscious and post-traumatic confusional states.

“Finding consistency across many species and many anesthetic drugs is important: what is conserved across evolution is often very fundamental,” said Luppi.

“Perhaps the most important contribution of our study is that we were able to build a computer model that predicts which region one should stimulate, to have the best chances of making the brain symphony-like again. This could be used for trying to identify which region one should stimulate in the brain of a chronically unconscious patient, to try and wake the patient up.”

Luppi and his colleagues are now planning follow-up studies aimed at further exploring the neural mechanisms associated with a return of consciousness after periods of unconsciousness. Their hope is to ultimately inform the design of more reliable and targeted strategies to bring patients back from a coma or other prolonged unconscious states.

“My long-term goal is to understand the mechanisms that govern consciousness, and how we can use pharmacology or brain stimulation to restore consciousness in patients,” added Luppi.

Scientists now know why ovarian cancer spreads so rapidly in the abdomen

Ovarian cancer kills more women than any other gynecological cancer. Most patients receive their diagnosis only after the disease spreads throughout the abdomen. Until now, scientists had not fully understood why this cancer advances so fast.

A new study led by Nagoya University explains why. Published in Science Advances, the study shows that cancer cells recruit help from protective mesothelial cells that normally line the abdominal cavity. Mesothelial cells lead the invasion and cancer cells follow the pathways they create. These hybrid cell clusters resist chemotherapy better than cancer alone.

Researchers examined abdominal fluid from ovarian cancer patients and found something unexpected. Cancer cells do not float alone in the abdominal cavity. Instead, they often grab onto mesothelial cells and form hybrid spheres. About 60% of all cancer spheres contain these recruited mesothelial cells. The cancer cells release a protein called TGF-β1 that transforms the mesothelial cells and causes them to develop spike-like structures that cut through tissue.

Invadopodia, spike structures that do the digging for cancer

When ovarian cancer develops, cancer cells break off from the tumor. These cells enter the abdominal fluid and float freely. The fluid moves around as you breathe and move your body. This movement carries the cancer cells to different spots in the abdomen.

Most other cancers spread differently. Breast cancer or lung cancer cells enter blood vessels. They travel through the bloodstream to reach distant organs. Doctors can sometimes track these cancers through blood tests because blood moves in predictable paths through vessels.

Ovarian cancer cells avoid blood vessels entirely. They float in fluid that has no fixed path. This floating stage happens before the cancer cells attach to new organs. Scientists did not fully understand what happened during the floating period or how cells worked together to spread cancer so quickly.

The research team discovered that cancer cells recruit protective mesothelial cells that have shed from the abdominal cavity lining during this floating stage. The two cell types stick together and form hybrid spheres. The mesothelial cells then grow invadopodia, spike-like structures that drill into surrounding tissue. The hybrid spheres resist chemotherapy drugs more effectively and invade tissues faster when they land on organs.

Outsourcing the hard work of cell invasion

The researchers examined abdominal fluid from ovarian cancer patients using advanced microscopy to watch this process in real time. They confirmed their findings with mouse models and single-cell genetic analysis.

Lead author Dr. Kaname Uno, a former Ph.D. student and current Visiting Researcher at Nagoya University’s Graduate School of Medicine, explained that the cancer cells do not need to become more invasive themselves.

“They manipulate mesothelial cells to do the tissue invasion work. They undergo minimal genetic and molecular changes and just migrate through the openings that mesothelial cells create.”

Dr. Uno worked as a gynecologist for eight years before he pursued research. One of his patients changed his career path. She had clear screening results just three months before doctors found advanced ovarian cancer. Current medical tools failed to detect the cancer early enough to save her life. This motivated Dr. Uno to investigate why ovarian cancer spreads so rapidly.

This discovery opens new treatment possibilities. Current chemotherapy targets cancer cells but ignores the mesothelial accomplices. Future drugs could block the TGF-β1 signal or prevent the formation of these dangerous partnerships. The research also suggests that doctors could monitor these cell clusters in abdominal fluid to predict disease progression and treatment response.

Nasal spray for flu prevention shows promising trial results

Researchers have developed a nasal spray for flu prevention that has shown promising results in preliminary human trials. Seasonal influenza (the flu) is an acute respiratory infection that affects up to one billion people per year and causes hundreds of thousands of deaths. While flu shots can be effective, they are always a best guess because scientists are never fully certain about which strains will circulate. At best, they are only 50% effective, according to historical data.

Another way to tackle the flu is with experimental monoclonal antibodies, proteins that mimic the immune system’s ability to fight off viruses, but these too have had limited success.

Spraying inside the nostrils

Scientists from Leyden Labs in the Netherlands and their partners have been working on a different approach: a spray you apply directly to the nostrils to block the virus at the point of entry. Their research is published in the journal Science Translational Medicine.

The spray contains an antibody called CR9114, which was developed by the pharmaceutical company Johnson & Johnson. Unlike conventional vaccines that can only recognize specific flu strains, CR9114 can recognize and block almost all types of influenza A and B.

Following successful tests in mice and macaques, in which the spray protected the animals against infection, the researchers conducted preliminary tests in 143 people aged 18 to 55.

Healthy volunteers were given either a single dose or a twice-daily dose for two weeks to test its safety and how it moves through the body (pharmacokinetics). The research team measured how long the antibody remained in the nose by collecting samples at different times. They also placed nasal samples in lab dishes with live flu viruses to see if CR9114 could neutralize the infection.

Optimal doses and next steps

No serious side effects were reported in any of the participants. Twice-daily doses were optimal because the nose naturally clears itself continually, and the antibody has a half-life of three hours. Even after spending time in the nose, the antibody remained fully active and neutralized both influenza A and B.
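
As a rough illustration of why twice-daily dosing suits a three-hour half-life, the short calculation below assumes simple exponential decay; it is a back-of-envelope sketch, not the trial's pharmacokinetic model.

```python
# Back-of-envelope exponential decay with the reported ~3-hour half-life.
# Illustration only; not the trial's pharmacokinetic model.
HALF_LIFE_HOURS = 3.0

def fraction_remaining(hours, half_life=HALF_LIFE_HOURS):
    return 0.5 ** (hours / half_life)

for t in (3, 6, 12, 24):
    print(f"after {t:2d} h: {fraction_remaining(t):.1%} of the dose remains")
# After 12 h (a twice-daily dosing interval) only ~6% remains, and after 24 h ~0.4%,
# which gives one intuition for why more frequent dosing keeps nasal antibody levels up.
```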

One of the most impressive findings was that spraying the antibody directly into the nose resulted in antibody concentrations in the nasal lining up to 4,600 times higher than those achieved with traditional intravenous (IV) delivery, while using smaller doses.

The authors conclude, “Intranasal CR9114 is safe in humans and efficacious against influenza virus challenge in nonhuman primates. These studies pave the way toward intranasal antibody administration for broad, prophylactic protection against influenza virus infection and subsequent disease.”

While the research is promising, the scientists still need to confirm how well their spray works against natural infection and whether it stops the virus from spreading between people.

Statins do not cause the majority of side effects listed in package leaflets, large-scale analysis finds

Statins do not cause the majority of the conditions that have been listed in their package leaflets, including memory loss, depression, sleep disturbance, and erectile and sexual dysfunction, according to the most comprehensive review of possible side effects. The study was led by researchers at Oxford Population Health and appears in The Lancet.

Cardiovascular disease results in around 20 million deaths worldwide each year and causes around a quarter of all deaths in the UK. Statins are highly effective drugs that lower LDL (“bad”) cholesterol levels and have been repeatedly proven to reduce the risk of cardiovascular disease. However, there have been concerns about possible side effects.

The researchers gathered data from 23 large-scale randomized studies from the Cholesterol Treatment Trialists’ Collaboration: 123,940 participants in 19 large-scale clinical trials comparing the effects of statin therapies against a placebo (or dummy tablet), and 30,724 participants in four trials comparing more intensive versus less intensive statin therapy.

They found similar numbers of reports for those taking the statins and those taking the placebo for almost all the conditions listed in package leaflets as possible side effects. For example, the annual rate of reports of cognitive or memory impairment was 0.2% among those taking the statins, but also 0.2% among those taking the placebo. This means that while people may notice these problems while taking statins, there is no good evidence that they are caused by the statin.
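
To make this kind of comparison concrete, here is a minimal sketch of how annual event rates in two trial arms can be compared; the counts are invented for illustration and are not taken from the trial data.

```python
import math

def annual_rate_difference(events_a, person_years_a, events_b, person_years_b):
    """Compare annual event rates between two trial arms (e.g. statin vs placebo).

    Uses a simple normal approximation for the rate difference; illustrative only.
    """
    rate_a = events_a / person_years_a          # events per person-year
    rate_b = events_b / person_years_b
    diff = rate_a - rate_b
    se = math.sqrt(events_a / person_years_a**2 + events_b / person_years_b**2)
    return rate_a, rate_b, diff, (diff - 1.96 * se, diff + 1.96 * se)

# Hypothetical counts chosen to give ~0.2% per year in both arms.
ra, rb, diff, ci = annual_rate_difference(120, 60_000, 118, 59_500)
print(f"statin arm: {ra:.3%}/yr, placebo arm: {rb:.3%}/yr")
print(f"rate difference: {diff:.4%}/yr (95% CI {ci[0]:.4%} to {ci[1]:.4%})")
```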

Key findings:

  • There was no statistically significant excess risk from statin therapy for almost all the conditions listed in package leaflets as potential side effects.
  • Taking a statin did not cause any meaningful excess of memory loss or dementia, depression, sleep disturbance, erectile dysfunction, weight gain, nausea, fatigue or headache, among many other conditions.
  • There was a small increase in risk (about 0.1%) for liver blood test abnormalities. However, there was no increase in liver disease such as hepatitis or liver failure, indicating that the liver blood test changes do not typically lead to more serious liver problems.

Christina Reith, Associate Professor at Oxford Population Health and lead author of the study, said, “Statins are life-saving drugs that have been used by hundreds of millions of people over the past 30 years. However, concerns about the safety of statins have deterred many people who are at risk of severe disability or death from a heart attack or stroke. Our study provides reassurance that for most people, the risk of side effects is greatly outweighed by the benefits of statins.”

Previous work by the same researchers established that most muscle symptoms are not caused by statins; statin therapy caused muscle symptoms in only 1% of people during the first year of treatment with no excess thereafter. It has also shown that statins can cause a small increase in blood sugar levels, so people already at high risk may develop diabetes sooner.

Professor Bryan Williams, Chief Scientific and Medical Officer at the British Heart Foundation, noted, “These findings are hugely important and provide authoritative, evidence-based reassurance for patients. Statins are lifesaving drugs that have been proven to protect against heart attacks and strokes. Among the large number of patients assessed in this well-conducted analysis, only four side effects out of 66 were found to have any association with taking statins, and only in a very small proportion of patients.

“This evidence is a much-needed counter to the misinformation around statins and should help prevent unnecessary deaths from cardiovascular disease. Recognizing which side effects might genuinely be associated with statins is also important as it will help doctors make decisions about when to use alternative treatments.”

Professor Sir Rory Collins, Emeritus Professor of Medicine and Epidemiology at Oxford Population Health and senior author of the paper, added, “Statin product labels list certain adverse health outcomes as potential treatment-related effects based mainly on information from non-randomized studies that may be subject to bias. We brought together all of the information from large randomized trials to assess the evidence reliably.

“Now that we know that statins do not cause the majority of side effects listed in package leaflets, statin information requires rapid revision to help patients and doctors make better-informed health decisions.”

All of the trials included in the analyses were large-scale (involving at least 1,000 participants) and tracked patient outcomes for a median of nearly five years. The trials were double-blind, meaning that neither the trial participants nor those managing the participants or leading the study knew who was receiving which treatment, to avoid potential biases due to knowledge of treatment allocation. The list of possible side effects was compiled from those listed for the five most commonly prescribed statins.

Family dinners may reduce substance-use risk for many adolescents

A new study by researchers at Tufts University School of Medicine finds that regular family dinners may help prevent substance use for a majority of U.S. adolescents, but suggests that the strategy is not effective for youth who have experienced significant childhood adversity.

The findings provide important insights for practitioners looking to help families prevent substance use, as well as for researchers aiming to develop interventions that better account for adolescents’ unique experiences.

For the study, published in the Journal of Aggression, Maltreatment & Trauma, researchers analyzed online survey data from 2,090 U.S. adolescents ages 12 to 17 and their parents. Participants from around the country were asked about the quality of their family meals—including communication, enjoyment, digital distractions, and logistics—as well as adolescents’ alcohol, e-cigarette, and cannabis use in the previous six months.

The researchers then examined how these patterns differed based on adolescents’ experiences of household stressors and exposure to violence, as reported by both the children and parents. Instead of counting each adverse experience equally, the researchers created a weighted score based on how strongly the different experiences are linked to substance use in prior research and this national sample.
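
The study's exact weights are not reproduced here, but the idea of a weighted adversity score, as opposed to a simple count, can be sketched as follows; the experience labels and weights below are illustrative placeholders, not the study's values.

```python
# Illustrative weighted adversity score: each adverse childhood experience
# contributes according to how strongly it is linked to substance use,
# rather than every experience counting equally. Weights here are made up.
EXAMPLE_WEIGHTS = {
    "parental_divorce": 0.6,
    "family_substance_use_disorder": 1.4,
    "family_mental_health_disorder": 1.0,
    "witnessed_violence": 1.3,
    "weight_teasing": 0.8,
    "dating_violence": 1.6,
}

def weighted_adversity_score(reported_experiences):
    """Sum the weights of the experiences an adolescent reports."""
    return sum(EXAMPLE_WEIGHTS.get(exp, 0.0) for exp in reported_experiences)

print(weighted_adversity_score(["parental_divorce", "witnessed_violence"]))  # 1.9
```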

Higher family dinner quality was associated with a 22% to 34% lower prevalence of substance use among adolescents who had either no or low to moderate levels of adverse childhood experiences.

“These findings build on what we already knew about the value of family meals as a practical and widely accessible way to reduce the risk of adolescent substance use,” said Margie Skeer, the study’s lead author, professor and chair of the Department of Public Health and Community Medicine at the School of Medicine.

“Routinely connecting over meals—which can be as simple as a caregiver and child standing at a counter having a snack together—can help establish open and routine parent-child communication and parental monitoring to support more positive long-term outcomes for the majority of children,” added Skeer.

“It’s not about the food, timing, or setting; it’s the parent-child relationship and interactions it helps cultivate that matter.”

Adverse childhood experiences reported by participants in the study included parents being divorced; a family member being diagnosed with a substance-use disorder; someone in the family having a mental-health disorder; the adolescent witnessing violence; the adolescent often being teased about their weight; a parent using non-prescribed drugs daily; or the adolescent experiencing sexual or physical dating violence.

The study found that family meals offered little protection for adolescents whose adversity score reached the equivalent of four or more experiences—a population that encompasses nearly one in five U.S. high school students younger than 18, according to a study of the most recent Youth Risk Behavior Survey data.

“While our research suggests that adolescents who have experienced more severe stressors may not see the same benefits from family meals, they may benefit from more targeted and trauma-informed approaches, such as mental health support and alternative forms of family engagement,” said Skeer.

She added that future research should explore whether other supportive routines—beyond shared meals or outside the family environment—can help protect adolescents exposed to highly stressful or traumatic childhood experiences.

Addiction and appetite along the gut-brain axis: Vagus nerve may play a crucial role in the dopamine reward pathway

Dopamine—a neurotransmitter responsible for influencing motivation, pleasure, mood and learning in the brain—has experienced a bit of fame in recent years, acting as a sort of buzzword to describe a fleeting satisfaction from social media, food or shopping. Because of this, most people know that dopamine acts within the brain. In particular, it is associated with the mesolimbic pathway, which is a brain circuit connecting the ventral tegmental area (VTA) to the nucleus accumbens (NAc), amygdala, and hippocampus.

However, a recent study, published in Science Advances, indicates that the vagus nerve, which bridges the brain and gut, also plays a crucial role in regulating behaviors related to reward and motivation.

The gut-brain-vagal axis

The vagus nerve is the main pathway of the gut-brain axis, a complex communication network linking peripheral organs to the brain by transmitting interoceptive signals about mood, digestion, inflammation, and stress.

The authors of the new study explain, “Among metabolically active peripheral organs, the gut emerges as a central player in coordinating the body-brain tango through a multitude of long-range mechanisms, including hormonal signaling, microbiota-derived metabolites, and both local and gut-brain neuronal connections.”

While most prior studies have focused on brain-centric models of reward, some work has shown that gut-vagal signals have an effect on food-driven dopamine activity and eating behaviors. Yet it was still unclear whether this would extend to other forms of addiction fueled by dopamine.

Disrupted vagal signaling affects dopamine activity

To determine the extent to which the gut-brain-vagal axis is involved in dopamine reward activity, the research team conducted an array of experiments involving mice. Some experiments involved cutting the vagus nerve via subdiaphragmatic vagotomy (SDV) and comparing food and drug reward behaviors between SDV mice and unaltered (sham) mice. In vivo dopamine activity was also monitored through fiber photometry, molecular assays, and electrophysiology.

Results showed that the gut-brain vagal axis is essential for both food- and drug-induced reward behaviors in mice. In experiments featuring foods that mice normally find highly rewarding, the SDV mice ate more slowly and consumed less, while the unaltered mice exhibited a rapid increase in food consumption over a 10-day period.

The team noted increased excitement in sham mice, but not SDV mice. They write, “Using our model, in combination with telemetric locomotor activity monitoring, we observed that sham mice displayed an increased locomotor activity before (food-anticipatory activity) and during (consumption) food intake.

“In contrast, SDV mice exhibited dampened locomotor activity during both phases. This reduction was not due to preexisting locomotor deficits, as both sham and SDV mice had similar locomotor profiles during the dark period (foraging period) or under basal conditions.”

Similar results were found in some experiments involving drugs, including cocaine, morphine and amphetamines. The team observed reductions in the elicited locomotor response in SDV mice for both morphine and cocaine, indicating that the vagus nerve might modulate dopamine dynamics and/or its postsynaptic integration. Amphetamine produced no significant difference in locomotor response, and its effects in conditioning experiments depended on dose.

The study authors explain, “While sham mice were positively conditioned to cocaine, no significant preference was observed in SDV mice. The effect of amphetamine-induced CPP in SDV mice depended on the conditioning doses. At 2 mg/kg, we observed that both experimental groups were positively conditioned.

“However, when mice were conditioned with a lower dose of amphetamine (1 mg/kg), we observed a lower conditioning index in SDV mice compared to controls, suggesting that the physiological consequences of neuronal adaptations observed in SDV mice may be overridden at higher [dopamine] levels.”

Furthermore, in vivo experiments showed that vagal integrity is required for normal dopamine neuron firing, dopamine-dependent molecular changes and structural plasticity in reward circuits. Fiber photometry showed that when the vagus nerve was cut, dopamine responses in the nucleus accumbens were delayed or reduced during food anticipation, during eating, and after drug administration.

However, overall dopamine function was not abolished, as it still supported processes related to movement. Still, activity was reduced: dopamine neurons fired less and received weaker excitatory input.

Implications for addiction treatment in humans

The study helps to confirm that our gut, via the vagus nerve, plays a direct and essential role in how we experience reward and motivation. However, addiction treatments involving the reduction of vagus nerve signaling are still a way off.

Simply cutting the vagus nerve surgically, as in the mouse study, is most likely not an option for humans and may have additional side effects. In addition, the team notes that the gut may even induce compensatory changes over time to make up for lost signaling.

Clearly, more research is needed to refine these methods. The team suggests using more targeted genetic or viral tools to dissect specific vagal circuits in the future, or exploring different methods for modulating vagal signaling. With further refinement, however, these approaches could eventually inform treatments for eating disorders and addiction.

RNA molecule discovery could lead to potential new breast cancer therapy

QIMR Berghofer scientists have discovered a cancer-fighting RNA molecule that could hold the key to a new way of treating the most common form of breast cancer. The team are developing their findings into a potential RNA-based therapy for hormone receptor-positive (HR+) breast cancer, offering hope to women with advanced disease who are no longer responding to existing drugs.

RNA molecules are like the working copies of our DNA. Advances in technology are helping uncover their role in essential biological functions, and RNA-based therapies are emerging as a highly promising new approach for targeting cancer.

Seven-year search uncovers new molecule

QIMR Berghofer Professors Stacey Edwards and Juliet French study how our DNA and RNA work, with the goal of finding better ways to treat and ultimately cure breast cancer. Their seven-year study, published in the journal Molecular Cancer, details their discovery of the previously unknown RNA molecule that protects against HR+ breast cancer.

Professor French said the RNA molecule they discovered has a two-pronged mode of attack against the cancer cells that is both precise and potent.

“We think this is going to be a therapy that can treat this cancer and save women’s lives. When we introduce the RNA molecule into our preclinical models, it specifically kills only hormone receptor-positive breast cancer cells, and not healthy cells,” Professor French said.

“It does this in two different ways. It induces cancer cell death from within the tumor cell, and it also interacts with and binds to a receptor that activates the immune system to recognize and kill the cancer cells.”

A personal mission to beat breast cancer

Professor Stacey Edwards, who has dedicated her career to breast cancer research after losing her mother to the disease at a young age, said it was incredibly rewarding to discover a potentially transformative treatment.

“We knew we had something exciting, but we’d hit so many dead ends trying to understand how the RNA molecule was working. Then came the day in our lab when we saw the cancer cells completely destroyed, while the healthy cells were alive. It was a true eureka moment. We could hardly believe what we were seeing, so we just kept repeating the experiment to be sure the results were real,” Professor Edwards said.

“My beautiful mum developed breast cancer when she was just 34 years old and I was only five, so I grew up seeing her go through horrible treatments. From a very young age, I knew that I wanted to do breast cancer research to help my mum and others like her.

“Unfortunately, she passed away just as I finished my university degree, but she knew I was on my way. To now be developing something that we believe is going to make a difference is a very special moment.”

Breast cancer is the most commonly diagnosed cancer in women in Australia and globally. Around 70% of all cases are HR+ breast cancers.

While existing treatments, such as hormone-blocking therapies, have greatly improved survival rates, up to a third of patients do not respond or develop drug resistance over time, allowing their cancer to return and spread. HR+ breast cancer is often described as a “cold cancer,” meaning it can hide from the immune system, so immunotherapies also typically fail to work.

How the new RNA therapy could be used

The potential new RNA therapy could be used on its own, but because it also activates the immune system, it could make existing immunotherapies more effective. The team plans to test different combinations of the RNA therapy with current immunotherapies.

They are also developing lipid nanoparticles to help deliver the RNA therapy directly into the cancer cells.

One of the advantages of RNA-based therapies is that they can be developed more quickly than conventional drugs, as seen with the rapid development of RNA-based vaccines during the COVID-19 pandemic.

The newly discovered RNA molecule is a type known as a long noncoding RNA (lncRNA). Around 98% of the human genome is “noncoding,” meaning it does not carry instructions for making proteins; much of it is instead transcribed into RNA molecules that carry out other important functions, such as regulating gene expression. Once dismissed as “junk DNA,” these noncoding sequences are now recognized as functionally important, and this research breakthrough is another step towards realizing the potential of this type of RNA for targeting cancers.

GLP-1 drugs tied to lower-calorie, lower-sugar food purchases

Researchers at Steno Diabetes Center Copenhagen reported that starting a GLP-1 receptor agonist (GLP-1RA) coincided with slightly healthier supermarket purchases. Grocery purchases from GLP-1RA users in Denmark contained modestly fewer calories and sugars and slightly more protein per 100 g of food after treatment began than before.

GLP-1RAs are commonly associated with appetite suppression and lower calorie intake, but their effect on day-to-day food choices has been less clear. Case reports and small observational studies have suggested a shift away from ultra-processed, energy-rich snacks toward more whole, minimally processed foods after GLP-1RA initiation, but this had not previously been examined in real-world purchase records at scale.

In the study, “Consumer Food Purchases After Glucagon-Like Peptide-1 Receptor Agonist Initiation,” published in JAMA Network Open, researchers analyzed supermarket receipt data to assess whether nutritional quality and processing levels changed after initiation of GLP-1RA therapy.

How purchases were measured

Receipt records came from the Danish SMIL cohort, which includes thousands of participants who shared supermarket receipts through a smartphone app. New GLP-1RA patients from 2019 to 2022 were matched 1:3 with nonusers of the same age, sex, and income.

All participants had at least one year of purchases before and after the first prescription date or matched date. Total enrollment included 1,177 adults with a median age of 53 years and 52.5% women, including 293 who started GLP-1RA therapy and 884 matched controls.

Food purchases were categorized by nutrient content and by the NOVA processing system, spanning from unprocessed to ultra-processed items. Comparisons focused on average energy density in kcal per 100 g and nutrients per 100 g of food, including sugar, carbohydrates, saturated fat, and protein, before versus after GLP-1RA initiation.
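
As a hedged sketch of this before-versus-after comparison (the column names and numbers are invented, not the SMIL cohort's actual data schema), purchase-weighted nutrient content per 100 g could be computed like this:

```python
import pandas as pd

# Invented example receipt data; column names are hypothetical, not the SMIL schema.
receipts = pd.DataFrame({
    "period":  ["before", "before", "after", "after"],
    "grams":   [500, 250, 400, 300],
    "kcal":    [1100, 600, 820, 540],   # total kcal in that purchase
    "sugar_g": [80, 40, 55, 35],        # total grams of sugar in that purchase
})

def per_100g(group):
    """Purchase-weighted nutrient content per 100 g of food bought."""
    total_grams = group["grams"].sum()
    return pd.Series({
        "kcal_per_100g": 100 * group["kcal"].sum() / total_grams,
        "sugar_g_per_100g": 100 * group["sugar_g"].sum() / total_grams,
    })

print(receipts.groupby("period")[["grams", "kcal", "sugar_g"]].apply(per_100g))
```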

Small shifts

After GLP-1RA initiation, grocery purchases shifted modestly toward lower energy density and lower sugar and carbohydrate content. Mean energy density was reduced from 209.4 to 207.3 kcal per 100 g, sugar content slid from 15.7 to 15.1 g per 100 g, and total carbohydrates dropped from 19.8 to 19.3 g per 100 g. Saturated fat content decreased slightly from 7.3 to 7.2 g per 100 g, while protein increased from 6.6 to 6.9 g per 100 g of food purchased.

Processing categories moved in the same direction, with unprocessed foods rising by about 0.9 percentage points from a baseline of 46.9% and ultra-processed foods falling by about 1.2 points from 39.2%. Little change, or opposite change, appeared in the matched control group.

Authors note that changes for each individual were modest, and observed changes may partly reflect the start of a weight-loss journey.

It takes two: Genes ATP13A2 and GBA1 interact to drive neurodegeneration

Parkinson’s disease (PD) is the second most common neurodegenerative disease after Alzheimer’s disease, affecting more than 10 million people worldwide. People with this condition may experience tremors, limb stiffness, gait and balance problems, and slowness in everyday movements such as buttoning a shirt or walking. These symptoms happen because certain brain cells die over time. Although scientists know some of the factors that raise a person’s risk, the question remains of why some people with genetic risk factors develop the disease while others never do.

A team at Baylor College of Medicine, the Duncan Neurological Research Institute (Duncan NRI) at Texas Children’s Hospital and collaborating institutions found, working with the laboratory fruit fly, that it takes two mutant genes to drive neurodegeneration. People and fruit flies have two copies of each gene in most cells. Flies lacking one copy of the Gba1b gene, a common and potent PD genetic risk factor, do not develop neurological problems.

However, neurodegeneration occurs when flies lack both a copy of Gba1b and one copy of anne (the fruit fly equivalent of the human gene ATP13A2). Importantly, the researchers identified multiple individuals with PD who carry both ATP13A2 and GBA1 variants. The study appeared in Molecular Neurodegeneration.

“We knew from human studies that people carrying one copy of mutant GBA1 and one copy of normal GBA1 gene have a 5-fold increase in the risk of developing PD, but do not always develop the condition,” said corresponding author Dr. Hugo Bellen, Distinguished Service Professor of molecular and human genetics at Baylor and chair in neurogenetics in the Duncan NRI. “A second factor must be in place for the condition to arise.”

The researchers looked for the second factor among genes linked to lysosomes, structures inside cells responsible for breaking down and recycling cellular material, because many PD risk genes like GBA1 also are linked to lysosomes. The team worked with fruit flies to examine how the Gba1b mutant gene interacts with dozens of other genes involved in lysosome function. Their goal was to find out whether Gba1b drives neurodegeneration when paired with other lysosomal genes.

“We found that carrying one mutant copy of Gba1b and one mutant copy of anne drives slow, progressive neurodegeneration in flies. The flies developed movement problems, lost neurons and showed disruption in the communication between neurons and glia,” said first author Dr. Mingxue Gu, a postdoctoral associate in the Bellen lab.

A closer look at the process leading to neurodegeneration revealed that Gba1b and anne work on different cell types. Gba1b works mainly in glial cells, which support and protect neurons, and anne works mostly in neurons, which send electric signals that sustain neural networks. But how can problems in two different cell types lead to neurodegeneration?

“One surprising result was that the earliest signs of damage didn’t appear in neurons but in glial cells,” Gu said. “These cells began swelling, detaching from nearby neurons, and showing signs of distress. We tied this to accumulation of a fat molecule called glucosylceramide (GlcCer) in glial lysosomes.”

In flies also carrying a mutant anne, lysosomes in the neurons failed to maintain proper acidity. This caused neurons to produce excess GlcCer, which was then transferred to glial cells in amounts far beyond what glia can handle. It’s like a recycling center (the glia) that is already short-staffed suddenly being overwhelmed with extra garbage arriving from the neighborhood (the neurons).

As a result, waste materials piled up in glial cells, leading to swelling and structural damage. Without healthy glia to support and protect them, neurons eventually failed, especially those involved in vision and movement. The flies developed movement problems, lost neurons, and showed signs similar to early Parkinson’s disease.

“One important finding was that we found ways to help reduce the damage,” Bellen said.

Treatment with ML-SA1, a drug that improves lysosome function, restored healthier lysosomal activity, and myriocin, which reduces GlcCer production, lowered the toxic buildup. “These treatments don’t point to an immediate cure for Parkinson’s, but they reveal promising biological pathways that could be explored in future therapies,” Bellen said.

Shingles vaccination associated with delayed dementia onset in older adults

Every three seconds, someone, somewhere in the world, develops dementia. The number of people living with the condition is projected to rise dramatically, nearly doubling from 78 million in 2020 to 139 million by 2050, making dementia an urgent public health concern of our time. Now, a Canadian study published in The Lancet Neurology found that vaccination with the herpes zoster vaccine known by the brand name Zostavax reduced or delayed dementia diagnoses by 2 percentage points over 5.5 years. The benefits were more pronounced in women than in men.

The researchers initially extracted records for nearly 230,000 elderly patients born between 1930 and 1960 who were living in Ontario and registered with primary care providers.

They carried out a natural experiment, in which researchers study the effects of naturally occurring events without manipulating variables, taking advantage of a policy change in Canada in 2016 that created a clear comparison group. Ontario residents who turned 71 on or after January 1, 2017, became eligible for a free herpes zoster (shingles) vaccine, while those who reached the same age just before that date did not, allowing researchers to compare outcomes between the two groups.
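
A loose sketch of this kind of eligibility-cutoff comparison is shown below; the data are invented and the comparison is simplified relative to the study's regression discontinuity analysis, with the cutoff date inferred from the eligibility rule described above.

```python
import pandas as pd

# Invented toy records; the real study models outcomes around the eligibility
# cutoff far more carefully (a regression discontinuity design).
CUTOFF = pd.Timestamp("1946-01-01")  # implied by the "turned 71 on or after Jan 1, 2017" rule

df = pd.DataFrame({
    "birth_date": pd.to_datetime(
        ["1945-11-20", "1945-12-05", "1946-01-10", "1946-02-03"] * 250),
    "dementia_within_5_5y": [1, 0, 0, 0] * 250,
})
df["eligible"] = df["birth_date"] >= CUTOFF

# Compare people born in a narrow window on either side of the cutoff,
# who should otherwise be very similar.
window = df[(df["birth_date"] >= "1945-10-01") & (df["birth_date"] < "1946-04-01")]
rates = window.groupby("eligible")["dementia_within_5_5y"].mean()
print(rates)
print("difference (eligible minus ineligible):", rates.loc[True] - rates.loc[False])
```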

They observed that, following the initiation of the vaccination program, there was a decline in dementia diagnoses among the Canadian population eligible for the free shingles vaccination compared with those who were not.

A clear reduction in new dementia cases was observed only among individuals born in 1945 and 1946—the birth cohorts eligible for Ontario’s publicly-funded live-attenuated shingles (varicella-zoster virus) vaccine. This pattern was not observed at other birthdate thresholds, nor in other Canadian provinces that did not offer the vaccine free of charge.

Targeting microbes to reduce dementia risk

There is a long-standing scientific theory that microbes play a major role in the development of dementia. Neurotropic herpesviruses have attracted research attention because they target the nervous system and are prone to reactivation with age.

Studies have found that these herpesviruses can trigger the brain to produce junk proteins called β-amyloid and tau, hallmarks of Alzheimer’s disease.

The herpes zoster (shingles) vaccine is currently the only vaccine in clinical use that targets a neurotropic herpesvirus. While earlier studies have suggested that shingles vaccination may be associated with a lower risk of dementia, these findings have remained inconclusive.

Most studies on this topic were observational and relied on comparisons between vaccinated and unvaccinated individuals, making their findings vulnerable to bias. Some experts also suggest that because individuals who seek vaccination often have healthier lifestyles, lifestyle rather than the vaccine itself could account for part of the apparent reduction in dementia risk.

This new study, however, leveraged a natural experiment based on arbitrary birthdate eligibility rules, comparing individuals born just weeks apart who were otherwise similar, thereby reducing potential bias arising from vaccine access or choice. Furthermore, the team directly collected data from primary care providers rather than relying on insurance claims, thereby providing a holistic view of patient health over a longer period.

The results showed that seniors eligible for a free shingles vaccine were less likely to be diagnosed with dementia over the next 5.5 years. Their risk was 2 percentage points lower, and they spent more years dementia-free on average. One of the most striking revelations was that the protective effect was statistically significant in women (p-value: 0.029) but not in men (p-value: 0.52).

The researchers note that this natural experiment provides stronger evidence of a causal relationship than standard observational analyses. Further studies exploring this relation might help inform public health vaccination policies for aging populations.