Infants born with hearing loss show disruptions in brain organization, underscoring the urgency of intervention

Infants born deaf or hard of hearing show adverse changes in how their brains organize and specialize, but exposure to sound and language may help them develop more normally, according to new research.

The study, led by two neuroscientists, found that infants with sensorineural hearing loss (SNHL) lacked the usual pattern of organization on the brain’s left side, which supports language and higher cognitive skills.

The findings also suggest that early auditory stimulation through hearing aids or cochlear implants, along with exposure to language, whether spoken or signed, could help preserve normal brain development.

“The first year of life is a critical window for brain organization,” the study said. “If infants miss auditory input or early language exposure during this period, the brain’s left and right hemispheres may not develop their usual balance.”

The study was spearheaded by Professor Heather Bortfeld of the University of California, Merced, and Professor Haijing Niu of Beijing Normal University. It is published in Science Advances.

How hearing loss alters infant brain organization

The study examined 112 infants aged 3 to 9 months, including 52 with congenital hearing loss and 60 with typical hearing. Using a noninvasive imaging method called functional near-infrared spectroscopy (fNIRS), the researchers tracked how efficiently different regions of the brain communicated.

They found both groups had strong “small-world” network organization, a sign of efficient brain function. But unlike typically hearing infants, those with SNHL did not develop stronger left-hemisphere specialization, which is normally linked to early language and cognitive growth.
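The “small-world” property the researchers measured has a standard quantitative form: a network counts as small-world when it is much more clustered than a random network of the same size while keeping comparably short paths between nodes. The toy sketch below (plain Python on a ring-lattice graph; the graph sizes and parameters are illustrative, not the study’s fNIRS pipeline) computes the commonly used small-world coefficient sigma = (C/C_rand)/(L/L_rand).

```python
import random
from collections import deque

def clustering(adj):
    """Average local clustering coefficient of an undirected graph."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def avg_path(adj):
    """Average shortest-path length via BFS over all reachable pairs."""
    n, total = len(adj), 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    q.append(u)
        total += sum(dist.values())
    return total / (n * (n - 1))

def ring_lattice(n, k):
    """Each node linked to its k nearest neighbours (k even)."""
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for d in range(1, k // 2 + 1):
            adj[v].add((v + d) % n)
            adj[(v + d) % n].add(v)
    return adj

def random_graph(n, m, seed=0):
    """Erdős–Rényi-style graph with exactly m edges."""
    rng = random.Random(seed)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
    adj = {v: set() for v in range(n)}
    for u, v in rng.sample(edges, m):
        adj[u].add(v)
        adj[v].add(u)
    return adj

lattice = ring_lattice(24, 4)
m = sum(len(s) for s in lattice.values()) // 2
rand = random_graph(24, m, seed=1)
# Small-world coefficient: clustering high relative to random,
# path length comparable to random.
sigma = (clustering(lattice) / clustering(rand)) / (avg_path(lattice) / avg_path(rand))
print(round(clustering(lattice), 3), round(sigma, 3))
```

In the study itself, nodes were fNIRS-measured brain regions and edges reflected how efficiently regions communicated; the principle of comparing clustering and path length against a matched random network is the same.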

The difference was most pronounced in infants with moderate to profound hearing loss, while those with mild loss retained some normal patterns of left-hemisphere activity.

Brain asymmetry—the tendency of certain functions to concentrate in one hemisphere—supports the development of language, reasoning and memory. In an infant who hears normally, the left hemisphere becomes dominant for processing speech and symbolic communication within the first months of life.

The importance of early language exposure

This specialization can falter when auditory or language input is missing. Previous research shows that deaf infants who have deaf parents and grow up with sign language still develop normal left-hemisphere organization, demonstrating that language access, not sound alone, can drive healthy neural growth.

“Early exposure—whether through cochlear implants, hearing aids or sign language—is essential,” the study said. “The brain needs structured input to build the networks that will later support communication and learning.”

The authors emphasized that intervention should begin as early as possible, ideally within the first few months of life—the brain’s period of maximum plasticity. Providing a rich linguistic environment can help reinforce neural pathways that otherwise might weaken or reorganize abnormally, they said.

Looking ahead: Future research and implications

While the study offers strong evidence of how hearing loss affects the infant brain, it only observed infants at one point in time. The researchers plan to follow children over longer periods to see whether early hearing and language interventions can normalize brain asymmetry and support later language and cognitive outcomes.

They also called for studies that combine fNIRS with magnetic resonance imaging and electroencephalography to map how sound, language and cognition interact in early development.

“This work reframes hearing loss as a brain-development issue, not just an ear issue,” the study said. “We now know that timely access to sound and language is key to keeping the brain’s communication networks on track.”

Newly discovered RNA molecule could limit protein aggregation and prevent neuronal damage

Neurodegenerative diseases, such as Alzheimer’s disease and other dementias, are medical conditions that entail the progressive loss of neurons and a decline in brain function. Past studies have found a link between these diseases and the buildup of misfolded proteins, such as tau and α-synuclein.

Tau is a protein found primarily in neurons that typically helps to stabilize structures that transport nutrients and molecules within neurons, known as microtubules. α-synuclein, on the other hand, is a small protein located at the tips of neurons (i.e., pre-synapses), which typically helps to regulate the function of synaptic vesicles, small sacs that release neurotransmitters across synapses.

While these proteins have an important function in the healthy brain, their abnormal aggregation has been found to be a hallmark of several neurodegenerative diseases. The molecular processes that prompt their accumulation, however, have not yet been fully elucidated.

Researchers at Washington University in St. Louis and the University of California recently investigated the role of a newly uncovered RNA molecule, called FAM151B-DT, in the aggregation of tau and α-synuclein proteins. Their findings, published in Molecular Psychiatry, suggest that this RNA is a key regulator of protein homeostasis, or in other words, that it helps to keep a balance in the production and degradation of proteins in the brain.

“Neurodegenerative diseases share common features of protein aggregation along with other pleiotropic traits, including shifts in transcriptional patterns, neuroinflammation, disruption in synaptic signaling, mitochondrial dysfunction, oxidative stress, and impaired clearance mechanisms like autophagy,” wrote Arun Renganathan, Miguel A. Minaya and their colleagues in their paper.

“However, key regulators of these pleiotropic traits have yet to be identified. We used transcriptomics, mass spectrometry, and biochemical assays to define the role of a novel lncRNA on tau pathophysiology.”

A previously unknown RNA implicated in protein aggregation

As part of their study, Renganathan and his colleagues examined stem cells and tissue samples using a wide range of genetic and experimental tools. Specifically, they compared the levels of the lncRNA in brain tissues derived from individuals who had been diagnosed with a neurodegenerative disease to those of people who had not.

“We discovered a long non-coding RNA (lncRNA), FAM151B-DT, that is reduced in a stem cell model of frontotemporal lobar dementia with tau inclusions (FTLD-tau) and in brains from FTLD-tau, progressive supranuclear palsy, Alzheimer’s disease, and Parkinson’s disease patients,” wrote Renganathan and his colleagues. “We show that silencing FAM151B-DT in vitro is sufficient to enhance tau and α-synuclein aggregation.”

The researchers grew stem cells in the lab and then silenced the lncRNA they had identified in these cells. Notably, they found that this heightened the aggregation of proteins associated with various neurodegenerative diseases.

“To begin to understand the mechanism by which FAM151B-DT mediates tau aggregation and contributes to several neurodegenerative diseases, we deeply characterized this novel lncRNA and found that FAM151B-DT resides in the cytoplasm where it interacts with tau, α-synuclein, HSC70, and other proteins involved in protein homeostasis,” wrote the authors. “When silenced, FAM151B-DT blocks autophagy, leading to the accumulation of tau and α-synuclein.”

Informing the treatment of neurodegenerative diseases

Overall, the results of this recent study suggest that the RNA molecule FAM151B-DT is of key importance for the balancing of tau and α-synuclein proteins in cells. Silencing this molecule seems to prompt the undesirable aggregation of proteins linked to neuronal damage and the emergence of neurodegenerative diseases.

The insights gathered by Renganathan and his colleagues could improve the present understanding of various neurodegenerative diseases. In the future, the molecule they identified could prove to be a promising target for treating these diseases early or for addressing some of their symptoms.

“Importantly, we discovered that increasing FAM151B-DT expression is sufficient to promote autophagic clearance of phosphorylated tau and α-synuclein, and reduce tau and α-synuclein aggregation,” wrote the authors. “Overall, these findings pave the way for further exploration of FAM151B-DT as a promising molecular target for several neurodegenerative diseases.”

Blood test offers hope for more effective ovarian cancer treatment

New clinical research has identified a blood test that can reveal which women are more likely to respond to a particular treatment for ovarian cancer, known as PARP inhibitor therapy.

The work appears in the journal Nature Communications.

More than 300,000 women are diagnosed with ovarian cancer globally each year, including 1,700 in Australia.

The four-year clinical trial across 15 Australian hospitals—known as SOLACE2—was co-led by the University of Sydney NHMRC Clinical Trials Centre, RMIT University and WEHI, and coordinated by the Australia New Zealand Gynaecological Oncology Group (ANZGOG).

The Phase II trial tested strategies for priming the immune system to enhance the effectiveness of PARP inhibitor therapy, which stops cancer cells from repairing their own damaged DNA by blocking the PARP enzyme.

It was during this trial that a new companion blood test for women with ovarian cancer was also evaluated, with promising results.

Precision targeting cancer treatment for better outcomes

PARP inhibitor therapy is currently offered to women whose cancer has a defect in DNA repair, known as homologous recombination deficiency. These cancers are called HRD positive.

However, clinicians have long recognized that some women with an HRD negative cancer can still benefit from PARP inhibitors, while others with an HRD positive ovarian cancer may not respond, suggesting other factors may influence treatment response.

RMIT lead researcher and co-senior author, Distinguished Professor Magdalena Plebanski, said that until now there had been no easy way to target PARP inhibitor therapy more effectively beyond the currently approved HRD test.

“In SOLACE2, we demonstrated that a new immune test could better indicate which women will respond to PARP inhibitors,” said Plebanski, who heads RMIT’s Accelerator for Translational Research and Clinical Trials (ATRACT) Centre.

“We expect this promising new test will enable more effective screening and identification of eligible patients for PARP inhibitors, allowing us to provide this leading treatment to the women most likely to benefit.”

The new blood test measures increased expression of immune biomarkers that reflect the movement of beneficial, cancer-destroying immune cells toward the cancer cells hiding in the body, together with a measure of key inflammatory processes that aid cancer growth and treatment resistance. Together, these provide a simple biomarker signature in the blood.

The team’s results reveal that the RMIT-patented biomarkers—easily identified through a simple blood test—may be a better guide to who will benefit from PARP inhibitor therapy than the current gold-standard HRD test, a finding that now requires urgent validation.

The current HRD test requires sufficient cancer tissue and the ability to perform complex analysis of DNA repair, which is not always available or feasible. In addition, the test may not provide an accurate reflection of the current DNA repair capability of the cancer, as this can change over time.

“Our test focused on a real-time blood immune response rather than on the DNA repair capability of the cancer, which may no longer be accurate. In doing so, we more accurately identified which SOLACE2 patients would most benefit from PARP inhibitor therapy,” Plebanski said.

WEHI lead and joint-senior author Professor Clare Scott said an important finding was how immune cells in the cancer affected the response to PARP inhibitor therapy, particularly in combination therapy.

Scott, who leads WEHI’s Ovarian and Rare Cancer Laboratory and is the Chair of ANZGOG, said a clear indication of who would respond to treatment came from predicting whether effector T cells could increase their migration into the tumor, where they can start killing the cancer cells.

“Now that we understand this is a vital factor for cancer control, we could also potentially improve treatments by focusing on promoting this beneficial migration of immune cells in the future,” said Scott, who is also a medical oncologist at the Peter MacCallum Cancer Centre, Royal Women’s Hospital and Royal Melbourne Hospital.

The new test is not currently available for patients, as it still needs to undergo further testing and confirmation before obtaining necessary approvals for routine use.

Trial findings: Delayed cancer recurrence

Professor Chee Khoon Lee, clinical lead at the University of Sydney’s NHMRC Clinical Trials Centre (CTC) and co-study chair, said the SOLACE2 clinical trial showed that three months of immune priming helped delay ovarian cancer recurrence when followed by treatment with the PARP inhibitor and immunotherapy.

“Despite the treatment benefit we saw with this approach, the small trial did not provide the definitive clinical validation we were seeking, so more work will be needed to validate that,” Lee said.

“However, our study did successfully reveal this new test that has the potential to transform outcomes for many women diagnosed with ovarian cancer, helping clinicians to better personalize treatments, ensuring each woman receives the most effective therapy for her.”

Research reveals shared genetic roots for psychiatric and neurological disorders

Researchers from the Center for Precision Psychiatry at the University of Oslo and Oslo University Hospital have discovered extensive genetic links between neurological disorders like migraine, stroke and epilepsy, and psychiatric illnesses such as schizophrenia and depression. Published in Nature Neuroscience, this research challenges longstanding boundaries between neurology and psychiatry and points to the need for more integrated approaches to brain disorders.

“We found that psychiatric and neurological disorders share genetic risk factors to a greater extent than previously recognized. This suggests that they may partly arise from the same underlying biology, contrasting the traditional view that they are separate disease entities. Importantly, the genetic risk was closely linked to brain biology,” states Olav Bjerkehagen Smeland, psychiatrist and first author.

Nearly 1 million cases analyzed

The team analyzed genetic data from close to 1 million individuals with a wide range of psychiatric or neurological conditions. This large dataset made it possible to map both shared and disorder-specific genetic signals. “The findings are consistent with what we see clinically: patients often present with overlapping symptoms across neurology and psychiatry,” says Professor Ole Andreassen, leader of the Center for Precision Psychiatry. “Our results support a more unified view of neurological and psychiatric disorders.”

According to Smeland, the study suggests that patients could benefit from treatment strategies that take both biological and mental aspects into account. “We should ask whether patients receive the best care when neurology and psychiatry operate in parallel rather than together,” he says.

Genes and their varied influence on brain biology

While the study found substantial genetic overlap, the disorders still displayed partly distinct biological signatures. “For instance, genetic susceptibility to stroke was associated with risk factors for thrombosis, while epilepsy was connected to neurons—the brain’s nerve cells. The genetic risk for Alzheimer’s disease and multiple sclerosis, by contrast, was tied to the immune system, which also influences the nervous system. The genetic risk for psychiatric illnesses was consistently linked to neurons. This tells us that neurological and psychiatric disorders are heterogeneous, but may still be connected within a common biological framework,” Smeland explains.

While distinctions between neurological and psychiatric disorders do exist, this study paves the way for a more holistic understanding of brain disorders. “I believe that improved knowledge exchange and closer collaboration between psychiatry and neurology could substantially benefit patients,” Smeland states.

Tylenol during pregnancy: No strong evidence ties use to autism or ADHD risk

Existing evidence does not clearly link paracetamol (acetaminophen) use during pregnancy with autism or ADHD in children, finds an in-depth evidence review published by The BMJ today, in direct response to recent announcements around the safety of using paracetamol in pregnancy.

The researchers say confidence in the findings of existing evidence reviews and studies on this topic is low to critically low, and suggest that any apparent effect seen in previous studies may be driven by shared genetic and environmental factors within families.

Regulatory bodies, clinicians, pregnant women, parents, and those affected by autism and ADHD should be informed about the poor quality of the existing reviews, and women should be advised to take paracetamol when needed to treat pain and fever in pregnancy, they add.

Paracetamol (acetaminophen) is the recommended treatment for pain and fever in pregnancy and is considered safe by regulatory agencies worldwide.

Existing systematic reviews on this topic vary in quality, and studies that do not adjust for important factors shared by families or parents’ health and lifestyle cannot accurately estimate the effects of exposure to paracetamol before birth on neurodevelopment in babies.

To address this uncertainty, researchers carried out an umbrella review (a high-level evidence summary) of systematic reviews to assess the overall quality and validity of existing evidence and the strength of association between paracetamol use during pregnancy and the risks of autism or ADHD in offspring.

They identified nine systematic reviews that included a total of 40 observational studies reporting on paracetamol use during pregnancy and the risk of autism, ADHD, or other neurodevelopmental outcomes in exposed babies.

Four reviews included meta-analysis (a statistical method that combines data from several studies to give a single, more precise estimate of an effect).

The researchers used recognized tools to carefully assess each review for bias and rated their overall confidence in the findings as high, moderate, low, or critically low. They also recorded the degree of study overlap across reviews as very high.

All of the reviews reported an association, ranging from possible to strong, between a mother’s paracetamol intake and autism, ADHD, or both in offspring. However, seven of the nine reviews advised caution when interpreting the findings owing to the potential risk of bias and impact of unmeasured (confounding) factors in the included studies.

Overall confidence in the findings of the reviews was low (two reviews) to critically low (seven reviews).

Only one review included two studies that appropriately adjusted for possible effects of genetic and environmental factors shared by siblings, and accounted for other important factors such as parents’ mental health, background, and lifestyle.

In both these studies, the observed association between exposure to paracetamol and risk of autism and ADHD in childhood disappeared or reduced after adjustment, suggesting that these factors explain much of the observed risk, say the researchers.

They acknowledge some limitations. For example, the included reviews differed in scope and methods, they were unable to explore the effects of timing and dose, and their analyses were limited to autism and ADHD outcomes only.

However, they say this overview brings together all relevant evidence and applies established methods to assess quality, and shows “the lack of robust evidence linking paracetamol use in pregnancy and autism and ADHD in offspring.”

They conclude, “The current evidence base is insufficient to definitively link in utero exposure to paracetamol with autism and ADHD in childhood. High quality studies that control for familial and unmeasured confounders can help improve evidence on the timing and duration of paracetamol exposure, and for other child neurodevelopmental outcomes.”

Should kids be screened for high cholesterol genes? Study weighs costs and benefits

In the United States, 1 in every 250 people has inherited a genetic variant that leads to dangerously high cholesterol levels from birth.

If high cholesterol isn’t lowered early, people with this genetic condition, called familial hypercholesterolemia (FH), have a high risk of having a heart attack or stroke as early as their 30s or 40s. But only about 1 in 10 of the estimated 1.5 million Americans living with FH are aware of their condition.

A new modeling study conducted by researchers at Columbia and Harvard universities finds that while screening children or young adults for high cholesterol and FH genes would prevent a substantial number of premature heart attacks and strokes, such testing is currently too expensive to implement.

Instead, their study suggests that if a universal screening program led to more intensive monitoring and lifestyle changes in all children and young adults with high cholesterol—including those without FH genes—cholesterol screening would become cost-effective. The paper, “Familial Hypercholesterolemia Screening in Early Childhood and Early Adulthood: A Cost-Effectiveness Study,” was published Nov. 9 in JAMA.

“Early recognition and management of high cholesterol, even in childhood, can prevent or delay heart attacks, strokes, and maybe even dementia later in life,” says Andrew Moran, associate professor of medicine at Columbia University Vagelos College of Physicians and Surgeons and one of the study’s senior authors.

“Screening for FH is important for kids and young adults—and their family members—so we need to find a cost-effective way to screen early for FH. For young people with severely high cholesterol but without a known genetic cause, early cholesterol testing and management can also be the path to prevention.”

One in five adolescents has some abnormality on their regular lipid screen. The American Academy of Pediatrics and the American Heart Association recommend that all children have their cholesterol measured between the ages of 9 and 11 to identify cholesterol disorders, but fewer than 20% of children receive such testing.

Study details

The researchers’ model tested multiple scenarios of a two-stage screening strategy that first measured children’s cholesterol levels (LDL-C) and then conducted genetic testing to identify FH genes in those with high cholesterol numbers. This study looked at screening children at age 10 or age 18 and how the screening and subsequent treatments could prevent heart disease decades later.
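The shape of such a two-stage model is easy to sketch. The simulation below is a toy illustration only: every number in it (costs, prevalence, test characteristics) is a made-up placeholder rather than an input from the study, and the logic simply shows how a cheap universal lipid test can gate a more expensive genetic test.

```python
import random

random.seed(42)

# All numbers below are illustrative placeholders, not the study's inputs.
FH_PREVALENCE = 1 / 250        # ~1 in 250 carries an FH variant
LIPID_TEST_COST = 20           # per-person cholesterol (LDL-C) test
GENE_TEST_COST = 250           # per-person confirmatory genetic test
LIPID_SENSITIVITY = 0.90       # chance an FH carrier screens high on LDL-C
HIGH_LDL_RATE_NON_FH = 0.05    # non-carriers who still screen high

def two_stage_screen(n):
    """Stage 1: cheap lipid test for everyone.
    Stage 2: genetic test only for those with high LDL-C."""
    cost = n * LIPID_TEST_COST
    gene_tests = fh_found = 0
    for _ in range(n):
        has_fh = random.random() < FH_PREVALENCE
        high_ldl = random.random() < (LIPID_SENSITIVITY if has_fh
                                      else HIGH_LDL_RATE_NON_FH)
        if high_ldl:
            gene_tests += 1
            cost += GENE_TEST_COST
            if has_fh:
                fh_found += 1
    return cost, gene_tests, fh_found

cost, gene_tests, found = two_stage_screen(100_000)
print(f"genetic tests: {gene_tests}, FH found: {found}, "
      f"cost per case found: ${cost / max(found, 1):,.0f}")
```

Even in this toy version, the study’s core tension is visible: most of the genetic-test spending goes to people with high LDL-C who turn out not to carry an FH variant, which is why the cost per confirmed case is so sensitive to prevalence and test costs.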

“Though FH is among the most common and severe genetic disorders, it’s still relatively rare,” says Moran. “Because of the high upfront costs of screening millions to find a relatively small number of people with FH genes, our modeling found that none of the combined cholesterol plus genetic screening strategies were cost-effective compared to usual care.”

The model found that if cholesterol screening led to more intensive cholesterol management among all those with high cholesterol (LDL ≥130 mg/dL) regardless of genetic test result, screening in young adulthood (around age 18) would be the most cost-effective strategy.

Could newborn genetic screening be more effective?

Going forward, FH screening may be cost-effective if it is bundled with other, established childhood screening packages, including newborn screening.

A recent study in JAMA Cardiology that trialed paired cholesterol and genetic screening for FH from the blood spots collected for newborn screening shows that newborn FH screening may be feasible at scale. The Columbia and Harvard teams are working with those investigators to explore best approaches to newborn or infant FH screening.

An added benefit of childhood genetic testing for FH would be the opportunity to cascade screening and treatment to other family members who may also have unrecognized FH, a factor that the current model doesn’t consider.

“We haven’t landed on the best way to screen early for FH yet, but with our modeling, we’re leveraging the best evidence and efficient computer modeling methods to arrive at the most promising approaches to test in real clinical trials of screening,” Moran says.

Husbands’ self-esteem linked to lower risk of preterm birth in partners

A husband’s optimism and confidence may play a crucial, if often unseen, role in helping babies arrive healthy and on time.

A new study from University of California Merced psychology researchers found that when married fathers reported higher levels of resilience—a quality that includes traits such as optimism, self-esteem, and perceived social support—their partners showed lower levels of inflammation during pregnancy and carried their babies longer.

“This is one of the first studies to show that a father’s inner strengths, such as his optimism and ability to cope with challenges, can ripple through the family in measurable, biological ways,” said Professor Jennifer Hahn-Holbrook, a co-author.

The findings were published in the journal Biopsychosocial Science and Medicine.

The research team, led by Ph.D. student Kavya Swaminathan, analyzed data from 217 mother-father pairs who participated in the Community Child Health Network study across five sites in the U.S.

Mothers provided blood samples during pregnancy that were analyzed for C-reactive protein, a marker of inflammation associated with an increased risk of preterm birth. Both parents also completed surveys assessing resilience-related traits such as optimism, self-esteem and social support.

Preterm birth, defined as delivery before 37 weeks, is a leading cause of infant mortality and lifelong health complications, including heart disease and developmental disorders. High maternal inflammation is a well-established risk factor. The UC Merced study indicates one reason why some mothers may be biologically protected: their partners’ emotional resources.

In married couples in this study, higher paternal resilience was associated with lower maternal inflammation, which in turn predicted a longer gestation; every additional day in the womb benefits fetal health and development. Among unmarried or cohabiting couples, that connection was not seen.

“This study is exciting because it highlights how the people surrounding a pregnant woman can shape her biology in ways that affect both her health and her baby’s,” Swaminathan said.

The study does not prove cause and effect, but it offers evidence that emotional and social strength in the father is linked to physical outcomes for mothers and babies.

“Fathers who feel confident and supported might engage in more positive daily behaviors, such as cooking healthy meals, offering encouragement and reducing stress at home,” said Hahn-Holbrook, a Health Sciences Research Institute faculty member. “Emotional connections may also play a role, since couples tend to coregulate their moods and even their immune systems.”

The study draws on the biopsychosocial model, which examines how emotional and social factors interact with biological factors to shape health. Previous research has shown that chronic stress can increase inflammation during pregnancy. The UC Merced study flips the lens to examine how positive psychological resources can protect against it.

Others involved in the study included UCLA Professor Christine Dunkel Schetter, one of several primary investigators, along with UC Merced psychology Professor Haiyan Liu and Stony Brook University Professor Christine Guardino.

‘Mind-captioning’ technique can read human thoughts from brain scans

Reading brain activity with advanced technologies is not a new concept. However, most techniques have focused on identifying single words associated with an object or action a person is seeing or thinking of, or matching up brain signals that correspond to spoken words. Some methods used caption databases or deep neural networks, but these approaches were limited by database word coverage or introduced information not present in the brain. Generating detailed, structured descriptions of complex visual perceptions or thoughts remains difficult.

A study, recently published in Science Advances, takes a new approach. Researchers involved in the study have developed what they refer to as a “mind-captioning” technique that uses an iterative optimization process, where a masked language model (MLM) generates text descriptions by aligning text features with brain-decoded features.

The technique also incorporates linear models trained to decode semantic features from a deep language model using brain activity from functional magnetic resonance imaging (fMRI). The result is a detailed text description of what a participant is seeing, decoded from their brain activity.
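The iterative optimization at the heart of the technique can be caricatured as follows: start from a degenerate description, propose local word edits, and keep an edit only if it moves the text’s features closer to the features decoded from brain activity. In the sketch below (plain Python), bag-of-words counts stand in for deep language-model embeddings and a fixed target vector stands in for the brain decode; the vocabulary, greedy search, and scoring are illustrative assumptions, not the authors’ implementation.

```python
from collections import Counter
from math import sqrt

VOCAB = ["a", "dog", "cat", "runs", "sleeps", "on", "the", "beach", "sofa"]

def features(words):
    """Toy text features: a bag-of-words count vector over VOCAB."""
    c = Counter(words)
    return [c[w] for w in VOCAB]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def mind_caption(brain_features, length=5, iters=20):
    """Greedy iterative optimization: mutate one word position at a time,
    keeping a change only if it aligns the text's features better with
    the (here: given) brain-decoded features."""
    text = [VOCAB[0]] * length          # degenerate starting description
    score = cosine(features(text), brain_features)
    for _ in range(iters):
        improved = False
        for pos in range(length):
            for cand in VOCAB:
                trial = text[:pos] + [cand] + text[pos + 1:]
                s = cosine(features(trial), brain_features)
                if s > score:
                    text, score, improved = trial, s, True
        if not improved:
            break                        # converged
    return " ".join(text), score

# Pretend the decoder recovered features for "the dog runs on the beach".
target = features("the dog runs on the beach".split())
caption, score = mind_caption(target, length=6)
print(caption, round(score, 3))
```

The real system replaces the bag-of-words stand-in with features from a deep language model (and a masked language model to propose candidate text), which is what lets it recover word order and relational structure rather than just a set of words.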

Generating video captions from human perception

For the first part of the experiment, six people watched 2,196 short videos while their brain activity was scanned with fMRI. The videos featured various random objects, scenes, actions, and events, and the six subjects were native Japanese speakers and non-native English speakers.

The same videos had previously been captioned by other viewers through crowdsourcing. These captions were processed by a pretrained language model, DeBERTa-large, which extracted semantic features; the features were matched to brain activity, and text was then generated through an iterative process by the MLM, RoBERTa-large.

“Initially, the descriptions were fragmented and lacked clear meaning. However, through iterative optimization, these descriptions naturally evolved to have a coherent structure and effectively capture the key aspects of the viewed videos. Notably, the resultant descriptions accurately reflected the content, including the dynamic changes in the viewed events. Furthermore, even when specific objects were not correctly identified, the descriptions still successfully conveyed the presence of interactions among multiple objects,” the study authors explain.

The team then compared the generated descriptions to both correct and incorrect captions across various numbers of candidates to determine accuracy, which they say was around 50%. They note that this level of accuracy surpasses other current approaches and holds promise for future improvement.

Reading memories

The same six participants were later asked to recall the videos while undergoing fMRI, to test the method’s ability to read memory rather than immediate visual experience. The results for this part of the experiment were also promising.

“The analysis successfully generated descriptions that accurately reflected the content of the recalled videos, although accuracy varied among individuals. These descriptions were more similar to the captions of the recalled videos than to irrelevant ones, with proficient subjects achieving nearly 40% accuracy in identifying recalled videos from 100 candidates,” the study authors write.

For people who have a diminished or lost capacity to speak, such as those who have had a stroke, this new technology could eventually serve as a way to restore communication. Because the system has proven capable of picking up on deeper meanings and relationships, rather than simple word associations, it could allow these individuals to regain much more of their communication ability than some other brain-computer interface methods do. Still, further optimization is necessary before getting to that point.

Ethical considerations and future directions

Alongside the more positive applications of mind-captioning devices capable of reading human thought, there are legitimate concerns regarding privacy and potential misuse of brain-to-text technology.

The researchers involved in the study note that consent will remain a major ethical consideration when employing mind-reading techniques. Before these technologies see wider use, important questions about mental privacy and the future of brain-computer interfaces will need to be addressed.

Still, the study offers a new tool for scientific research into how the brain represents complex experiences and a potential boon for nonverbal individuals.

The study authors write, “Together, our approach balances interpretability, generalizability, and performance—establishing a transparent framework for decoding nonverbal thought into language and paving the way for systematic investigation of how structured semantics are encoded across the human brain.”

Chronic kidney disease is now the ninth leading cause of death, global analysis finds

Record numbers of men and women globally are now estimated to have reduced kidney function, a new study shows. Figures rose from 378 million people with the disease in 1990 to 788 million in 2023 as the world population grew and aged, making the disease a top 10 cause of death worldwide for the first time.

Led by researchers at NYU Langone Health, the University of Glasgow, and the Institute for Health Metrics and Evaluation (IHME) at the University of Washington, the analysis explored the rise of the illness, in which the kidneys gradually lose their ability to filter waste and excess fluid from the blood. Mild cases may have no symptoms while the most severe stages can require dialysis, kidney replacement therapy, or an organ transplant.

The findings revealed that about 14% of adults in the world have chronic kidney disease. Results further showed that about 1.5 million people died from the condition in 2023, an increase of more than 6% since 1993 after accounting for shifts in countries’ age demographics over time.

“Our work shows that chronic kidney disease is common, deadly, and getting worse as a major public health issue,” said study co-senior author Josef Coresh, MD, Ph.D., director of NYU Langone’s Optimal Aging Institute. “These findings support efforts to recognize the condition alongside cancer, heart disease, and mental health concerns as a major priority for policymakers around the world.”

This May, the World Health Organization formally added chronic kidney disease to its agenda to reduce early deaths from noncontagious illnesses by one-third before 2030. To combat the epidemic, experts first need an up-to-date understanding of its population trends, says Coresh, who is also the Terry and Mel Karmazin Professor of Population Health at the NYU Grossman School of Medicine.

The new report, published online Nov. 7 in the journal The Lancet, is the most comprehensive estimate of the condition in nearly a decade, according to the authors. It is simultaneously being presented at the American Society of Nephrology’s annual Kidney Week conference.

The investigation was conducted as part of the Global Burden of Disease (GBD) 2023 study, the world’s most comprehensive effort to track health loss across countries and over time. Its findings are widely used to guide policymaking and inform global health research.

For the study, the team analyzed 2,230 published research papers and national health datasets in 133 countries. Besides looking for patterns in diagnoses and mortality, the team examined the toll of disability brought about by chronic kidney disease.

Another major finding was that impaired kidney function, on top of killing people directly, was a key risk factor for heart disease, contributing to about 12% of global cardiovascular mortality. The results showed further that in 2023, the condition was the 12th leading cause of diminished quality of life from disability. The biggest risk factors for kidney disease were found to be high blood sugar, high blood pressure, and high body mass index (a measure of obesity).

Most people with chronic kidney disease in the study were in the early stages of the condition. This is important, says Coresh, because swift treatment with drugs and lifestyle changes can prevent the need for more dramatic and expensive interventions such as dialysis and kidney transplantation.

He adds that in sub-Saharan Africa, Southeast Asia, Latin America, and other low-income regions, relatively few people receive dialysis or kidney transplants—likely because these treatments are less available and harder to afford in those areas.

“Chronic kidney disease is underdiagnosed and undertreated,” said study co-lead author Morgan Grams, MD, Ph.D. “Our report underscores the need for more urine testing to catch it early and the need to ensure that patients can afford and access therapy once they are diagnosed.”

Grams, the Susan and Morris Mark Professor of Medicine at the NYU Grossman School of Medicine, notes that new medications have become available in the past five years that can slow kidney disease progression and reduce the risk of heart attack, stroke, and heart failure. However, it will take time to see improvements on a global scale.

Grams also cautions that since chronic kidney disease is undertested, it may be even more common than the current results suggest.

Disagreement between two kidney function tests predicts serious health problems

A mismatch between two common tests for kidney function may indicate a higher risk for kidney failure, heart disease, and death, a new study shows.

Health care providers for decades have measured blood levels of the molecule creatinine to track the rate at which kidneys filter waste from muscle breakdown in the bloodstream. According to more recent guidelines, levels of cystatin C, a small protein made by all cells in the body, can also be used to measure kidney function. Since these two tests are influenced by different factors—including some related to disease or aging—using both markers together can provide a better measure of kidney function and risk of organ failure than either one alone.

Led by NYU Langone Health researchers, the new work reveals that many people, especially those who are sick, often have a large gap between the two readings, which may be a signal of future disease. Specifically, the global study shows that more than a third of hospitalized participants had a cystatin C-based readout of kidney function that was at least 30% lower than one based on their creatinine levels.
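The discrepancy check described above can be expressed in a few lines. This is a minimal sketch, not the study's analysis code: the function names are hypothetical, and the inputs are assumed to be precomputed eGFR estimates (in mL/min/1.73 m²) derived separately from cystatin C and from creatinine.

```python
def egfr_discrepancy(egfr_cys, egfr_cr):
    # Fractional difference of the cystatin C-based estimate relative to the
    # creatinine-based one; negative means the cystatin C value is lower.
    return (egfr_cys - egfr_cr) / egfr_cr

def large_discrepancy(egfr_cys, egfr_cr, threshold=0.30):
    # Flag cases where the cystatin C-based eGFR is at least 30% lower than
    # the creatinine-based eGFR, the gap the study links to higher risk.
    return egfr_discrepancy(egfr_cys, egfr_cr) <= -threshold
```

For example, a cystatin C-based eGFR of 40 against a creatinine-based eGFR of 60 is a 33% shortfall and would be flagged, while 55 against 60 would not.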

“Our findings highlight the importance of measuring both creatinine and cystatin C to gain a true understanding of how well the kidneys are working, particularly among older and sicker adults,” said study co-corresponding author Morgan Grams, MD, Ph.D. “Evaluating both biomarkers may identify far more people with poor kidney function, and earlier in the disease process, by covering the blind spots that go with either test.”

The study is publishing online Nov. 7 in JAMA and is simultaneously being presented at the American Society of Nephrology’s annual Kidney Week conference.

Beyond detecting signs of disease, assessing patients’ kidney function is important for calculating the appropriate dosage for cancer medicines, antibiotics, and many other drugs, says Grams, the Susan and Morris Mark Professor of Medicine at the NYU Grossman School of Medicine.

During another investigation, the results of which were published the same day, the same research team found that a record number of people worldwide have chronic kidney disease, which is now the ninth leading cause of death globally. Having new ways to spot the condition early can help ensure that patients receive swift treatment and avoid more dramatic interventions such as dialysis and organ transplantation, says Grams, who is also a professor in the Department of Population Health at NYU Grossman School of Medicine.

For the recent investigation, the research team analyzed health care records, blood tests, and demographic data collected from 860,966 men and women of a half-dozen nationalities. All participants had their creatinine and cystatin C levels measured on the same day and were followed for an average of 11 years afterward. The team accounted for factors unrelated to kidney function that influence the biomarkers’ readings, such as smoking, obesity, and history of cancer.

Performed as part of the international Chronic Kidney Disease Prognosis Consortium, the study is the largest to date to explore differences between the two tests and whether they may signal potential health problems, the authors say. Established to better understand and treat the condition, the consortium provides evidence for global definitions of chronic kidney disease and related health risks.

According to the new findings, those whose cystatin C-based measures of kidney filtration were at least 30% lower than their creatinine-based measures were at higher risk for death, heart disease, and heart failure than those who had a smaller difference between the two metrics. The former group was also more likely to be diagnosed with severe chronic kidney disease that required dialysis or an organ transplant. The same discrepancy was present in 11% of outpatients and seemingly healthy volunteers.

Grams notes that while cystatin C testing was first recommended in 2012 by the international organization Kidney Disease: Improving Global Outcomes (KDIGO), a 2019 survey revealed that less than 10% of clinical laboratories in the United States performed it in-house. The two largest laboratories, Quest Diagnostics and Labcorp, now offer the test.

“These results underscore the need for physicians to take advantage of the fact that more hospitals and health care providers are starting to offer cystatin C testing,” said study co-corresponding author Josef Coresh, MD, Ph.D., director of NYU Langone’s Optimal Aging Institute. “Physicians might otherwise miss out on valuable information about their patients’ well-being and future medical concerns.”

Coresh, who is also the Terry and Mel Karmazin Professor of Population Health at NYU Grossman School of Medicine, cautions that among the hospitalized Americans in the study, less than 1% were tested for cystatin C.