Digital aging twin measures how organs age at different speeds across adulthood

Aging is a complex process, and precisely measuring how the human body declines has long been a challenge. Two people of the same chronological age can have very different health trajectories. Scientists have also struggled to move beyond identifying aging markers to pinpointing what actually drives aging itself.

Now researchers from China’s Aging Biomarker Consortium (ABC) have built a computational framework—the Digital Aging Twin—to study aging at the individual level in order to predict biological age and track the different aging rates of individual organs.

The study, which was conducted by researchers from the Institute of Zoology, the China National Center for Bioinformation (Chinese Academy of Sciences), Xuanwu Hospital of Capital Medical University, and seven other institutions, marks a major breakthrough by moving from simply describing aging to systematically quantifying it, potentially paving the way for future interventions.

The findings were published in Cell on May 8.

Building a massive aging dataset

The team recruited 2,019 healthy individuals aged 18 to 91 from Chinese cities including Beijing, Quzhou, Ningbo, and Nanchang to create a standardized multicenter cohort called mCAS (multicentric Chinese Aging Standardized). The researchers collected data on 240 parameters for each participant, using clinical tests, cognitive and motor function assessments, brain and retinal imaging, gait analysis, and several layers of molecular data, including DNA methylation, RNA transcripts, proteins, metabolites, and gut microbiomes.

This dataset, comprising more than a billion high-quality data points, serves as the foundation for a three-tiered system of “clocks” to measure aging.

How the three-tier clock works

The first tier is the core capacity clock, which integrates 240 physiological indicators to reflect overall functional decline. The second and most powerful tier is the multimodal clock. It integrates multiple layers of molecular data (“omics” layers) via a deep learning process that employs “attention mechanisms”—which prioritize the most informative data—to quantify the contribution of different data types. This multimodal clock predicts chronological age with a mean absolute error of only 3.87 years, outperforming all single-omics clocks. The third tier comprises organ-specific clocks for the brain, liver, lungs, muscles, blood vessels, and skin, each based on clinical markers, plasma proteins, and imaging features.
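The reported accuracy figure is worth unpacking: a mean absolute error of 3.87 years means that, averaged across the cohort, the clock's age prediction missed a person's true chronological age by just under four years. A minimal sketch of that metric, using made-up ages rather than study data:

```python
import numpy as np

def mean_absolute_error(chronological, predicted):
    """Average absolute gap, in years, between predicted and true age."""
    chronological = np.asarray(chronological, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs(predicted - chronological)))

# Toy cohort (hypothetical values, not study data): a clock that is
# off by a few years per person yields an MAE in the same range the
# paper reports for its multimodal clock (~3.87 years).
true_ages = [25, 40, 55, 70]
clock_out = [28, 37, 59, 66]
print(mean_absolute_error(true_ages, clock_out))  # → 3.5
```

A person whose predicted age exceeds their chronological age by more than this error band would be flagged as aging faster than expected.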

One of the most striking findings is that organs age asynchronously. For example, the liver reaches a critical aging inflection point around age 40, whereas the brain’s aging accelerates at around age 50. The analysis also uncovered two major nonlinear waves of aging-related change: one occurring between ages 40 and 50, and another between 60 and 70.

Pinpointing coagulation factors as drivers

Seeking the causes of these aging-related changes, the researchers analyzed plasma proteomics, examined stained liver tissue from human donors, and conducted experiments using human cell cultures and animal models. They identified age-driven accumulation of liver-derived coagulation factors—particularly F13B, as well as F9 and F10—as a direct driver of vascular and systemic aging.

For example, when human aortic endothelial cells were exposed to these factors, they showed clear signs of senescence: elevated aging markers, impaired tube formation (a measure of blood vessel health), and increased inflammation. Similarly, injecting F13B into mice accelerated aging across multiple tissues, including the liver, heart, aorta, and kidney, accompanied by immune cell infiltration and inflammatory signals. These results show that coagulation factors are not just passive biomarkers but actionable drivers of aging.

Simplifying clocks for clinical use

To make the aging clock approach clinically practical, the researchers developed simplified “proxy clocks” using just 100 to 108 plasma proteins. These protein-based proxies closely match the predictions of the much more complex core capacity clock and organ clocks, suggesting that a relatively simple blood test might one day provide a comprehensive aging assessment.

The study also identified lifestyle factors influencing biological aging. Greater fruit intake, consistent sleep routines, and moderate walking were linked to slower aging. In contrast, smoking, insufficient sleep, and high meal frequency were associated with accelerated aging.

Implications for China’s X-Age Project

This research marks the first proof-of-concept achievement of the X-Age Project (known in Chinese as "耄耋," màodié, a classical term for advanced old age), a major national initiative led by the Aging Biomarker Consortium to build a comprehensive system of aging clocks for the Chinese population.


Although the current aging-clock framework is built on cross-sectional data, it is being continuously refined with longitudinal follow-up data and larger, more diverse populations. Future work will also develop lower-cost and more sensitive detection approaches.

Despite the limitations of using cross-sectional data, the study has far-reaching implications for aging research as a whole. The Digital Aging Twin framework represents a fundamental shift in aging science—from description to prediction and from identifying correlations to pinpointing drivers.

Researchers now have a standardized, quantifiable, and interpretable system that can tell how fast a person is aging, which organs are aging most rapidly, and where interventions might be most effective.

RNA therapy slows harmful heart remodeling after heart attack in clinical trial

Following an acute heart attack, pathological remodeling processes occur in the heart. One consequence is so-called left ventricular systolic dysfunction, in which the pumping function of the left ventricle is impaired. To compensate for this, the heart muscle enlarges excessively, thereby becoming further weakened. The key regulator of this harmful growth of heart muscle cells is microRNA-132 (miR-132).

A team led by Prof. Dr. Thomas Thum, Director of the Institute for Molecular and Translational Therapeutic Strategies at Hannover Medical School (MHH), has produced a synthetic antagonist called CDR132L, which can block the main switch for cardiac hypertrophy and reverse chronic heart failure. The researchers have already demonstrated this in animal models and early clinical trials.

The drug candidate CDR132L has now been investigated in an international Phase II clinical trial involving patients who have recently suffered a heart attack and have heart failure. The HF-REVERT study has shown that patients with already advanced cardiac remodeling at the start of the study could benefit particularly from treatment with CDR132L. The results of the study were published in the journal Nature Medicine.

MicroRNA regulates pathological heart muscle growth

MicroRNA molecules belong to a class of molecules known as non-coding RNA (ncRNA), meaning they are not translated into specific proteins. Instead, they regulate a wide range of cellular processes—such as how a cell grows, whether it divides, or what type of cell it develops into. On the other hand, excessive microRNA activity can alter gene regulation and thereby trigger diseases.

One example of this is miR-132. More than ten years ago, Professor Thum’s research team discovered that a massive accumulation of this microRNA is directly linked to the pathological proliferation of heart muscle cells. The antisense oligonucleotide blocker CDR132L is the first ncRNA-based therapy to be used in Phase II trials for heart disease.

CDR132L is particularly beneficial for seriously ill patients

The HF-REVERT trial was conducted at around 80 study centers across seven European countries and the United Kingdom. A total of 294 patients were randomly assigned to three groups within three to 14 days of suffering a heart attack. In addition to standard treatment for heart failure, they received either CDR132L in two different doses or a placebo, administered in three intravenous doses at four-week intervals. The analysis was based on 280 patients who had received at least one dose.

“Our study has demonstrated that CDR132L is safe and well-tolerated and does not cause any harmful side effects on the liver, kidneys, the hematopoietic system or the heart,” says Professor Thum. CDR132L was particularly effective in patients with already advanced cardiac remodeling, meaning their hearts were already severely damaged.

“These findings support the further clinical development of the drug, particularly in the field of chronic heart failure,” emphasizes the cardiologist. The results represent an important step towards RNA-based therapies in cardiology. They offer great potential for positively influencing the progression of the disease in patients with heart failure.

CDR132L was developed by Cardior Pharmaceuticals GmbH, which was founded in 2016 by Prof. Dr. Dr. Thomas Thum and is based on his research at the Institute for Molecular and Translational Therapeutic Strategies. This MHH spin-off highlights the importance of university-based innovation for the development of new therapeutic approaches. Two further clinical trials are currently underway in patients with chronic heart failure, led by the Danish pharmaceutical group Novo Nordisk, which acquired the biopharmaceutical company Cardior in 2024.

Like mother, like fetus: Study finds contagious yawning begins in the womb

Yawning is incredibly contagious, and more often than not, seeing someone yawn right in front of us makes us instinctively do the same. It is often tied to social and emotional connection and brain mirroring, where we automatically align and simulate the emotions and actions of the people around us. A recent study published in Current Biology has found that this behavior begins even before birth.

Researchers recorded the facial expressions of pregnant women while an ultrasound machine captured real-time images of their fetuses’ faces. By comparing the two recordings, the researchers observed that fetuses were more likely to yawn after their mothers yawned, with a delay of about 90 seconds.

Tracking mother–fetus yawn sync

Yawning in humans begins far earlier than most people realize. Fetuses start yawning in the womb at around 11 weeks of development. Since there is no air for the fetus to draw in during a yawn, they slowly open their mouths, perform movements that resemble breathing in and out, and then gently close their mouths again. For a long time, scientists assumed that fetal yawning was driven purely by internal biological processes, but there wasn't enough evidence to prove or disprove this.

In this study, the researchers wanted to see if fetuses in the womb would catch a yawn from their mothers. For this, they recruited 38 pregnant women who were between 28 and 32 weeks along, all with healthy, uncomplicated pregnancies.

The experiments involved the mothers watching three different types of video in a quiet room: a yawning video, a mouth-movement video, and a still-face video. While a video camera monitored the mother’s face, the researchers used a 2D ultrasound machine to provide a real-time view of the fetus’s nose and lips.

Three experts, who didn't know what the mother was watching, reviewed the collected footage and verified the yawns. The researchers used an AI tool called DeepLabCut to precisely track subtle lip and nose movements, then trained a neural network to test whether a fetus's movement pattern mirrored its mother's yawn.

The researchers found that fetal yawning increased significantly only when the mother yawned, not when she simply opened and closed her mouth or kept her face still. They called this phenomenon prenatal behavioral contagion. The fetal yawns were not random either; they typically appeared about 90 seconds after the mother yawned, which is similar to the response time seen in contagious yawning among adults.
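The delay measurement described above can be illustrated with a simple event-matching sketch: for each maternal yawn, find the first fetal yawn within a fixed window afterward and take the median of those delays. The function name, window length, and event times below are illustrative assumptions, not the study's actual analysis pipeline:

```python
import numpy as np

def median_response_lag(mother_yawns, fetal_yawns, window=180.0):
    """For each maternal yawn time (seconds), find the first fetal yawn
    occurring within `window` seconds afterward; return the median delay
    across all matched pairs, or None if nothing matched."""
    lags = []
    for m in mother_yawns:
        after = [f - m for f in fetal_yawns if m < f <= m + window]
        if after:
            lags.append(min(after))
    return float(np.median(lags)) if lags else None

# Hypothetical event times in seconds (not study data):
mother = [100.0, 400.0, 800.0]
fetus = [188.0, 493.0, 905.0]
print(median_response_lag(mother, fetus))  # → 93.0
```

A median lag near 90 seconds, as in this toy example, would match the delay the study reports between maternal and fetal yawns.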

These findings suggest that fetal yawning may be part of an early mother-baby connection, where a mother’s behavior can influence how the fetus responds. Further research into how deeply this behavioral connection works, and whether it has long-term developmental effects, could reshape prenatal care.

Treatment-resistant depression may yield to combinations of medications already in clinical use

Many people with major depressive disorder get no relief from current treatments. Newer combinations of existing medications might help, researchers report in JAMA Psychiatry.

Persistent low mood, a loss of interest in previously enjoyable activities, lack of energy, feelings of worthlessness, poor concentration and appetite, and suicidal thoughts are symptoms of major depression. A significant percentage of adults suffering from it don’t get much relief from conventional antidepressant therapies. Doctors then have to look for alternative therapies to help these patients.

“At least one-third of adults with depression do not respond to at least two trials of conventional antidepressant therapies. These patients are considered to have treatment-resistant depression, and alternative therapies should be considered for such patients,” says UConn School of Medicine psychiatric epidemiologist T. Greg Rhee.

Rhee and colleagues at Harvard, Yale, and the University of Toronto, among others, have two recent studies in JAMA Psychiatry evaluating existing drugs used in new ways to treat major depression. One looks at the efficacy of intravenous ketamine, and the other at combinations of antidepressants with anti-psychotics.

Ketamine was originally developed as a fast-acting surgical anesthetic. There is also evidence it can rapidly alleviate depression in some individuals. The US Food and Drug Administration has approved esketamine, a version of the ketamine molecule, in nasal spray form as a treatment for depression. But intravenous ketamine is still being evaluated.

The researchers analyzed 26 existing randomized controlled trial studies that compared intravenous ketamine with controls. They found that ketamine was more effective than a placebo over the short term of a few days, but the effects were less pronounced after a few weeks. And ketamine seemed to work about as well as esketamine. Both drugs were very effective in rapidly reducing suicidal impulses in people who were in immediate danger of harming themselves.

Their second JAMA Psychiatry study compared how well combinations of antidepressants with antipsychotics worked in people with treatment-resistant depression. The researchers performed a meta-analysis of 22 studies, looking both at reduction of depressive symptoms and at side effects of the drugs. They found some antipsychotics were significantly more likely to help decrease symptoms of depression. But the antipsychotic that was most likely to help, lumateperone, was also most likely to be discontinued by the patient because of side effects.

“These studies could potentially guide practicing psychiatrists and other clinicians to consider these new approaches or modalities for patients with moderate to severe depression who did not previously respond to conventional antidepressant therapies,” says Rhee.

Rhee also adds, “We plan to conduct population-level epidemiologic studies to further examine the effectiveness and safety profiles of these treatment options.”

Gold-coated microneedles can detect subtleties in how liver and kidneys process drugs in real time

Scientists have taken a giant leap forward with the development of tiny microneedles designed to detect subtle but critical changes in how the liver and kidneys process therapeutic drugs. The experimental technology, under development at the University of California, Los Angeles, aims to overcome longstanding limitations that have hindered wearable microneedle biosensors.

“Wearable microneedle biosensors promise real-time molecular monitoring for precision medicine but are limited by low sensitivity and tissue abrasion,” writes Dr. Jialun Zhu, lead author of a new study published in Science Translational Medicine.

“Overcoming these challenges, we recast electrode functionality not merely as a sensing substrate but as a mechanism for resilient, high signal-to-noise ratio measurements in tissue,” added Zhu, a bioengineer in UCLA’s Samueli School of Engineering.

In plainer English, the multidisciplinary UCLA team was able to get its system to perform far more reliably than similar devices built by other research groups.

Tracking drug clearance in real time

The team has developed a biosensor that in early research already shows promise for real-time in-tissue monitoring of drug pharmacokinetics. These preclinical studies also show that the device is both safe and highly accurate. However, it has not yet been tested in humans.

Still, if the technology seems futuristic, it may be because tiny wearables and implants that measure any number of biological processes have been themes in science fiction for decades. With refinements underway to improve precision medicine, the future has already arrived. Scientists involved in the project ranged from molecular and cellular biologists to biochemists and a team of bioengineers like Zhu.

One goal in precision medicine has been the development of a minimally invasive device that can monitor the clearance of drugs from a patient’s kidneys and liver, providing more accurate dosing guidelines. To address persistent problems that have impeded progress toward reaching that goal, the team engineered what it calls a “resilient nanostructured bioelectrode” using a microscopically thin layer of a precious metal.

“Our microneedle-based resilient nanostructured bioelectrode is fabricated using a bilayer process that strengthens the electrode with a micrometer-thick gold adhesion layer,” Zhu noted in the research paper.

The key reason that an accurate wearable biosensor is needed is explained by the growing number of medications with narrow therapeutic ranges. That means it is possible to provide doses that are either too low or too high. With a wearable such as the one under development, doctors can tell if they have prescribed a precise dose, and how well the drug is being processed and excreted.

Toward precision monitoring in organ dysfunction

In preclinical experiments, the biosensor enabled continuous in-tissue monitoring of drug pharmacokinetics, including changes associated with liver and kidney dysfunction. Scientists found in the animal model research that their experimental technology measured drug kinetics for six days and produced accurate parameters for drug dosing while also monitoring drug clearance from the liver and kidneys.

The system revealed, for example, that one chemotherapy drug, irinotecan, cleared out slowly in mice with liver damage. The technology also traced the kinetics of several antibiotics during various stages of chronic kidney disease.

[Figure: Comparison between conventional blood-based therapeutic drug monitoring and wearable ISF-based therapeutic drug and metabolic function monitoring. Credit: Science Translational Medicine (2026). DOI: 10.1126/scitranslmed.adr5493]

An approach by other research groups has involved the use of wearable biosensors that incorporate microneedles, which measure minute molecular changes in drug concentrations. However, current microneedles suffer from issues such as low sensitivity and poor mechanical durability.

In contrast, Zhu and colleagues developed a more resilient, nanostructured microneedle that analyzes the biochemistry of interstitial fluids between cells. Their design incorporates sensors that endow it with a high degree of specificity and features a strong layer of gold that increases the needle surface area and resists corrosion.

Technology holds promise

In an editorial commentary, Molly Ogle, an associate editor at Science Translational Medicine, notes that wearable technology could play an important role in precision medicine. “The study demonstrates preclinical promise for minimally invasive therapeutic drug monitoring and functional assessment of hepatic and renal drug processing,” Ogle wrote.

Zhu and colleagues underscored, meanwhile, that their device not only has marked improvements over similar technology but also could be economically manufactured. They predict that their resilient nanostructured bioelectrode could be mass produced at less than $1.50 per sensor.

“These results establish the resilient nanostructured bioelectrode as a viable microneedle platform for high-fidelity in vivo deployment of electrochemical biosensors, enabling minimally invasive, longitudinal monitoring of low-concentration analytes and real-time assessment of organ function,” Zhu and the UCLA team concluded.

Smartwatches and GPS devices show promise for tracking environmental impacts on health in real time

As climate change drives more frequent extreme heat and worsening air pollution, researchers are seeking better ways to understand how these exposures affect health in real time. A new pilot study led by researchers at The City University of New York demonstrates the feasibility of combining wearable devices, smartphone location data, and real-time surveys to capture individuals’ environmental exposures and their immediate physical and emotional effects.

The study, “Feasibility of Integrating Wearable Devices and Ecological Momentary Assessment for Real-Time Environmental Exposure Estimation,” appears in the journal JMIR Formative Research.

The study was co-authored by Sameera Ramjan and Melissa Blum (co-first authors), Rung Yu Tseng, Katherine Davey, and Duke Shereen, with Yoko Nomura as senior author.

“People move through many different environments each day, and this approach lets us capture that in real time,” said Ramjan, a doctoral student in the CUNY Graduate Center Psychology program.

“We were struck by how quickly the data revealed patterns—changes in heart rate variability, shifts in mood—that lined up with where participants had been and what they were exposed to.”

For the study, participants wore Fitbit smartwatches for roughly a month while completing short mood surveys known as ecological momentary assessments several times a day.

Researchers combined these data with smartphone location tracking to estimate exposure to heat and air pollutants such as nitrogen dioxide, particulate matter, and sulfur dioxide based on where participants spent time throughout the day.
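One common way to turn location traces into an exposure estimate is a time-weighted average: each place a participant spends time contributes its local pollutant concentration, weighted by hours spent there. The sketch below is a minimal illustration of that idea, with hypothetical names and values; the study's actual exposure models are more involved:

```python
from dataclasses import dataclass

@dataclass
class Stay:
    hours: float     # time spent at this location
    no2_ppb: float   # pollutant concentration there (e.g., NO2, in ppb)

def daily_time_weighted_exposure(stays):
    """Time-weighted average concentration across the day's locations."""
    total_hours = sum(s.hours for s in stays)
    if total_hours == 0:
        return 0.0
    return sum(s.hours * s.no2_ppb for s in stays) / total_hours

# Hypothetical day: home, commute, office (values are illustrative).
day = [Stay(10, 15.0), Stay(2, 40.0), Stay(8, 22.0)]
print(round(daily_time_weighted_exposure(day), 2))  # → 20.3
```

The payoff of this person-level weighting is exactly the shift the authors describe: exposure follows the individual through the day instead of being pinned to a home address or the nearest stationary monitor.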

The findings suggest that this integrated approach is not only feasible but also revealing. On days with higher exposure to heat and nitrogen dioxide, participants showed changes in heart rate variability, a marker of the body’s ability to recover from stress. Higher exposure to sulfur dioxide was associated with increased feelings of nervousness and hopelessness.

Interestingly, higher heat exposure was linked to lower self-reported sadness, a counterintuitive finding that may reflect seasonal patterns in outdoor activity and social engagement during warmer weather, underscoring the need for larger studies to disentangle these effects.

“Even in a small pilot, we could see that the relationship between environmental conditions and people’s physiological and emotional responses is more complex than traditional methods can capture,” said Blum, a medical student at the Icahn School of Medicine at Mount Sinai.

“By combining wearable sensors, GPS data, and real-time surveys, we’re able to build individualized exposure profiles that move with people throughout their day. That’s a real shift from relying on stationary monitors or home addresses.”

“To our knowledge, this is the first study to combine wearable devices, ecological momentary assessment, and continuous GPS tracking to measure environmental exposures and their immediate health impacts,” said senior author Nomura, a distinguished professor of Psychology at the CUNY Graduate Center and Queens College with an appointment at the Icahn School of Medicine at Mount Sinai.

“It’s a small pilot, but it demonstrates an integration between consumer technology and environmental epidemiology that could open the door to personalized approaches for preventive medicine.”

The pilot study also identified areas for improvement, including simplifying the system and increasing participant adherence—lessons that have already been incorporated into the next phase of the research.

Building on these findings, Nomura’s team is now applying the refined system to a larger, National Institutes of Health (NIH)-supported study examining how prenatal and current environmental exposures affect brain development and mental health in adolescents.

The work comes at a critical moment. Exposure to extreme heat and air pollution is increasing, with disproportionate impacts on vulnerable populations, including children, pregnant individuals, people experiencing homelessness, and those with lower socioeconomic status. Children are particularly at risk because environmental exposures can have lasting effects on brain development and behavior.

Beyond research, the approach could have clinical applications. Real-time environmental exposure monitoring could one day help clinicians make more informed decisions about patient care, particularly for individuals with conditions sensitive to heat or air quality.

“This is still early-stage work, and we’re cautious about reading too much into a small sample,” Nomura said. “But improving how we measure exposure is a critical step toward protecting public health, and these results give us confidence that the approach can scale.”

Key magic mushroom ingredient makes fish less aggressive and lazier

More than 200 mushrooms—primarily those belonging to a genus of gilled mushrooms called Psilocybe—contain the psychoactive compound psilocybin. In the brain of mammals, this chemical can bind to serotonin receptors and influence behavior and emotions, including aggression, appetite, and mood. Its effects on the social behavior of animals, however, remain largely undescribed.

In a new Frontiers in Behavioral Neuroscience study, researchers in Canada have tested whether the effects of psilocybin extend to the social behavior of the amphibious mangrove rivulus fish (Kryptolebias marmoratus).

“We show that an acute, low dose of psilocybin significantly reduces activity and aggressive attack behavior during social interactions in adult mangrove rivulus fish, a species that is naturally highly aggressive,” said first author Dayna Forsyth, a research associate and former MSc student at Acadia University in Nova Scotia.

“These findings provide the first evidence that psilocybin can selectively reduce escalated aggression in a vertebrate model without suppressing social interaction,” added senior author Dr. Suzie Currie, a biologist at The University of British Columbia.

Calm waters

Mangrove rivulus fish are innately aggressive, especially when paired with another individual. Their aggressive behaviors are straightforward, and subtle changes can easily be detected. These fish are also self-fertilizing and produce embryos that are genetically identical. This model therefore helps ensure that observed effects are caused by psilocybin treatment rather than by genetic differences between fish.

The team used three genetically distinct, laboratory-bred lines. Fish from one line were exposed to psilocybin, while fish from a second line served as stimulus fish. A third line was used to quantify whole-body concentrations and absorption of psilocybin.

For the first phase of the experiment, the focal fish was added into a tank containing a stimulus fish to measure baseline behavior. The fish were separated by an opaque cover placed over a fiberglass mesh barrier through which the fish could see and smell, but not reach, each other. After a five-minute adjustment period to the shared tank, the opaque barrier was removed and interaction monitored.

Twenty-four hours later, the same focal fish was put in a water tank in which psilocybin was dissolved. After exposure to the substance for 20 minutes, the fish was added into the tank occupied by the same stimulus fish of the day before. After removal of the opaque barrier, interaction was observed again.

Magic mushroom, mellow fish

Observation of behaviors to measure activity (time spent moving) and aggression levels (including swimming bursts) revealed that fish dosed with psilocybin showed decreased levels of activity and performed fewer swimming bursts compared to specimens that hadn’t received psilocybin treatment.

“Swimming bursts are high‑energy attack behaviors that represent an escalation of aggression towards the stimulus fish without making physical contact,” explained Currie. “Other types of aggressive behaviors, like head‑on displays, are more about communication and social assessment and require very little energy.”

“Psilocybin’s calming effect appears to selectively reduce energetically costly, escalated behaviors while lower‑energy social display behaviors remained largely unchanged,” said Forsyth. “This suggests that this compound can selectively dampen escalated social conflict rather than shutting down behavior altogether.”

Psilocybin also influenced activity levels, with dosed fish spending less time moving than control fish when paired with a conspecific.

Diving deeper

In the long run, non-human models in drug-screening experiments like this can provide robust results that can later be translated to humans. In the future, findings like those made here could help inform therapeutic research by clarifying which aspects of social behavior are most sensitive to psilocybin.

The team cautioned, however, that the current study did not test clinical treatments and results from fish cannot be directly extrapolated to humans.

The study also focused on single doses and short periods of exposure, and didn’t examine long-term effects, repeated dosing, or adaptation over time. Future studies are needed to confirm whether the lower level of aggression observed here can be sustained.

“Future studies can build on this work to explore how psilocybin alters neural signaling, which serotonin pathways are involved, and why some aspects of social behavior are affected while others are not,” concluded Currie. “These are questions that are difficult or impossible to answer directly in humans.”

Large study finds a strong link between depression and problem cannabis use

A new meta-analysis of 55 studies involving more than 3 million people has found that 31% of individuals with cannabis use disorder (CUD) also struggle with major depressive disorder (MDD). While a link between these two conditions has been known for some time, this study provides the clearest evidence to date that the relationship goes both ways. CUD was also found to be present in 10% of those with MDD.

The findings are published in the Journal of Psychiatric Research.

An international team of scientists searched major databases to identify relevant studies published in English and Portuguese through to 2024. They used mathematical models to combine results from millions of people, ensuring they accounted for differences in age, gender, and location.

Setting matters

The team discovered that the overlap between MDD and CUD varies significantly depending on the setting. For example, in the community (among the general public or volunteers in the studies), rates of cannabis use disorder among people with depression are relatively low. However, in psychiatric clinics, the connection is much stronger. More than 28% of patients being treated for depression also meet the criteria for CUD.

The meta-analysis also revealed that the two disorders are often linked throughout a person’s life, even if they are not present at the same time. While 20% of those with cannabis use disorder were found to be depressed at the time they participated in their respective study, 35% had struggled with depression at some point during their lives.

The problem of overlapping symptoms

Despite these findings, the researchers note that several limitations may prevent us from getting a clear picture of what is happening. One major challenge is diagnostic overlap: symptoms of cannabis withdrawal, such as anxiety, irritability, and sleep disturbances, closely resemble the clinical signs of depression. This makes it difficult for doctors to determine whether someone is suffering from a depressive disorder or from the effects of their cannabis use.

Additionally, much of the data comes from North America, so it may not reflect conditions in other countries or cultures.

Future screening

However, due to the high percentages involved, the study authors recommend that health care providers regularly screen for cannabis use in depressed patients. Likewise, they suggest evaluating depression levels in those seeking help for CUD.

“Differences between psychiatric and community samples—especially the markedly higher current CUD prevalence in patients with MDD—underscore the need for systematic screening across treatment settings,” wrote the team in their paper.

Because the two disorders appear deeply linked, catching one early may prevent the other from worsening.

Digital therapy outperforms referrals to campus clinics among college students

College students with anxiety, depression and eating disorders may be more likely to start therapy, and to respond better to it, when it is offered via a digital app rather than through referrals to in-person campus clinics, according to a study led by Penn State researchers and published in the journal Nature Human Behaviour.

Globally, an estimated 40% to 60% of college students experience a mental health disorder at some point, and the need for campus counseling services has increased faster than institutions’ capacity to provide these services, according to the researchers.

The research team wanted to see if a proactive intervention using a digital therapy app could effectively treat anxiety disorders, depression and eating disorders, as well as address the increased need for psychological services.

How the digital therapy app works

The commercially available app incorporates cognitive behavioral therapy (CBT) principles that coach individuals through identifying negative thinking patterns and developing skills and behavioral changes to address these patterns.

The researchers found that students receiving the digital intervention were more likely to report being symptom-free at the six-week, six-month and two-year marks, and were more likely to engage with these services than the campus referral group.

Specifically, service uptake—or when a person actually receives a service—was seven times greater for college students assigned to the digital intervention than for those referred to on-campus clinics. Approximately 74% of individuals given access to the digital intervention started the program, compared to 30% of individuals who were given a referral to a campus clinic and went on to receive at least one therapy session or a new medication prescription.
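A note on the arithmetic: 74% is only about 2.5 times 30%, so the "seven times greater" figure is consistent with the comparison being made on the odds scale rather than as a simple ratio of percentages. That reading is an inference, not something the study text states here, but it checks out numerically:

```python
# Uptake rates as reported in the study
p_digital, p_referral = 0.74, 0.30

def odds(p):
    # Convert a probability to odds, e.g. 0.74 -> 0.74/0.26
    return p / (1 - p)

odds_ratio = odds(p_digital) / odds(p_referral)
print(round(odds_ratio, 1))  # about 6.6, i.e. roughly "seven times" on the odds scale
```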

“One of the challenges with any digital intervention is that people sometimes download an app but then do not use it,” said lead author Michelle Newman, professor of psychology and psychiatry at Penn State.

“We were also interested in learning the extent to which people actually received services after being randomized to the app or on-campus counseling center. We found that uptake was significantly better in the digital intervention than referral to the counseling center.”

How the study was conducted

To test the effectiveness of the digital intervention, the researchers worked with 26 colleges and universities across the U.S. to send an email to the entire student body—what researchers call a population-level approach—inviting them to take part in a mental health screening.

Of the 39,194 individuals who completed the screening, 6,205 had clinical levels of or were at high risk of developing generalized anxiety disorder, panic disorder, social anxiety disorder, depression or an eating disorder. Those individuals completed an additional baseline survey and were randomized into one of two groups. One group received access to the coached digital intervention for six months, while the other group received referrals to their campus counseling center.

The therapy app offered six to eight 20-minute-long modules for each mental health problem. Participants in the digital therapy group completed an average of 2.4 modules and received about 15 messages from a trained therapy coach.

Newman explained that individuals in the digital therapy group began with modules addressing their main mental health concern and then worked with their coaches to receive additional modules that addressed co-occurring issues.

Measuring multiple disorders over time

“A unique aspect of the work was that we screened for five disorders—generalized anxiety disorder, social anxiety disorder, panic disorder, depression and eating disorders—and measured all disorders at every point in the treatment, because we know that disorders like depression and anxiety often co-occur, but that co-occurrence doesn’t necessarily happen simultaneously,” Newman said.

“The digital intervention overall had a significantly larger number of individuals who had no disorders at every timepoint in the study. We did not just treat individuals with clinical levels of these disorders, but we also prevented the onset in more of those in the digital intervention who screened to be at risk.”

For example, compared to the campus referral group, those who used the digital intervention had a 4.3% lower prevalence of having any mental health disorder at the six-week mark, 4.9% lower prevalence at the six-month mark and 3.8% lower prevalence at the two-year follow-up.

This result showed that the coached digital intervention both prevented the development of new disorders and treated disorders that were present before the intervention.

Implications beyond the pandemic

The researchers conducted the study during the height of the COVID-19 pandemic, recruiting participants from October 2019 to November 2021 and completing their data collection by October 2023. The results, they said, highlight the effectiveness of digital interventions at times when access to traditional, in-person services may have been constrained.

The population-level screening and digital therapy approach can complement existing in-person services beyond college campuses, Newman said.

“This approach could potentially be used anywhere where you have access to a full population in terms of email addresses, like at a company, to help disseminate mental health services that people might not think about seeking,” she said, explaining that the proactive screening process taken in the study helped individuals prevent disorders for which they were at high risk of developing and treated disorders for which they may not have sought face-to-face services.

Future work to personalize treatment

Next, Penn State graduate student Adam Calderon and Newman will use data from the current study and from previous work by Newman’s lab to examine which individual characteristics may predict who would benefit from digital interventions, Newman said.

Can AI-embodied surgical robots revolutionize surgery?

Embedding next-generation AI in surgical robots could safely augment surgical practice if ethical and regulatory questions are addressed, say experts writing in Frontiers in Science. A team of pioneering surgeons and researchers from King’s College London says AI-enhanced surgical robotics could enable “true personalized surgery” and enhance the performance, situational awareness, decision-making, and effectiveness of surgical teams.

Their analysis also addresses regulatory questions including reducing risks from systems that continue to learn and change after approval. It also tackles how we can prevent dataset biases from reinforcing inequalities, and how we address the concentration of research and industry in resource-rich nations.

Lead author and robotic urological surgeon Prof Prokar Dasgupta, formerly of King’s College London and Guy’s Hospital, London—who recently performed the UK’s first long-distance robotic operation—said, “Using advanced AI and robotics in the operating room is very exciting. The next few years will see intelligent robots impact all stages of surgery—including techniques, emergency responses, team roles, workflows, and assistive functions.”

The authors warn that AI must sustain, not disrupt, operating rooms, and should support advances and refinement in surgical skill, procedure, and technology. Most importantly, its use should be safeguarded by robust human and regulatory oversight, with surgeons remaining the chief decision-makers.

Prof Dasgupta added, “With AI’s promise comes profound implications for clinical practice and the continued safe function of surgical teams. These warrant multistakeholder discussion to ensure clarity of liability, minimization of bias, integration of autonomous robotic systems within surgical teams, global equity, and robust product regulation.”

True personalized surgery

Anticipated advances include AI embedded into surgical robots, known as “embodied AI,” linked to sensor-equipped operating rooms that generate spatial understanding, adaptive learning, performance benchmarking, autonomous surgical assistance, and feedback to teams mid-operation.

Future surgical AI will also harness new data streams—gathered from patients, surgical teams, and sensors in robots—to provide real-time mid-operation guidance and decision support to optimize surgical actions.

Predictive AI could also allow surgeons to accurately visualize the outcomes of various actions before taking them—called cause-and-effect recognition. This could in the future be used to help improve patient outcomes.

First author Dr. Alejandro Granados from King’s College London said, “Surgery is on the brink of a profound transformation, where technology will not only help predict outcomes but also guide clinicians toward the most optimal, personalized treatment for each patient.”

Regulating adaptive systems

Currently, regulators authorize medical technologies in the form in which they are submitted for approval, but AI-embodied surgical robots present a challenge given their ability to learn, adapt, and change post-approval.

To address this challenge, the authors call for regulatory reforms, including changes to licensing pathways, device classifications, post-market monitoring, and compliance standards to better serve the higher risk profile of changing systems.

Dr. Granados said, “AI’s ability to learn presents an unprecedented puzzle. We are at a pivotal time in surgery where we need to begin answering those questions to ensure patients can benefit from the wealth of benefits AI-powered operating rooms bring.”

Clinical trials, the paper asserts, should adopt standardized metrics for evaluating AI software and assessing human–AI and human–robot interactions. It also recommends that regulators work alongside professional bodies to oversee surgical training as practice transitions from clinical expertise to data-driven approaches.

The paper further recommends new models of collaboration between academia, industry and health care systems in lower-income countries to build cost-effective AI and robotic ecosystems from which all can benefit.

Prof Dasgupta said, “We require a new set of frameworks—spanning regulatory and compliance, trial methods, reporting standards, and training approaches—to ensure the ongoing safety and effectiveness of robotics and AI in surgery.”

Dr. Granados said, “Realizing this vision on a global scale will require careful stewardship. We must ensure that health care professionals and patients everywhere can benefit equitably from the compelling potential of AI and robotics innovation that is coming.”

Human decision-makers

The authors expect future iterations of robotics to operate with ever-greater degrees of autonomy while maintaining essential human oversight.

They describe how the surgeon’s role will shift towards supervision, coordination and high-level decision-making, while nurses, anesthetists and assistants can expect to gain additional skills. They also expect surgical teams to be complemented by clinical data scientists plus AI and robotic integration engineers.

Prof Dasgupta said, “Human surgeons must continue to be the chief decision-makers, and insights from AI models must be presented differently to members of the surgical team, based on their role, if we are to maintain the clear chain of authority necessary for safe surgical practice.”

Dr. Granados said, “AI and robotics, strategically deployed in the operating room, will form the foundation of the shift towards systems that learn from every procedure, support surgical teams in real time, and potentially deliver safer, more precise, and better outcomes for patients.

“However, we must ensure that human judgment remains central, while addressing today’s unmet surgical needs and disparities in who benefits from access.”