3D-printed brain sensors may unlock personalized neural monitoring

Soft electrodes designed to perfectly match a person’s brain surface may help advance neural interfaces for neurodegenerative disease monitoring and treatment, according to a new study led by Penn State researchers. Neural interfaces are powered by tiny sensors, known as bioelectrodes, that track biophysical signals. These sensors are usually made from stiff materials in a one-size-fits-all design that struggles to match the brain’s complex structure. The researchers have created a novel approach to 3D printing bioelectrodes that can stretch and morph to fit the minor differences that make every brain unique.

Simulating unique brain structures

The team used software to simulate detailed brains based on MRI scans taken from 21 human patients, shaping a set of electrodes tailored to each brain’s specific structure before 3D printing the electrodes and models of the brains. In a paper published in Advanced Materials, they reported that their electrodes fit the structure of the brain better than traditional designs, while remaining effective and biologically compatible, even in tests done in rats.

The folds in the human brain are created through a process known as gyrification, where the cortical sheet on the outer wall of the brain bunches up into ridges, known as gyri, and grooves, known as sulci. This helps cells across the brain communicate at high speeds, and allows for a relatively large organ to fit compactly in the skull—a spread-out adult brain would be around 2,000 square centimeters, or about the size of two large pizzas.

Why one-size-fits-all falls short

Although the major cortical folds are consistent across individuals, the precise layout of the brain’s gyri and sulci changes substantially from person to person, according to Tao Zhou, Wormley Family Early Career Professor, assistant professor of engineering science and mechanics and corresponding author on the paper. However, traditional bioelectrode designs don’t take this into account.

“Each person has a different brain structure, depending on their height, weight, age, sex and more,” said Zhou, who also holds an affiliation in biomedical engineering and the Center for Neural Engineering at Penn State. “Despite this, we try to fit neural interfaces onto brains like they have identical structures. This motivated us to create electrodes that are tailored for each individual, based on the structure of their brain.”

Hydrogel and honeycomb design

The electrodes are built mainly from a water-rich material known as hydrogel to better match the soft tissues and patient-specific geometry of a brain. Furthermore, the team used a novel honeycomb-inspired structure that offers flexibility and strength, while remaining cost-effective and quick to print, according to Zhou.

“The honeycomb structure helps us significantly reduce the stiffness of the electrodes, without sacrificing their mechanical strength,” Zhou said. “What’s more, the structure helps us reduce the overall material used during fabrication, reducing production time, cost and environmental impact.”
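
For readers curious how a honeycomb can be soft yet strong, the classic Gibson-Ashby scaling laws for a regular hexagonal honeycomb give a rough intuition. This is textbook cellular-solids theory, not a formula from the paper, and the printed lattice’s exact geometry may differ:

```latex
% In-plane stiffness and relative density of a regular hexagonal honeycomb
% (Gibson & Ashby): E_s is the solid's Young's modulus, t the wall
% thickness, l the wall length.
\frac{E^{*}}{E_{s}} \approx 2.3 \left(\frac{t}{l}\right)^{3},
\qquad
\frac{\rho^{*}}{\rho_{s}} \approx \frac{2}{\sqrt{3}} \, \frac{t}{l}
```

Because effective stiffness falls with the cube of the wall-thickness ratio while material use falls only linearly, a thin-walled lattice can be dramatically softer than bulk material while consuming far less ink, consistent with Zhou’s description.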

From MRI scan to 3D-printed match

Production starts with an MRI scan of a patient’s brain, which is used to conduct finite element analysis—a process that creates a detailed simulation of a person’s neural structure. This analysis is then rendered as a 3D model of the patient’s brain, on which the team uses computer software to tailor a bioelectrode morphed to fit the ridges and grooves of the cerebral cortex.

After shaping, the team 3D prints the hydrogel electrode using direct ink printing, a technique that can create electrodes capable of monitoring and transmitting brain signals over a relatively small surface. For this study, the team 3D printed models of 21 different participant brains, applying their electrodes and physically measuring how accurately the electrodes could fit the brain surface. Zhou explained how traditional fabrication approaches require specialized facilities like clean rooms, making them incredibly expensive to customize—3D printing allows the team to personalize and manufacture electrodes much faster, for a fraction of the price.
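
As a concrete illustration of that fit measurement, the sketch below scores how closely an electrode surface conforms to a cortical surface by computing nearest-neighbor gap distances between two point clouds. The function names and toy geometry are ours, not the paper’s, which relied on finite element analysis and physical measurements:

```python
# Hypothetical conformity score: mean gap between an electrode surface and a
# cortical surface, each represented as a 3D point cloud (units: mm).
import numpy as np
from scipy.spatial import cKDTree

def conformity_score(electrode_pts: np.ndarray, cortex_pts: np.ndarray) -> float:
    """Mean distance from each electrode point to its nearest cortex point."""
    gaps, _ = cKDTree(cortex_pts).query(electrode_pts)
    return float(gaps.mean())

# Toy geometry: a wavy "cortex" with ~2 mm gyri/sulci, plus two electrode designs.
x, y = np.meshgrid(np.linspace(0, 20, 60), np.linspace(0, 20, 60))
cortex = np.column_stack([x.ravel(), y.ravel(), 2.0 * np.sin(x.ravel())])
flat = np.column_stack([x.ravel(), y.ravel(), np.full(x.size, 2.0)])  # one-size-fits-all
tailored = cortex + [0.0, 0.0, 0.1]  # printed to follow the folds, 0.1 mm standoff

print(f"flat design mean gap:     {conformity_score(flat, cortex):.2f} mm")
print(f"tailored design mean gap: {conformity_score(tailored, cortex):.2f} mm")
```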

Softer contact, stronger brain signals

Compared to traditional approaches, the hydrogel-based electrodes follow the structure of the brain more precisely. Zhou said their approach produces electrodes that exhibit nearly perfect connectivity to the electrical signals present in the brain. Additionally, because the stretchy gel is so malleable, it can be applied to soft brain tissue without causing damage, unlike the stiff materials in other designs, which can injure tissue.

According to Zhou, the softness of their electrodes enables closer and more stable contact with the brain, in turn facilitating higher-quality, more reliable monitoring. Moreover, bioelectrodes made with this approach don’t impact fluid transport around the brain, a critical aspect of brain function that many traditional electrodes disrupt.

“Personalizing the electrodes to the brain’s specific structure substantially improves their reliability,” Zhou said. “Because they conform to the brain better, the signal quality itself is significantly improved.”

Testing in rats and future use

To further study their electrodes, the team placed them onto the brains of rat models over a period of 28 days. The rats did not exhibit any immune response to the printed electrodes, a key consideration in biodevice development, Zhou said. Additionally, the electrodes did not exhibit performance degradation, while offering sensitive and accurate readings of the electric and physiological signals in the brain.

Zhou said he believes that this printing method could serve as a framework for the commercial-scale printing of bioelectrodes customized for specific patients. Although these systems are traditionally used for monitoring neural activity, the team plans to explore how personalized electrodes may contribute to neurological treatments.

“We are looking to further improve this technology to optimize the electrodes to monitor for specific diseases,” Zhou said. “In the future, we would really like to work with patients to see how this approach could support brain monitoring and disease treatment in clinical settings.”

How a tiny circle of repeat offenders poisoned hundreds of gold-standard medical trials for over a decade

Randomized controlled trials (RCTs) are the gold standard of medical research because random assignment helps eliminate bias and yields the most reliable evidence on whether a treatment truly works. Since RCTs sit at the top of the evidence hierarchy, retractions can send ripple effects across the entire system. A fraudulent study with fabricated data or results can undermine the credibility of systematic reviews and meta-analyses, and those distortions can quietly shape the clinical practice guidelines that influence real-world medical care.

In a recent study, researchers set out to investigate how many retracted randomized clinical trials were linked to superretractors (authors with the most retractions) and to highly cited authors with multiple retractions.

They found that just six superretractors were co-authors on 22% of all retracted clinical trials studied; five were based in Japan and one in Germany. In addition, a group of 18 top-cited scientists was involved in 25% of all retracted trials. The retractions were highly concentrated in specific areas such as anesthesiology, endocrinology and metabolism.

The findings are published in JAMA Network Open, accompanied by an invited commentary.

Identifying the superretractor concentration

To become a superretractor, a researcher must first produce large volumes of unreliable, duplicate, or fabricated work, often fueled by the publish-or-perish system of academia, which rewards output over rigor and lacks strong oversight. Second, that misconduct has to be uncovered through investigation and exposure.

Superretractors can also act as superspreaders of contaminated research. When flawed or fabricated trials enter systematic reviews and meta-analyses, they are amplified and woven into widely used evidence summaries. By the time a study is retracted, it has often already shaped the reviews and analyses used to develop the clinical guidelines that doctors rely on. The result is a cascade of distorted evidence that can translate into incorrect, even harmful, decisions in patient care.

A major concern is the rise of zombie studies—research that appears fake or lacks credible data yet remains in the medical literature, often without being retracted by journals as it should be.

By pinpointing a small group of highly cited, influential authors behind many retractions, researchers can more quickly flag fraudulent and zombie literature at scale and trace related problematic studies through their co-author networks.

So, in this study, the researchers used a dataset called VITALITY, which includes 1,330 randomized clinical trials (RCTs) that had been retracted as of late 2024. They focused on three particular groups of scientists: superretractors on the Retraction Watch Leaderboard, scientists in the top 2% of their subfield by citations who have 10 or more retractions not due to editor or publisher errors, and top-cited scientists in 2024 who also have 10 or more such retractions.

They found that a very small group of people was responsible for a disproportionately large number of retracted medical trials. Of the 30 global superretractors, six individuals co-authored 290 retracted trials, and among the 163 highly influential scientists, just 18 were linked to 327 retracted trials.
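
Those shares are easy to check against the 1,330-trial dataset; a quick arithmetic sanity check using only the numbers reported above:

```python
# Counts reported from the VITALITY dataset of retracted randomized trials.
total_retracted_rcts = 1330
by_six_superretractors = 290   # 6 of the 30 global superretractors
by_eighteen_top_cited = 327    # 18 of the 163 highly cited scientists

print(f"superretractor share: {by_six_superretractors / total_retracted_rcts:.0%}")  # ~22%
print(f"top-cited share:      {by_eighteen_top_cited / total_retracted_rcts:.0%}")   # ~25%
```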

Also, papers written by these high-profile authors remained in the scientific literature far longer before being retracted, taking an average of about 14 years, compared with just over a year for other researchers. As a result, these papers accumulated far more citations, allowing potentially flawed findings to spread more widely through the scientific community. Many of these authors also co-authored papers with other scientists whose work was later retracted.

These findings point to a clear need for systematic approaches to actively trace how untrustworthy data spreads and to prevent its continued contamination of the scientific record. The information highlighted in this study can guide journal editors, funders, and institutions in identifying high-risk authors and fields, directing attention where it is needed most.

Cutting calories to slow aging—without compromising health

Restricting calorie intake in species such as mice, rhesus monkeys, and fruit flies has been shown to extend their lifespans. In some cases, these animals not only live longer, but are also free of disease. But when pushed too far, calorie restriction can have negative impacts. Mice that undergo a 40% reduction in calorie intake, for example, are more susceptible to infections, less likely to reproduce, and experience stunted growth.

Scientists have wondered whether there is a way to reap the longevity benefits of calorie restriction in humans without its negative repercussions. And in a new study published in Nature Aging, they found a potential answer in an immune-related protein called complement component 3 (C3).

Yale researchers have previously shown that people who underwent moderate calorie restriction—a 14% reduction in calorie intake—for two years developed better immune defenses without any growth or reproductive trade-offs.

“This concept demonstrates that aging is actually malleable and a process that can be targeted,” says senior author Vishwa Deep Dixit, Ph.D., Waldemar Von Zedtwitz Professor of Pathology, professor of immunobiology and of comparative medicine, and director of the Yale Center for Research on Aging (Y-Age) at Yale School of Medicine.

Calorie restriction reduces inflammation-related protein

In the new study, Dixit and his colleagues at YSM analyzed the plasma samples of 42 individuals who took part in a two-year trial called the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy or CALERIE.

“It’s the only trial of its kind that has been done with such rigor and control and demonstrates relevance to human physiology,” Dixit says. During the trial, participants were able to reduce their calorie intake by 11 to 14% without feeling deprived.

In their analysis, the researchers detected more than 7,000 proteins in the longitudinal plasma samples. Among them was an immune-related protein called C3 that was significantly reduced following calorie restriction. C3 was of particular interest to the scientists as prior studies have suggested that activation of the complement system—a network of proteins involved in the defense against pathogens—could drive chronic inflammation, a major hallmark of aging and age-associated diseases.

“But the causal effects of C3 in aging and chronic inflammation have not been identified. So, we were very excited to find that in our study,” says Hee-Hoon Kim, Ph.D., a postdoctoral associate in the Dixit lab and a co-first author of the paper.
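
For context, calling one protein “significantly reduced” among more than 7,000 measured proteins typically involves a paired before-and-after test per protein followed by a multiple-testing correction. Below is a minimal sketch of that generic workflow on simulated data; the study’s actual proteomic pipeline is not described here and surely differs:

```python
# Generic sketch: paired before/after tests across thousands of proteins,
# with Benjamini-Hochberg false-discovery-rate control.
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_proteins, n_subjects = 7000, 42
baseline = rng.normal(0.0, 1.0, (n_proteins, n_subjects))
followup = baseline + rng.normal(0.0, 1.0, (n_proteins, n_subjects))
followup[0] -= 1.5  # simulate one protein (a stand-in for C3) falling after restriction

pvals = ttest_rel(followup, baseline, axis=1).pvalue  # one paired test per protein
significant = multipletests(pvals, alpha=0.05, method="fdr_bh")[0]
print(f"proteins passing FDR < 0.05: {significant.sum()}; C3 stand-in flagged: {significant[0]}")
```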

A target to slow aging

When comparing the protein levels before and after two years of calorie restriction, the researchers identified white adipose tissue—the main type of fat tissue in mammals—as the primary site affected by calorie restriction.

The researchers confirmed their findings in animals. As with the human plasma, they found that C3 expression increased with age in mice. Further biochemical analyses showed that visceral white adipose tissue was responsible for an increase in C3 during aging.

“We were not expecting that because these proteins are mainly synthesized in the liver,” says Manish Mishra, Ph.D., a postdoctoral associate in the Dixit lab and a co-first author of the study.

Single-cell RNA sequencing further revealed that the protein is produced by age-associated macrophages—essential white blood cells—within the adipose tissues.

“This whole process was unknown in the beginning,” Mishra says. “Just to narrow it down to the subtypes of macrophages responsible for this complement protein production was very challenging.”

Macrophages are the body’s first line of immune defense, mostly known for their role in engulfing pathogens. These immune cells also help maintain the balance of tissue functions, Dixit adds.

The question is whether the benefits gained from a reduction in C3 can be achieved without weight loss.

The researchers initially suspected that the shedding of adipose tissue or body fat due to weight loss may have stalled C3 production and slowed down the aging process. After all, most of the study participants lost about 18 pounds after two years of moderate calorie restriction.

However, when the researchers analyzed the body mass index of the study participants, they did not observe any correlation between weight loss and a decrease in complement proteins.

“This suggests that calorie restriction has a beneficial effect that is unique to adipose tissues and is likely independent of weight loss,” Kim says.

Further, when the researchers inhibited C3 activation using a drug to mimic the effect of calorie restriction, the mice experienced less age-related inflammation.

The finding demonstrates that what is beneficial early on in life can be detrimental later on, Dixit says. This theory, known as antagonistic pleiotropy, was proposed by evolutionary biologist George C. Williams in 1957 to describe the aging process. A prime example of this theory is growth hormone production, which is essential in early development but could also drive cancer later in life.

Proteins like C3 are evolutionarily designed to protect us from infections, but as humans live much longer than their ancestors, these molecules can come back to harm us. Lowering the level of C3 proteins may be the key to enhancing health span, Dixit says.

The researchers are now investigating whether they could hold back C3 production to slow down aging in humans using FDA-approved inhibitor drugs. “The idea is not to remove complement systems that are required for us to fight infections,” Dixit says. “Instead, the goal is to restore the balance.”

A major pregnancy scare collapses: Tylenol shows no autism risk in more than 1.5 million children

Acetaminophen, which also goes by names like paracetamol or Tylenol, is a common over-the-counter pain reliever and fever reducer. It is often prescribed during pregnancy to help with mild to moderate pain. Recently, there has been a lot of discourse about its safety. Claims have been made suggesting that taking acetaminophen during pregnancy may increase the likelihood of autism in children.

A large study from Denmark adds clarity to this debate, finding no increased risk of autism in children exposed to acetaminophen before birth. These results were consistent across both the general population analysis and sibling-matched comparisons, and did not vary with the timing or dosage of exposure. The findings are published in JAMA Pediatrics.

Risk or no risk

Public concern over whether taking acetaminophen during pregnancy could increase autism risk quickly moved into the spotlight—dominating news channels, newspapers, and social media. The debate became so intense that in September 2025, the U.S. Food and Drug Administration advised clinicians to consider limiting its use for routine low-grade fevers during pregnancy.

Yet the scientific evidence remained conflicted—some studies have reported a small risk, while others have found no link at all. A large study from Sweden initially reported a slight rise in autism risk when analyzing the general population. However, when the same researchers compared siblings within the same families—a method that accounts for shared genetic and environmental factors—the apparent link disappeared.

A sibling-matched analysis is a research method that scientists use to compare siblings from the same family to better understand how a particular exposure affects an outcome. Since siblings share many of the same genes and a common home environment, this approach helps account for hidden factors that might otherwise influence the results.

In search of more definitive evidence, the researchers carried out a nationwide cohort study in Denmark. They examined all children born from single pregnancies between 1997 and 2022, using official national health records to follow more than 1.5 million children over time. The analysis focused only on children who were alive at age one and excluded cases with missing data or conditions already known to be linked to autism.

The researchers found that taking acetaminophen during pregnancy is not linked to an increased risk of autism in children. The findings were the same for both the general-population and sibling comparisons. The adjusted hazard ratio—a measure of relative risk between two groups—was close to 1, indicating no increased risk.
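
To make the sibling design and the hazard ratio concrete: a sibling comparison is often implemented as a Cox model stratified by family, and an adjusted hazard ratio near 1 means exposed and unexposed children fare essentially the same. A hedged sketch on simulated data using the lifelines library follows; the study’s actual covariates and model specification are not reproduced here:

```python
# Sketch: Cox model stratified by family, mimicking a sibling-matched analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n_pairs = 5000
df = pd.DataFrame({
    "family_id": np.repeat(np.arange(n_pairs), 2),
    "exposed": np.tile([0, 1], n_pairs),  # prenatal acetaminophen: no / yes
    "followup_years": rng.exponential(20.0, 2 * n_pairs).clip(0.1, 25.0),
    # Outcome simulated independently of exposure, so the true hazard ratio is 1.
    "event": (rng.random(2 * n_pairs) < 0.10).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="event", strata=["family_id"])
print(cph.hazard_ratios_)  # expect "exposed" close to 1.0, i.e., no association
```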

The study closely examined both the amount of medication taken—low, medium, or high—and the stage of pregnancy during which it was used, covering each trimester. Across all these variations, the team found no evidence of an increased risk.

These findings could help ease anxiety among parents while also giving medical practitioners the evidence they need to clearly explain to patients and support their informed decision-making.

Your phone already sees the warning signs: Sleep, movement and mood data can spot depression early

Depression is among the most widespread mental health disorders worldwide, affecting an estimated 1 in 20 people. It is characterized by persistent sadness, hopelessness, disrupted sleep patterns, changes in appetite and a loss of interest in everyday activities.

While there are now various treatments for depression, including different types of antidepressant medications and psychotherapeutic approaches, not all depressed individuals have access to these resources or benefit from them. Reliably detecting the first signs of depression could be highly advantageous, as it could ultimately allow mental health services to intervene early, before symptoms worsen and the disorder becomes debilitating.

The analysis of data collected by smartphones, smartwatches and other wearable devices could potentially help to detect some early signs of depression, such as a lower mood, increased stress levels and behavioral changes. While various past studies explored the potential of mobile technologies for the early detection of depressive symptoms, the factors influencing the effectiveness of these tools remain poorly understood.

Researchers at Ghent University recently set out to better understand what contributes to the effectiveness of these technology-based solutions, by reviewing earlier papers that assessed their potential. The team’s review paper, published in Nature Mental Health, pinpoints types of data that are particularly helpful for detecting signs of depression, while also identifying computational models that appear to be the most effective for this specific application.

“Early detection of depressive symptom changes is vital for timely interventions,” Yannick Vander Zwalmen and Matthias Maerevoet wrote in their paper. “Mobile and wearable technologies enable continuous, unobtrusive monitoring of behavioral, psychological and physiological data, offering new possibilities for digital phenotyping and just-in-time prediction of depression. This scoping review synthesized findings from 52 studies to identify commonly used features, evaluate their predictive value and assess methodological approaches.”

Using smartphone data to predict mood changes

Vander Zwalmen, Maerevoet and their colleagues reviewed 52 past research studies that focused on predicting early signs of depression. These studies collected data using smartphones or wearable devices, then analyzed it with computational models to predict early signs of depression.

The data collected included movement and location information, sleep patterns, physical activity patterns, communication patterns (i.e., how many calls users made and how many messages they sent or received), heart rate variability (HRV) and self-reported mood ratings. By reviewing the findings of earlier studies, the team tried to identify the data patterns that were most closely linked to early symptoms of depression.

“Features such as time spent at home, sleep variability and reduced mobility were strongly associated with depressive symptoms,” wrote the authors. “Combining physiological, behavioral and self-report data enhanced predictive performance. Personalized models and anomaly detection approaches outperformed generalized ones in predicting individual symptom changes.”

The researchers’ analyses revealed that depression symptoms were typically linked with irregular sleep patterns, a reduction in movement, little physical activity and a self-reported bad mood. In addition, models that were adjusted to consider a user’s unique habits and average biological signals appeared to predict early signs of depression better than general models.
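
To illustrate what a “personalized model” can mean in practice, the sketch below scores each new day against a user’s own rolling baseline and flags days when sleep, mobility and mood jointly drift from that person’s norm. The feature names and thresholds are illustrative, not taken from any reviewed study:

```python
# Per-user anomaly detection: z-score each day against the user's own history.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
days = pd.date_range("2025-01-01", periods=90, freq="D")
df = pd.DataFrame({
    "time_at_home_h": rng.normal(12.0, 1.5, 90),
    "sleep_onset_variability_h": rng.normal(0.5, 0.2, 90),
    "distance_travelled_km": rng.normal(8.0, 2.0, 90),
    "self_reported_mood": rng.normal(7.0, 1.0, 90),  # 0-10 scale
}, index=days)
df.iloc[-14:] += [3.0, 0.6, -4.0, -2.5]  # simulate a two-week depressive drift

# Personal baseline: 30-day rolling mean/std, shifted so today is never included.
mean = df.rolling(30, min_periods=14).mean().shift(1)
std = df.rolling(30, min_periods=14).std().shift(1)
z = (df - mean) / std

# Flip signs so "more depressive" is always positive, then average the features.
risk = (z * [1, 1, -1, -1]).mean(axis=1)
print(risk[risk > 1.0].index.date)  # days drifting well beyond the personal norm
```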

Towards better mental health monitoring tools

Overall, this review study confirmed the potential of data collected by portable and wearable devices for the prediction of early depressive symptoms. In the future, it could guide the development of new mental health apps or other technological tools that detect signs of depression and share useful resources or the contacts of local mental health services with users.

“Mobile and wearable data show strong potential for just-in-time depression prediction,” wrote Vander Zwalmen, Maerevoet and their colleagues. “Future research should emphasize new features, diverse populations and personalized models to improve accuracy and real-world applicability.”

Air pollution associated with increased migraine activity

Air pollution is associated with increased migraine activity, according to a study published in Neurology. Both short-term and cumulative exposure to air pollution as well as climate factors such as heat and humidity were associated with increased migraine activity.

The study does not prove that air pollution causes migraine attacks; it only shows an association.

“These results help us to better understand how and when migraine attacks occur,” said study author Ido Peles, MD, of Ben-Gurion University of the Negev in Be’er Sheva, Israel.

“They suggest that for people who have a susceptibility to migraine to begin with, environmental factors may play two roles: intermediate-term factors such as heat and humidity may modify the risk for attacks, while short-term factors such as spikes in pollution levels may trigger attacks.”

How the long-term study was done

The study involved 7,032 people with migraine who lived in Be’er Sheva in the Negev desert and were followed for an average of 10 years.

Researchers looked at daily exposure to air pollution from traffic, industry and dust storms, as well as weather conditions. Then they looked at how often and when people visited the hospital or a primary care office with an acute migraine and compared that to the pollution and weather conditions that day and up to seven days earlier, since pollution may take a few days to affect the body.

They also looked at the relationship between cumulative air pollution exposure and migraine activity. As another measure of migraine activity, researchers checked pharmacy records to see how many doses of migraine medications called triptans participants needed.
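
The “up to seven days earlier” design is a standard distributed-lag setup. Here is a minimal sketch of how such lagged exposure features are typically assembled before modeling; the column names and simulated values are ours:

```python
# Build same-day and 1- to 7-day-lagged pollution features per calendar day.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
days = pd.date_range("2015-01-01", periods=365, freq="D")
daily = pd.DataFrame({
    "pm10": rng.gamma(4.0, 15.0, 365),   # µg/m³
    "pm2_5": rng.gamma(4.0, 6.0, 365),   # µg/m³
    "no2_ppb": rng.gamma(3.0, 3.0, 365),
    "migraine_visits": rng.poisson(5, 365),
}, index=days)

for pollutant in ["pm10", "pm2_5", "no2_ppb"]:
    for lag in range(1, 8):  # exposure 1 to 7 days before the visit day
        daily[f"{pollutant}_lag{lag}"] = daily[pollutant].shift(lag)

model_ready = daily.dropna()  # the first 7 days lack a full lag history
print(model_ready.filter(like="pm10").columns.tolist())  # pm10 plus 7 lagged copies
```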

What the migraine data revealed

During the study, 2,215 people, or 32%, had at least one visit to the hospital or clinic for acute migraine. A total of 47% of the people had purchased triptan medications during the study, with average use at two tablets per month and 2.3% of people using 10 or more tablets per month.

The researchers found an association between air pollution and visits to the hospital or clinic for migraine.

On the day with the highest number of visits to the hospital or clinic, air pollution levels were elevated compared to the average over the study period. On that day, the level of particulate matter 10, or PM10, which includes dust, was 119.9 micrograms per cubic meter (µg/m³), compared to an average of 57.9 during the study.

For PM2.5, which includes particles from motor vehicle exhaust and the burning of fuels by power plants and other industries, the level on that day was 27.3 µg/m³, compared to an average of 22.3 during the study. For nitrogen dioxide, or NO2, a gas mostly from traffic emissions, the level on that day was 11.2 parts per billion, compared to an average of 8.7.

The day with the fewest visits to the hospital or clinic also had lower than average pollution levels.

Pollution types and relative risks

After adjusting for other factors that could affect the risk of migraine attacks, such as sex and socioeconomic status, researchers found that people with short-term exposure to high levels of NO2 were 41% more likely to go to the hospital or clinic for migraine than people not exposed to high levels.

People exposed to high levels of solar radiation, or ultraviolet (UV) rays from the sun, were 23% more likely to seek help for migraine than those not exposed to high levels.

People with cumulative exposure to high levels of NO2 were 10% more likely to have high use of migraine drugs than people without cumulative exposure to high levels. People with cumulative exposure to high levels of PM2.5 were 9% more likely to have high use of the drugs.

Researchers found that climate conditions played a role in the effects of pollution. High temperatures and low humidity amplified the effect of NO2, while cold and humid conditions intensified the effect of PM2.5.

Implications for care and prevention

“These findings highlight opportunities for anticipating what care will be needed,” Peles said. “As climate change intensifies the frequency of heat waves, dust storms and pollution episodes, we will need to integrate these environmental risk factors into our guidance for people with migraine.

“When high-risk exposure periods are in the forecast, doctors can advise people to limit their outdoor activity and use air filters, take short-term preventative medications and start using their migraine drugs at the first sign of a problem to ward off attacks.”

Study limitations and who it reflects

A limitation of the study is that exposure to air pollution was measured by monitoring stations and did not take into account individual behaviors such as amount of time spent indoors, use of air conditioning or air filters, type of job and daily activities.

In addition, since the information on migraine activity was gathered through hospital and clinic visits and pharmacy data, the findings mainly reflect people with severe migraine and may not be applicable to people with milder episodes or those who manage their migraine on their own.

Genetic atlas reveals how human liver cells divide their labor

If scientists could shrink themselves to microscopic size and take a journey through the human body—like the submarine crew in the 1966 science fiction classic “Fantastic Voyage”—one of their first stops would no doubt be the liver. The unique structure of our largest internal organ comprises small, hexagonal functional units called lobules, each carrying out more than 500 functions simultaneously. Studies from the 1970s and 1980s revealed that liver cells divide these many tasks among themselves according to their location within each subunit; however, the technology available at the time provided only a blurred picture of this division of labor.

A new high‑resolution liver atlas

In a new study published in Nature, scientists from the Weizmann Institute of Science, together with colleagues at Sheba Medical Center and the Mayo Clinic, present the first genetic atlas of a healthy human liver at a resolution of 2 microns. The findings show that the division of labor in the human liver differs from that of other mammals and is more extensive than previously recognized, helping explain why certain regions of the liver are particularly vulnerable to fatty liver disease.

In recent years, technological advances have made it possible to identify which genes are active in each individual cell while also mapping the cells’ precise spatial positions within the tissue. Still, a comprehensive map of functional division in the human liver remained elusive, largely due to the difficulty of obtaining tissue samples from healthy donors.

Leveraging living liver donations

Researchers in Prof. Shalev Itzkovitz’s group at the Weizmann Institute realized that the solution could come from altruistic living liver donation. Because the liver has a remarkable capacity for regeneration, healthy individuals can donate a substantial portion of their livers to patients in need.

With the help of Prof. Ido Nachmany and Prof. Niv Pencovich from the Department of General Surgery and Transplantation at Sheba Medical Center, and Dr. Timucin Taner from the Mayo Clinic in Minnesota, the researchers obtained eight liver samples from healthy donors and constructed a gene expression atlas of the human liver.

“Thousands of genes were found to be active at different levels in liver cells across various locations, pointing to a far more precise and complex internal organization than previously thought,” says Itzkovitz. “Instead of the coarse division into three functional zones that has been accepted for decades, the atlas reveals eight regions with distinct roles. This precise mapping now enables any laboratory worldwide to dive deep into the liver and investigate why different regions are susceptible to different diseases.

“Metabolic diseases, for example, tend to originate in the center of the lobule, whereas viral and autoimmune inflammations primarily appear at its periphery. Likewise, liver cancer and metastases from other cancers have their preferred locations. The key to understanding these patterns lies in the detailed genetic data we have collected.”
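
Zonation analyses of this kind often summarize expression along the lobule’s periphery-to-center axis: bin each cell by its normalized radial position, then average gene expression per bin. Below is a minimal sketch with eight bins, echoing the eight regions Itzkovitz describes; the cells, positions and gene profiles here are simulated:

```python
# Zonation profile: average expression per radial bin across the lobule axis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n_cells = 20000
cells = pd.DataFrame({
    # 0 = lobule periphery (blood inflow), 1 = lobule center (central vein).
    "radial_position": rng.random(n_cells),
})
# Two simulated genes: one rising toward the center, one toward the periphery.
cells["pericentral_gene"] = rng.poisson(1 + 9 * cells["radial_position"] ** 4)
cells["periportal_gene"] = rng.poisson(1 + 6 * (1 - cells["radial_position"]))

cells["zone"] = pd.cut(cells["radial_position"], bins=8, labels=range(1, 9))
profile = cells.groupby("zone", observed=True)[["pericentral_gene", "periportal_gene"]].mean()
print(profile.round(2))  # mean expression per zone, periphery (1) to center (8)
```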

How human livers differ from animals

To enable comparison with other species, Itzkovitz’s laboratory also mapped healthy livers in mice, as well as in larger mammals—pigs and cows—whose metabolic rates and lobule sizes are similar to those of humans. In all mammals, blood flows through the lobule from the periphery to the center, supplying oxygen and nutrients to cells along the way. As a result, the periphery is characterized by an abundance of resources, while the center experiences relative scarcity.

In all the mammals studied except humans, these depleted conditions at the center of the lobule resulted in relatively lower cellular activity. In humans, however, the core of the lobule was found to carry out numerous functions, including synthesizing fats from excess energy, producing glucose from non-carbohydrate sources during fasting, filtering toxins and secreting bile to aid digestion.

Another striking difference between the human liver and those of other mammals concerns glucose storage. The liver functions as the body’s “fuel tank,” efficiently absorbing sugars during meals and releasing them in a controlled manner between meals. The study found that in humans, glucose uptake occurs mainly in the centers of the lobules, rather than at their periphery, unlike in mice.

The double‑edged nature of efficiency

“This division of labor is both a blessing and a curse,” Itzkovitz explains. “It allows the liver to store carbohydrates efficiently: Cells at the center of the lobule absorb and store glucose directly from the blood, while cells at the periphery convert lactate into glucose, further contributing to the energy reserves used during fasting. However, this efficient system was not designed for a modern diet rich in fats and carbohydrates, which may help explain why we tend to accumulate excess fat in the liver and develop liver fibrosis.”

To cope with cellular wear and tear and prevent disease, a unique turnover mechanism appears to have evolved in the center of the human liver lobule. “We found that in humans, unlike in other mammals, a particular type of immune cell prefers to reside in the core of the lobule rather than guarding its periphery—the entry point of blood into the tissue,” says Dr. Oran Yakubovsky of Itzkovitz’s lab, who led the study and is also a surgical resident at Sheba Medical Center.

“Kupffer cells are specialized scavenger cells that can offer protection against infections but also engulf, break down and recycle the remains of worn-out cells. We hypothesize that in humans they ‘relocated’ to the center to cope with the increased cellular attrition occurring there.”

Linking the atlas to liver disease

In the final part of the study, the scientists demonstrated how their new atlas can be used to trace disease development. They focused on fatty liver disease associated with metabolic dysfunction—a common condition, linked to obesity and diabetes, in which fat accumulates in the liver and may lead to inflammation and fibrosis.

Comparing healthy liver cells with those that had begun to accumulate fat revealed a protective response: Cells that started to “gain weight” switched off genes involved in fat production and uptake while activating genes associated with fat breakdown. However, the human liver has a limitation that reduces the efficiency of this process: Fat accumulation also leads to decreased production of certain components of the mitochondria, the organelles responsible for breaking down fats.

“Based on the precise mapping of the liver, it may become possible to develop treatments that will target the genes responsible for making specific regions particularly vulnerable to certain diseases,” says Itzkovitz. “Moreover, the approach of constructing a single-cell–resolution genetic atlas from healthy donor samples can be applied to other organs that have not yet been accurately mapped in humans. It could fundamentally change how we understand the structure and function of the human body.”

Blood test predicts kidney failure risk for Black Americans years before onset

A new blood test can identify which individuals of African ancestry carrying high-risk APOL1 gene variants are most likely to develop kidney failure, years before clinical disease becomes apparent. Findings on the new test, developed by a team from the Perelman School of Medicine at the University of Pennsylvania, are published in Nature Medicine.

“What has been missing is a way to identify early disease activity before we see changes in standard clinical measures,” said senior author Katalin Susztak, MD, Ph.D., a professor in Renal Electrolyte and Hypertension and director of the Penn/CHOP Kidney Innovation Center. “This approach allows us to intervene early enough and lessen the severity of, or even prevent, kidney disease in some patients.”

African Americans develop kidney failure at nearly four times the rate of those of European ancestry, driven in part by variants in the APOL1 gene. The APOL1 gene helps protect against certain infections, but some versions of it can also increase the risk of serious kidney disease.

An estimated 4–5 million people in the United States carry these high-risk variants. However, most will never develop kidney disease, and until now there has been no reliable way to determine who is truly at risk before kidney function begins to decline.

Substantial findings after analyzing blood samples

Researchers analyzed blood samples from more than 850 people of African ancestry enrolled in the Penn Medicine BioBank, all of whom carried APOL1 high-risk variants and had normal kidney function at the start of the study.

Using a small panel of circulating proteins measured from a routine blood draw, the team developed a risk score that predicts the likelihood of kidney failure, significant decline in kidney function, or death over the following ten years.

The differences between groups were substantial. More than 60% of individuals in the highest-risk category experienced kidney failure requiring dialysis or transplantation within ten years, compared to fewer than 1% in the lowest-risk group.

The proteins included in the score are linked to pathways involved in kidney injury and fibrosis, suggesting that the test captures early biological changes that precede measurable loss of kidney function.
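
As a rough analogue of building such a score, the sketch below standardizes a small protein panel, fits a logistic model for a ten-year outcome, and compares event rates across risk tiers. The panel size, weights and thresholds are invented for illustration and are not the published score:

```python
# Sketch: turn a small protein panel into a ten-year kidney-failure risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
n = 850
panel = rng.normal(0.0, 1.0, (n, 6))                 # 6 circulating proteins
true_w = np.array([1.2, 0.8, 0.6, 0.0, 0.0, -0.4])   # injury/fibrosis markers dominate
logit = panel @ true_w - 2.5
outcome = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # event within ten years

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(panel, outcome)
score = model.predict_proba(panel)[:, 1]

# Compare observed event rates in the top and bottom risk deciles.
decile = np.digitize(score, np.quantile(score, np.linspace(0.1, 0.9, 9)))
print(f"top decile event rate:    {outcome[decile == 9].mean():.0%}")
print(f"bottom decile event rate: {outcome[decile == 0].mean():.0%}")
```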

Two samples show similar results

The findings were validated in two independent cohorts in the United States and the United Kingdom. Across all populations studied, the risk score consistently outperformed existing clinical prediction tools.

The work is aligned with a growing body of research demonstrating that circulating protein markers can reflect underlying tissue-level injury and disease progression. Together, these approaches are helping to move risk assessment in kidney disease beyond traditional clinical measures toward more direct readouts of disease biology.

Researchers say the test could be incorporated into routine care to guide monitoring and treatment decisions, particularly as therapies targeting APOL1-associated disease continue to advance. Several such therapies are currently in development, including experimental drugs designed to block the harmful effects of high-risk APOL1 variants in the kidney, with the goal of slowing or preventing kidney damage.

“One of the challenges in developing new therapies has been identifying the right patients early enough,” Susztak said. “This provides a way to focus treatment on those most likely to benefit.”

The team is now working toward bringing the test into the clinic and evaluating how patients’ scores could support both individual patient care and the design of future clinical trials.

A molecular movie captures cancer’s great escape from targeted therapy

Cancer drugs are designed to shut tumors down. But sometimes, in the very act of attacking a tumor, treatment can also help a small fraction of cancer cells become harder to kill. A new study from researchers at the Institute for Systems Biology (ISB) shows that cancer cells may begin escaping therapy much earlier than previously thought, as the therapy itself triggers a stress response that drives some cancer cells into a temporary drug-tolerant state.

Published in Nature Communications, the study shows that melanoma cells exposed to BRAF-targeted therapy do not simply wait for resistance mutations to emerge.

Instead, they launch an early, coordinated survival program that pushes them into a temporary drug-tolerant state, allowing them to endure treatment long before permanent genetic resistance takes hold.

Using high-resolution, time-series multi-omics and computational modeling, the researchers reconstructed what they describe as a “molecular movie” of this transition—capturing the earliest events that occur within hours to days after treatment begins.

Rather than comparing cells only before treatment and after resistance had already emerged, the team tracked the escape process in real time, revealing that it follows an ordered sequence of events rather than a random drift into resistance.
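
One simple way to recover an ordered sequence from time-series expression data is to normalize each gene’s trajectory and sort genes by when they peak; genes whose peaks cluster early versus late form successive “waves.” A toy sketch of that idea follows; the study’s actual multi-omics modeling is far richer:

```python
# Order genes by peak time to expose sequential transcriptional waves.
import numpy as np

rng = np.random.default_rng(2)
timepoints_h = np.array([0, 6, 12, 24, 48, 72, 120])  # hours after drug exposure
n_genes = 500
# Simulate two waves: half the genes peak early (~12 h), half late (~72 h).
peak = np.where(np.arange(n_genes) < 250, 12, 72) + rng.normal(0, 6, n_genes)
expr = np.exp(-(((timepoints_h[None, :] - peak[:, None]) / 24.0) ** 2))
expr += rng.normal(0, 0.05, expr.shape)

# Normalize each gene's trajectory to 0-1, then find where it peaks.
expr_norm = (expr - expr.min(axis=1, keepdims=True)) / np.ptp(expr, axis=1, keepdims=True)
peak_hours = timepoints_h[expr_norm.argmax(axis=1)]  # peak timepoint per gene
wave_order = np.argsort(peak_hours)                  # row order for a "wave" heatmap
print(np.unique(peak_hours, return_counts=True))     # counts cluster early vs. late
print("first five earliest-peaking genes:", wave_order[:5])
```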

“We tend to think of drug resistance as something that happens later, after tumors evolve new mutations,” said Wei Wei, Ph.D., co-senior author of the study and associate professor at ISB.

“What we’re seeing here is that the escape process begins almost immediately. Cells actively reprogram themselves to survive the initial shock of therapy.”

A rapid identity shift—not just genetic resistance

The study focused on melanoma driven by mutations in the BRAF gene, a common target of precision therapies. While these drugs can produce strong initial responses, in many patients, the tumors eventually find a way back.

The researchers found that, in response to treatment, melanoma cells undergo a reversible shift away from their original, drug-sensitive identity into a more primitive, therapy-tolerant state. This transition is not random. It unfolds through two sequential “transcriptional waves” that progressively reorganize gene activity and cellular identity.

Even more striking, when the drug is removed, the cells do not simply retrace their steps. They return by a different route, retaining a form of “molecular memory” of prior treatment.

“This tells us that resistance isn’t just about which mutations a tumor has,” said ISB President and Professor Jim Heath, Ph.D., co-senior author of the study. “It’s also about the cell states that treatment itself pushes cancer cells into—and how those states shape future behavior.”

An early molecular trigger: Stress, NF-κB, and chromatin remodeling

At the center of this adaptive response is NF-κB, a well-known regulator of cellular stress and survival.

The study shows that NF-κB is not just a general marker of cellular distress. It acts as an early trigger that converts the shock of targeted therapy into a survival program.

Specifically, targeted therapy disrupts antioxidant defenses and leads to a buildup of reactive oxygen species (ROS). This oxidative stress activates NF-κB, which then drives widespread changes in gene regulation.

Once activated, NF-κB recruits epigenetic enzymes that modify chromatin—the packaging system that determines which parts of DNA are open for reading and which are closed off.

In effect, the stress response begins to rewrite which genetic instructions the cell can access. One key target is SOX10, a transcription factor essential for maintaining the melanocytic state. As those identity genes are shut down, melanoma cells shift into a drug-tolerant condition that allows them to persist under therapy.

Toward more durable cancer treatments

While the findings are preclinical, they point to a new therapeutic strategy: preventing cancer cells from entering this escape state in the first place.

Rather than waiting for resistance to emerge, the researchers suggest that combining targeted therapies with drugs that disrupt the epigenetic programs downstream of this stress response could help cut off the escape route at its earliest, still-reversible stage.

The team also found evidence that similar stress-driven pathways operate in other cancers, including lung and colon cancer, suggesting that this may represent a broader mechanism of therapy resistance.

More broadly, the work reframes cancer resistance not simply as a genetic problem, but as a dynamic cell-state problem—one in which treatment itself can create the stress conditions that help some tumor cells survive unless that early escape program is blocked.

“Resistance may begin not only when cancer cells acquire new mutations, but when treatment itself pushes surviving cells into a stronger, more evasive state,” Wei said.

“If we can intervene early—at the level of cell-state transitions—we may be able to extend the effectiveness of targeted therapies across multiple cancer types.”

Neuroinflammation triggers autism-like regression in mouse model

Autism spectrum disorder (ASD) is a neurodevelopmental condition estimated to affect approximately 1 in 100 children worldwide. This condition is characterized by differences in how people communicate and interact with others, as well as restricted interests and repetitive behaviors.

Some autistic people have been found to exhibit mutations in SHANK3, a gene that encodes a key protein contributing to the formation and maintenance of junctions between nerve cells (i.e., synapses). Past studies have shown that approximately 40% of autistic individuals with SHANK3 haploinsufficiency (i.e., having one functional copy of the gene instead of two) tend to lose previously acquired social, communication-related or motor skills over time.

Researchers at Yale University School of Medicine recently set out to explore the possible contribution of neuroinflammation to the behavioral regression observed in many cases of SHANK3-associated ASD. The findings of their study, which focused on a mouse model of SHANK3-related autism, are published in Molecular Psychiatry.

“We previously reported that significant behavioral regression in a small cohort of patients with SHANK3 haploinsufficiency, triggered by subclinical infections, responded to immunomodulator treatments,” wrote Sheng-Nan Qiao, Sung Eun Wang and their colleagues in their paper. “We hypothesize that behavioral regression results from the interplay between SHANK3 deficiency and neuroinflammation.”

Studying a mouse model of ASD

Qiao, Wang and their collaborators performed a series of experiments involving mice that exhibited a partial loss of the SHANK3 gene as well as regular mice. As part of these experiments, they injected the mice with lipopolysaccharides (LPS), a molecule that triggers immune responses.

The team then observed the mice’s behavior following LPS injection, comparing it to that of wild-type mice. Interestingly, they found that injecting LPS and triggering immune responses altered the behavior of SHANK3-deficient mice over time, with the animals exhibiting anxiety-like behaviors, repetitive actions and difficulties with movement.

“Using Shank3 exon 4–22 deletion heterozygous mutant (Sh3+/−) mouse, which shows no significant behavior impairments, we established a preclinical model—Shank3 haploinsufficiency mouse undergoing a systemic inflammation challenge via intraperitoneal injection of LPS,” wrote the authors.

“We found that, two weeks after LPS challenge, wild-type mice (WT) recovered but Sh3+/− mice exhibited motor impairment, anxiety-like behaviors, and excessive grooming, similar to Shank3 exon 4–22 deletion homozygous mutants. Anti-inflammatory treatment partially reversed LPS-induced behavioral changes.”

The researchers found that anti-inflammatory treatments partially reversed the behavioral changes observed in the mice after LPS injection, reducing repetitive grooming and anxiety-like behaviors and improving motor skills. These results suggest that neuroinflammation contributes to the behavioral regression observed in this SHANK3-deficient model of ASD, and thus could be a promising therapeutic target.

“Transcriptomic analysis revealed upregulation of neuroinflammation-related genes and downregulation of synaptic function-related genes in LPS-challenged Sh3+/− mice,” wrote Qiao, Wang and their colleagues. “Especially, pro-inflammatory genes and microglia markers were overly activated that may result from the increased toll-like receptor 4 (TLR4) in Sh3+/− mice. Microglia overactivation elevated synapse engulfment and disrupted synaptic protein may underlie LPS-triggered worsened behavior phenotypes in Sh3+/− mice.”

The link between genetic vulnerability and immune activation

This recent study could soon inspire other researchers to further explore the link between neuroinflammation and behavioral regression in ASD linked with SHANK3 haploinsufficiency.

If the team’s findings are validated in humans, they could also pave the way for new therapeutic strategies tailored for autistic people with this specific genetic vulnerability.

By reducing the immune system’s activation, these new treatments might prevent or reverse behavioral changes that interfere with the daily functioning of individuals with SHANK3 haploinsufficiency-related ASD.

“Together, our findings indicate that neuroinflammation increases the penetrance of behavioral impairment in Shank3 haploinsufficiency mice and supports a potential mechanism for behavioral regression in human SHANK3 related disorders for future investigations,” wrote the authors.