Nurses can deliver hospital care just as well as doctors, review finds

Nurses can safely deliver many services traditionally performed by doctors, with little to no difference in deaths, safety events, or how patients felt about their health, according to a new review published in the Cochrane Database of Systematic Reviews. In some cases, nurse-led care even outperformed doctor-led care.

Health care services are facing pressure due to an aging population, complex health needs, long waiting lists, and doctor shortages. Receiving care from nurses, rather than doctors, has been proposed as one way to improve access to hospital services for patients who may otherwise face long waits.

A group of researchers from Ireland, the United Kingdom, and Australia evaluated nurse-doctor substitution in inpatient units and outpatient clinics, analyzing 82 randomized studies involving over 28,000 patients across 20 countries.

Studies included advanced nurse practitioners, clinical nurse specialists and registered nurses substituting for junior or senior doctors across specialties such as cardiology, diabetes, cancer, obstetrics/gynecology, and rheumatology.

Nurse-led hospital care matches doctor-led care for safety and effectiveness

The review found little to no difference between nurse-led and doctor-led care for critical outcomes, including mortality, quality of life, self-efficacy, and patient safety events.

While most clinical outcomes showed no difference between groups, nurses may achieve better outcomes in some areas, including diabetes control, cancer follow-up, and dermatology. Doctor-led care performed slightly better in a small number of sexual health and medical abortion follow-up services.

“Our findings show that nurse-led services provide care that is just as safe and effective as doctor-led services for many patients,” said lead author Professor Michelle Butler from Dublin City University. “In some areas, patients actually experienced better outcomes when nurses led their care.”

The models of substitution varied widely, with different grades of nurses operating autonomously, under supervision, or following specialized protocols. There were also differences in training, level of responsibility, and mode of substitution, all of which may influence outcomes.

Butler added, “In some cases, patients had earlier, more frequent, or on-demand appointments with nurses, or had an additional educational component to their care, which may have helped to improve their outcomes.”

Evidence on direct costs was limited and varied across studies, partly due to differences in reporting methods, currencies and time periods. Seventeen studies reported reduced costs for nurse-led care, while nine suggested higher costs due to longer consultations, referrals, or prescription differences.

Not a one-size-fits-all solution

However, nurse-doctor substitution is not a one-size-fits-all approach. The authors caution that these interventions should always be interpreted within context.

“Nurse substitution isn’t simply a one-for-one replacement,” said Timothy Schultz, senior author and researcher from Flinders Health and Medical Research Institute.

“To work well, these services need the right training, support and models of care, but the evidence shows patients are not disadvantaged and can benefit in meaningful ways.”

Expanding nurse-led services may help address doctor shortages, but the authors urge policymakers to consider the impact of these interventions on the nursing workforce, including training and organization.

While the evidence base was substantial, the authors note important gaps. Most studies were from high-income countries, with the largest share (39%) conducted in the United Kingdom.

The authors call for more studies across specialties, nurse roles and patient types not yet evaluated, as well as stronger consistency in how outcomes are measured. They also highlight the need for more research in low- and middle-income countries, where nurse-led roles could potentially improve access to care in regions facing doctor shortages.

Study finds PM2.5-linked cardiovascular deaths fell 45% since 2001, disparities persist

Clean air laws have led to a significant reduction in long-term exposure to fine particulate air pollution across much of the United States over the past two decades, yet tens of thousands of Americans still die each year from cardiovascular disease linked to polluted air. A new study led by researchers at the Yale School of Public Health (YSPH) shows that air pollution-related cardiovascular deaths are increasingly concentrated among traditionally underserved communities and driven by specific chemical components rather than overall pollution levels.

The authors say their work offers insights for more precise and equitable air pollution control strategies that target not only how much pollution is in the air, but what it’s made of and who is most affected.

The research, published in the journal Science Advances, finds that cardiovascular deaths attributable to fine particulate matter, known as PM2.5, fell nearly 45% between 2001 and 2020. But the decline has been uneven across regions and racial and ethnic groups, and progress has stalled in recent years, the authors say. Crucially, the study shows that a handful of PM2.5 components account for most of the current health burden.

“Air quality regulations have worked, but they’ve worked unevenly,” said Dr. Kai Chen, Ph.D., associate professor of environmental health sciences at YSPH and the study’s senior author. “Our study shows that even at relatively low levels of overall PM2.5, specific components continue to drive cardiovascular mortality.”

PM2.5 is a complex mixture of fine particles produced by sources such as power plants, vehicles, agriculture, wildfires, and dust. While air quality standards focus on total PM2.5 concentrations, the Yale team analyzed six major components—sulfate, ammonium, black carbon, organic matter, nitrate, and soil dust—to determine which ones were most strongly associated with cardiovascular deaths across more than 3,100 U.S. counties.

Using two decades of nationwide mortality and pollution data, the researchers estimated that PM2.5 contributed to about 42,000 cardiovascular deaths in 2001, declining to roughly 23,500 deaths by 2020. Much of that improvement was driven by reductions in sulfate and ammonium, particles closely linked to coal-fired power plants and agricultural emissions.
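As a quick back-of-the-envelope check, the rounded estimates above imply a relative decline of about 44%, consistent with the "nearly 45%" figure reported in the study:

```python
# Back-of-the-envelope check using the rounded estimates quoted above
deaths_2001 = 42_000   # estimated PM2.5-attributable cardiovascular deaths, 2001
deaths_2020 = 23_500   # estimated PM2.5-attributable cardiovascular deaths, 2020

decline = (deaths_2001 - deaths_2020) / deaths_2001
print(f"{decline:.1%}")  # 44.0%, i.e. "nearly 45%"
```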

“Sulfate and ammonium accounted for nearly three-quarters of the decline in PM2.5-attributable cardiovascular deaths over the study period,” said Dr. Ying Hu, Ph.D., the study’s lead author and a postdoctoral researcher in environmental health sciences at YSPH. “But as those components declined, others, such as black carbon, became increasingly important contributors.”

By 2020, black carbon, a component of soot produced by traffic, diesel engines, and other combustion, had emerged as the largest contributor to PM2.5-related cardiovascular mortality nationwide, according to the study.

Racial disparities widen

The study also identified persistent and widening racial and ethnic disparities. Although PM2.5-related cardiovascular death rates declined for all groups, non-Hispanic Black and Hispanic populations experienced slower improvements than non-Hispanic white populations. Differences in population growth also contributed to this trend: non-Hispanic Black and Hispanic populations grew faster than the non-Hispanic white population, further widening disparities in PM2.5-related deaths.

The researchers also examined how the contribution of individual PM2.5 components to cardiovascular deaths differed across racial and ethnic groups. Compared with non-Hispanic whites, non-Hispanic Blacks were disproportionately affected by black carbon and sulfate, while Hispanic populations faced higher burdens from black carbon, dust, and organic aerosols, which are compounds released into the air by wildfires, fossil fuel consumption, soil disturbance, and other activities.

“These disparities reflect decades of structural and environmental inequities,” Dr. Chen said. “Communities of color are more likely to live near highways, industrial facilities, and other pollution sources, resulting in disproportionately higher exposure to air pollution. What’s more, they also experience systemic disparities in health care access and endure higher baseline cardiovascular risk factors, contributing to the higher burden of PM2.5-related cardiovascular deaths.”

The researchers hope their findings lead to more targeted air pollution control policies that focus on the impact of individual chemical components rather than overall pollution levels, especially as gains in U.S. air quality have slowed in recent years.

“If we want to keep reducing cardiovascular deaths and close racial and regional gaps, we need targeted strategies,” Dr. Chen said. “That means addressing PM2.5 components such as sulfate and black carbon from fossil fuel combustion and ammonium from agricultural emissions—not just lowering the average PM2.5 concentration.”

Robotic medical crash cart eases workload for health care teams

Health care workers have an intense workload and often experience mental distress during resuscitation and other critical care procedures. Although researchers have studied whether robots can support human teams in other high-stakes, high-risk settings such as disaster response and military operations, the role of robots in emergency medicine has not been explored.

Enter Angelique Taylor, the Andrew H. and Ann R. Tisch Assistant Professor at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science. She is also an assistant professor in emergency medicine at Weill Cornell Medicine and director of the Artificial Intelligence and Robotics Lab (AIRLab) at Cornell Tech.

In a pair of articles presented at the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Robot and Human Interactive Communication (RO-MAN) in August 2025, Taylor and her collaborators at Weill Cornell Medicine, associate professor Kevin Ching and assistant professor Jonathan St. George, described research on their new robotic crash cart (RCC), a robotic version of the mobile drawer unit that holds supplies and equipment needed for a range of medical procedures.

“Health care workers may not know or may forget where all the various supplies are located in the cart drawers, and often they’re kind of shuffling through the cart,” Taylor said. This can cause delays during emergency procedures that require iterative tasks with precise timing, increasing the likelihood of medical errors and putting patients at risk, she noted.

To create the RCC, Taylor and her team outfitted a standard cart with LED light strips, a speaker, and a touchscreen tablet integrated with the Robot Operating System. This middleware connects computer programs to robot hardware, enabling them to work together to provide users with verbal and nonverbal cues.

During an emergency procedure, a user can request the location of a supply on the tablet. The lights around the drawer containing that supply then blink, or a spoken instruction plays through the speaker. Users can also receive prompts that remind them about necessary medications and recommend supplies.
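To make the architecture concrete, here is a minimal sketch of how such cue dispatching might look as a ROS node. The topic names, message contents, and supply-to-drawer map are illustrative assumptions, not the actual RCC implementation:

```python
#!/usr/bin/env python
# Hypothetical sketch of cue dispatch for a robotic crash cart (ROS 1 / rospy).
# Topic names, message contents, and the supply-to-drawer map are assumptions.
import rospy
from std_msgs.msg import String

# Illustrative mapping from requested supply to the drawer that holds it
DRAWER_MAP = {"epinephrine": 1, "bag_valve_mask": 3, "iv_kit": 4}

class CueDispatcher:
    def __init__(self):
        # Nonverbal cue: blink the LED strip around a drawer
        self.led_pub = rospy.Publisher("/rcc/led_blink", String, queue_size=10)
        # Verbal cue: speak an instruction through the cart's speaker
        self.speech_pub = rospy.Publisher("/rcc/speech", String, queue_size=10)
        # Requests arrive from the touchscreen tablet interface
        rospy.Subscriber("/rcc/supply_request", String, self.on_request)

    def on_request(self, msg):
        supply = msg.data
        drawer = DRAWER_MAP.get(supply)
        if drawer is None:
            self.speech_pub.publish(String(data=f"{supply} not found in cart"))
            return
        # Publish both cue modalities (the studies varied which cue served which task)
        self.led_pub.publish(String(data=f"drawer_{drawer}"))
        self.speech_pub.publish(String(data=f"{supply} is in drawer {drawer}"))

if __name__ == "__main__":
    rospy.init_node("rcc_cue_dispatcher")
    CueDispatcher()
    rospy.spin()
```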

In their article, “Help or Hindrance: Understanding the Impact of Robot Communication in Action Teams,” Taylor’s team conducted pilot studies of the RCC. One pilot involved 84 participants, aged 21 to 79, about half of whom had a clinical background. Working in groups of 3 to 4, they conducted a series of simulated resuscitation procedures with a manikin patient using three different carts: an RCC with blinking lights for object search and spoken task reminders, an RCC with blinking lights for task reminders and spoken language for object search, or a standard cart.

The team found that participants preferred the RCCs, which provided verbal and nonverbal cues, over the standard cart, which provided none, rating the robotic carts lower in workload and higher in usefulness and ease of use.

“These results were exciting and achieved statistical significance, suggesting that the use of a robot is beneficial,” said Taylor. The article, by Taylor, Ph.D. student Tauhid Tanjim, and colleagues at Weill Cornell, was a Kazuo Tanie Paper Award finalist, an honor given to the top three papers in their category at the conference.

In the second article, “Human-Robot Teaming Field Deployments: A Comparison Between Verbal and Non-verbal Communication,” the research team began testing the RCC under more realistic conditions. Participants were health care workers from across the United States, and actors played frantic family members during the simulations.

Similar to the pilot studies, Taylor, along with colleagues at Cornell and Michigan State University, found that the RCC reduced participant workload, with the effect depending on whether the robot provided verbal or nonverbal cues. However, the robots in this study were evaluated with only one type of cue, not both, and the team identified room for improvement, particularly in the robot’s visual cues. They are now studying health care workers’ impressions of an RCC with multimodal communication.

Taylor hopes that other research teams will start exploring how robots can support health care teams in critical care settings. To that end, Taylor and her colleague presented an article at the February 2025 Association for Computing Machinery/IEEE International Conference that offers a toolkit for researchers to build their own RCC.

The papers are all published on the arXiv pre-print server.

Specific brain signals rapidly eliminate body fat in mice

Researchers at WashU Medicine have identified a potent pathway that begins in the brain and leads to loss of all body fat without reducing food intake. The study is reported in Nature Metabolism.

The team—led by senior author Erica L. Scheller, DDS, Ph.D., an associate professor in the Division of Bone and Mineral Diseases in the Department of Medicine; Xiao Zhang, Ph.D., a former graduate student in Scheller’s lab who is now a postdoctoral fellow at the University of Pennsylvania School of Medicine; and Sree Panicker, a graduate student in Scheller’s lab—was inspired by a unique population of fat cells located deep within the skeleton.

“About 70% of our bone marrow is filled with fat that doesn’t respond to diet or exercise,” said Scheller. “We wanted to figure out why.”

The team found that these special cells, called constitutive bone marrow adipocytes, expressed high levels of proteins that inhibit fat breakdown. This causes resistance to fat loss in day-to-day settings. “We call these cells stable adipocytes,” said Zhang, the study’s first author. In mice, sustained injection of leptin, a hormone, into the brain was able to unlock the stable adipocytes by putting the body into a state of low glucose and insulin. This reduced the inhibitors of fat breakdown, causing complete loss of body fat within days, even though the mice were still eating normally.

This pathway is so powerful that scientists caution against using it in humans until it is better understood. Stable adipocytes occur in places like the bone marrow, in the hands and feet, and around important glands. In severe wasting disorders, loss of fat within these cells is associated with bone fractures and reduced quality of life.

Scheller’s team hopes to prevent this loss and preserve health in patients with severe wasting disorders by defining the mechanisms of stable fat loss. Conversely, methods to activate fat loss from stubborn adipocytes may support future treatments for obesity.

Why working out may not help you lose weight

According to conventional wisdom, a great way to lose weight is to do some exercise. While being active is beneficial in many ways for our health, it may not be very helpful if you want to shed a few inches off your waistline. And now, a new study published in Current Biology offers an explanation for why.

For decades, scientists have used a simple mathematical formula to calculate how much energy we use, which is essentially: Total Burn = Living Cost + Exercise. This is known as the Additive Model, and it means that every calorie you burn during a workout is simply added to the calories you use just to stay alive.

So, for example, if you burn 2,000 calories a day during your normal activities and then go for a run and burn 400 calories, you’ll burn 2,400 calories in a day according to the formula. The thought has been that this extra burn could lead to weight loss.

However, in recent years, another model has emerged called the Constrained Model. It says that our bodies have a limit on how much energy they spend. So if we burn more calories through exercise, the body reduces internal tasks, such as cell repair, to keep total energy expenditure within a narrow, predictable range. It means that the extra calories you thought you were burning at the gym are partially offset.

Comparing the models

Two researchers from Duke University in the U.S., Herman Pontzer and Eric T. Trexler, decided to compare these two ideas to see which one is supported by the data.

They analyzed 14 different studies involving 450 people who participated in exercise programs, as well as several animal studies. By comparing the energy these subjects were expected to burn with the energy they actually burned, the scientists calculated how much the body was compensating. They also compared data from different populations.

Their results suggested that the Additive Model often overestimates how much total daily energy expenditure rises with exercise. Instead, they found that as people and animals become more active, they may compensate by reducing energy spent on other processes or activities.

Realities of burning calories

On average, about 72% of the calories burned during exercise are added to the total daily burn, while the remaining 28% may be offset through compensation. The offset is only partial: exercise still increases total energy use, just by less than a simple additive calculation would predict. The researchers also noted that the 28% figure is an average and varies widely between individuals.
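For illustration, here is a minimal sketch contrasting the two models, using the review’s average figure (an assumption for the example; individual compensation varies widely):

```python
# Contrast of the Additive and Constrained models of daily energy expenditure.
# The 0.72 factor is the review's reported average; individual values vary widely.

BASELINE_KCAL = 2000      # calories burned in a day without the workout
EXERCISE_KCAL = 400       # calories burned during the workout itself
REALIZED_FRACTION = 0.72  # average share of exercise calories added to the total

def additive_total(baseline, exercise):
    """Additive Model: every exercise calorie adds to the daily total."""
    return baseline + exercise

def constrained_total(baseline, exercise, realized=REALIZED_FRACTION):
    """Constrained Model: the body offsets part of the exercise burn by
    spending less energy on other internal processes."""
    return baseline + realized * exercise

print(additive_total(BASELINE_KCAL, EXERCISE_KCAL))     # 2400 kcal
print(constrained_total(BASELINE_KCAL, EXERCISE_KCAL))  # 2288 kcal
```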

“Humans and other animals respond to increased physical activity by reducing energy expenditure on other tasks, supporting a constrained model of energy expenditure,” commented the researchers in their paper.

These findings may explain why exercise alone often leads to less weight loss than expected, and why diet plays such a key role.

Lucid dreaming could be used for mental health therapy, new study says

Lucid dreaming (LD) is one of the most fascinating parts of human consciousness: a state in which you realize you are dreaming while still asleep and, in some situations, can decide what happens next. Scientific interest in lucid dreaming is growing, but research is often scattered across different fields, and long-term evidence of how it affects our health is lacking. So, a group of researchers conducted a massive review of existing studies to pull the evidence together, finding that this state of mind could help treat mental health issues like chronic nightmares and post-traumatic stress disorder (PTSD).

Understanding lucid dreaming

The team analyzed 38 peer-reviewed studies involving both healthy adults and those with conditions such as PTSD or Parkinson’s disease. They included only research that demonstrated lucid dreaming with objective data, such as specific eye-movement signals or brainwave patterns measured by an EEG (electroencephalogram).

Their review is published in the journal Annals of Medicine & Surgery.

When people enter a lucid dream, certain areas of the brain, like the prefrontal cortex (PFC), become more active. This region is associated with several key functions, including planning and decision-making, impulse control, working memory, and focus. During regular dreaming, it is usually much less active.

Some of the studies reviewed by the team also show increased gamma-band activity (around 40 Hz) in the frontal regions. These fast brainwaves are linked to higher-level thinking and help dreamers realize they are dreaming.
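As an illustration of what measuring gamma-band activity involves, here is a minimal sketch of estimating power around 40 Hz in a single EEG channel with SciPy. The sampling rate, band edges, and synthetic signal are assumptions for the example, not the reviewed studies’ pipelines:

```python
# Minimal sketch: estimate ~40 Hz gamma-band power in one EEG channel.
# Sampling rate, band edges, and the synthetic signal are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 250                   # sampling rate in Hz (assumed)
GAMMA_BAND = (35.0, 45.0)  # band around 40 Hz (assumed edges)

rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / FS)  # 30 seconds of data
# Synthetic stand-in for a frontal EEG trace: noise plus a weak 40 Hz component
eeg = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 40 * t)

freqs, psd = welch(eeg, fs=FS, nperseg=2 * FS)   # power spectral density
mask = (freqs >= GAMMA_BAND[0]) & (freqs <= GAMMA_BAND[1])
gamma_power = np.trapz(psd[mask], freqs[mask])   # integrate PSD over the band
print(f"Gamma-band power: {gamma_power:.4f}")
```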

This awareness gives them a sense of control, which the researchers believe means that lucid dreaming could be used as a treatment for nightmares and PTSD.

Healing through dream control

For example, the researchers suggest that because lucid dreamers can confront and change the content of their dreams, treatments could be designed to help those with PTSD break the cycle of reliving traumatic memories. In other words, patients could change or reframe a scary dream into a harmless one.

“Although evidence remains preliminary, LD shows promise as a therapeutic remedy for PTSD and anxiety symptoms, including a reduction in nightmares,” wrote the researchers in their paper. “It combines neuroscience and self-agency, highlighting the need for more funding and public awareness campaigns to harness its scientific and clinical prospects.”

While the study authors caution that their findings are still in the early stages, they believe future studies could explore using wearable technology to help people induce lucid dreaming on their own at home, away from a clinical setting.

Structural differences found in brains of people with panic disorder

Panic disorder (PD) is a mental health condition characterized by recurring panic attacks: episodes of intense fear and anxiety accompanied by physical symptoms such as a racing heart, shortness of breath, dizziness, and blurred vision. Estimates suggest that approximately 2–3% of people worldwide experience PD at some point during their lives.

Better understanding the neural underpinnings and features of PD could have important implications for its future treatment. So far, however, most neuroscientific studies have examined the brains of relatively small groups of individuals diagnosed with the disorder, which poses questions about whether the patterns they observed are representative of PD at large.

Researchers at the Amsterdam University Medical Center, Leiden University and many other institutes worldwide recently carried out a study shedding new light on the neuroanatomical signatures of PD, via the analysis of a large pool of brain scans collected from people diagnosed with the disorder and from others with no known psychiatric diagnoses. Their paper, published in Molecular Psychiatry, identifies subtle but consistent differences in the brains of individuals with PD, such as a slightly thinner cortex and frontal, temporal and parietal brain regions that are smaller than those of people with no known mental health disorders.

“Neuroanatomical examinations of PD typically involve small study samples, rendering findings that are inconsistent and difficult to replicate,” Laura K. M. Han and Moji Aghajani, first and senior authors of the paper, respectively, told Medical Xpress.

“This motivated the ENIGMA Anxiety Working Group to collate data worldwide using standardized methods, to conduct a pooled mega-analysis. The main goal was to provide the most reliable test to date of whether PD is associated with robust neuroanatomical alterations, and whether these differences may vary by age or clinical features (e.g., age of onset, medication use, severity).”

Uncovering the brain signatures of panic disorder

As part of their study, Han, Aghajani and their colleagues analyzed brain scans collected by different research teams worldwide from a total of almost 5,000 people between 10 and 66 years old, including 1,100 individuals with PD and 3,800 healthy control subjects. The brain scans were collected using magnetic resonance imaging (MRI), an imaging technique commonly used by both scientists and doctors to study and diagnose various diseases.

“Using harmonized ENIGMA protocols and the FreeSurfer brain segmentation software, we measured cortical thickness, cortical surface area, and subcortical volumes,” explained Han and Aghajani. “Statistical mixed-effects models compared PD and healthy controls on these brain metrics, while accounting for individual variations in age, sex, and scanning site.”
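For readers curious about the statistical setup, here is a minimal sketch of the kind of mixed-effects group comparison described, using statsmodels; the column names and data file are assumptions for illustration, not the ENIGMA pipeline itself:

```python
# Minimal sketch of a mixed-effects comparison on one brain metric.
# Column names and the data file are illustrative assumptions; the actual
# analysis uses harmonized ENIGMA protocols and FreeSurfer-derived measures.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: cortical_thickness (mm), diagnosis (0=control, 1=PD),
# age (years), sex, and site (the scanning site, modeled as a random effect)
df = pd.read_csv("enigma_pd_example.csv")  # hypothetical file

model = smf.mixedlm(
    "cortical_thickness ~ diagnosis + age + sex",  # fixed effects
    data=df,
    groups=df["site"],                             # random intercept per site
)
result = model.fit()
# The 'diagnosis' coefficient estimates the PD-vs-control difference
print(result.summary())
```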

The team’s analyses allowed them to pinpoint subtle but consistent differences between the brains of people with PD and those of people with no known psychiatric or mental health disorders. The researchers found that people with PD had a slightly thinner cortex and that some parts of their brain had a smaller surface area or a reduced volume.

“We identified subtle but consistent reductions in cortical thickness and surface area in frontal, temporal, and parietal regions, along with smaller subcortical volumes in the thalamus and caudate, among individuals with PD,” said Han and Aghajani.

“Among other things, these regions govern how emotionally salient information is perceived, processed, modulated, and responded to. The analyses also showed that some differences are age-dependent and that early-onset PD (before age 21) is linked to larger lateral ventricles.”

Paving the way for the detailed mapping of psychiatric disorders

Overall, the findings of this study appear to confirm existing models of PD suggesting that the disorder is linked to disruptions in brain regions associated with the processing and regulation of emotions. In the future, they could inspire further research that closely examines some of the newly uncovered neuroanatomical signatures of PD, perhaps also looking at how they change at different stages of development or when patients respond well to specific treatments and psychotherapeutic interventions.

The team’s mega-analysis also highlights the value of examining large amounts of data, showing that this can contribute to the detection of subtle neuroanatomical changes or differences that might be hard to uncover in smaller samples. A similar approach has also been used to study the neuroanatomy of other neuropsychiatric disorders, such as generalized anxiety disorder (GAD), depression, obsessive compulsive disorder (OCD), bipolar disorder (BP), schizophrenia or substance use disorders (SUDs).

“Future studies could track individuals with PD longitudinally to clarify developmental and aging effects, integrate genetics and environmental risk factors, and combine structural imaging with functional and connectional brain examinations,” added Han and Aghajani. “The results also motivate transdiagnostic comparisons across anxiety disorders and efforts to link brain differences to prognosis, treatment response, or prevention strategies, rather than diagnosis alone.”

Point-of-care hepatitis B DNA testing proves as accurate as lab tests

A clinical trial led by the Kirby Institute at UNSW Sydney has found that point-of-care testing for hepatitis B DNA is as effective as traditional laboratory testing, paving the way for faster diagnosis and treatment in hard-to-reach communities. The results have been published in the Journal of Clinical Microbiology.

“The results of our trial found that the fingerstick point-of-care test is highly accurate, closely matching the accuracy of traditional tests,” explains Professor Gail Matthews, who led the research at the Kirby Institute. “This is a very important finding because it has the potential to expand access to testing and treatment globally, and especially in resource-limited settings or remote areas, where current testing access is poor.”

Hepatitis B is a viral infection that attacks the liver, causing inflammation and, over time, serious complications such as cirrhosis, liver failure, and liver cancer. It is responsible for over 1 million deaths per year, but is preventable by vaccination, and effective treatment is available for chronic infection.

While most high-income countries have strong vaccination programs and reasonable access to care, the majority of people with chronic hepatitis B live in low- and middle-income countries, where access to testing and treatment is limited. Even in Australia, hepatitis B DNA testing is more difficult to access for people living in remote areas.

“Not everyone who has hep B needs treatment,” explains Associate Professor Behzad Hajarizadeh, who is first author on the paper. “People with higher levels of the virus are more likely to benefit from treatment, so DNA tests are required to determine the levels of virus in the system. DNA testing is also used once a patient starts treatment, to help understand if the treatment is working.”

Currently, hepatitis B DNA testing, for both diagnosis and monitoring, requires a venous blood sample to be processed in a centralized laboratory, meaning patients may need to travel long distances to take the test and then often wait days or weeks for results. This delay, and the multiple clinic visits involved, can hinder timely treatment and care.

By contrast, a point-of-care test can be done in small health clinics using a fingerstick blood sample, can be performed by a broader range of health care workers, and provides a result within 60 minutes. It is an effective alternative to laboratory testing for many infectious diseases, including hepatitis C, but until now, its efficacy for hepatitis B DNA using fingerstick blood has been unknown.

“Our research demonstrates that point-of-care testing for hepatitis B DNA using fingerstick blood is, indeed, highly accurate and effective. Given the technology is already in use for a range of other infectious diseases globally, our evidence paves the way for integrating infectious disease care, significantly enhancing access to hepatitis B testing, monitoring and treatment, no matter where someone lives,” says the Kirby Institute’s Associate Professor Tanya Applegate.

The most recent World Health Organization figures (2022) estimate that 254 million people are living with chronic hepatitis B infection worldwide, yet only 14% have been diagnosed and just 8% are receiving treatment, representing a major global health challenge. No country is currently estimated to be on track to meet the WHO target of eliminating hepatitis B as a public health threat by 2030. As part of a push to increase testing and treatment, the most recent WHO guidelines include a new recommendation supporting the use of hepatitis B point-of-care DNA fingerstick tests globally. Data from this study support that recommendation.

“Access to testing is a major barrier to progress on hepatitis B elimination,” says Associate Professor Thomas Tu from Hepatitis B Voices. “We are hopeful that this research will support the roll-out of point-of-care testing for hepatitis B, enhancing access and ultimately, improving health and saving lives.”

i-DNA ‘peek-a-boo structures’ form in living cells and regulate genes linked to cancer

DNA’s iconic double helix does more than “just” store genetic information. Under certain conditions, it can temporarily fold into unusual shapes. Researchers at Umeå University, Sweden, have now shown that one such structure, known as i-DNA, not only forms in living cells but also acts as a regulatory bottleneck linked to cancer.

“You can think of i-DNA as a kind of ‘peek-a-boo structure’ in the DNA molecule. Its formation is tightly controlled in time and it must be resolved at precisely the right moment. We believe it plays an important role in gene regulation, because these structures can appear and disappear in sync with changes in the cell’s state,” says first author Pallabi Sengupta, postdoctoral researcher at the Department of Medical Biochemistry and Biophysics at Umeå University. The study is now published in Nature Communications.

A highly unusual DNA structure

The familiar double helix can be imagined as a twisted ladder with sugar-phosphate backbones as side rails and base pairs—adenine (A) paired with thymine (T), and cytosine (C) paired with guanine (G)—forming the rungs.

i-DNA, however, bears little resemblance to this shape. Instead, it is more like a distorted, self-folded ladder tied into a knot. It consists of a single DNA strand folding back on itself to form a four-stranded structure. At the molecular level, the structure is held together not by standard A–T and C–G base pairs, but by pairs of cytosines.

These rare, short-lived structures appear and disappear depending on the cellular environment. For decades, they were dismissed as too unstable to exist inside cells and regarded as laboratory artifacts. With new experimental techniques, researchers in Umeå can now demonstrate that i-DNA does form, but only briefly, just before DNA replication begins.
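To make the “C-rich” requirement concrete, here is a minimal sketch of scanning a DNA sequence for putative i-DNA-forming regions, using a common heuristic (four tracts of three or more cytosines separated by short loops). The pattern is a simplification for illustration, not the detection method used in the study:

```python
# Minimal sketch: scan a DNA sequence for putative i-DNA (i-motif) regions.
# Heuristic: four tracts of >=3 cytosines separated by loops of 1-12 nucleotides.
# This is an illustrative simplification, not the study's detection method.
import re

I_MOTIF_PATTERN = re.compile(r"C{3,}(?:[ACGT]{1,12}C{3,}){3}")

def find_putative_i_motifs(sequence: str):
    """Return (start, end, matched subsequence) for each candidate region."""
    sequence = sequence.upper()
    return [(m.start(), m.end(), m.group()) for m in I_MOTIF_PATTERN.finditer(sequence)]

# Example: a C-rich stretch like those found in oncogene regulatory regions
seq = "ATGCCCCTACCCCTTCCCCAATCCCCGGA"
for start, end, match in find_putative_i_motifs(seq):
    print(start, end, match)
```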

Key protein controls structure resolution

The study further shows that the protein PCBP1 acts as a critical regulator. It unwinds i-DNA at the right moment, allowing the DNA replication machinery to proceed. If the structures fail to open in time, they block replication, increasing the risk of DNA damage—a hallmark of heightened cancer vulnerability.

The researchers also discovered that i-DNA is not uniform: some structures are easy to unwind, while others are highly resistant, depending on the underlying DNA sequence.

“The more cytosine base pairs that hold the knot together, the harder it is to resolve. In some cases, hybrid structures can form, making i-DNA even more stable,” explains Nasim Sabouri, professor at the Department of Medical Biochemistry and Biophysics at Umeå University, who led the study.

Notably, many i-DNA structures are located in regulatory regions of oncogenes—genes that drive cancer development—suggesting a direct link between i-DNA and disease.

To study these short-lived structures, the team combined biochemical assays, computational modeling and cell biology. They successfully visualized how PCBP1 progressively opens i-DNA and captured the structures in living cells at the exact moment in the cell cycle when they appear.

“By connecting molecular mechanisms to actual effects in cells, we can show that this is biologically relevant and not merely a laboratory phenomenon,” says Ikenna Obi, staff scientist at the Department of Medical Biochemistry and Biophysics at Umeå University.

New opportunities for drug development

The discovery reframes i-DNA from a molecular oddity to a potential weakness in cancer cells. Because cancer cells often experience high replication stress, dividing so rapidly that their DNA replication machinery approaches breakdown, any disruption in i-DNA handling may have severe consequences.

“If we can influence i-DNA or the protein that unwinds it, we may be able to push cancer cells beyond their tolerance limit. This opens completely new avenues for drug development,” says Nasim Sabouri.

The study was conducted in collaboration with Natacha Gillet, researcher at the Centre National de la Recherche Scientifique (CNRS) in France.

Surgical innovation may cut ovarian cancer risk by nearly 80%

A prevention strategy developed by Canadian researchers can reduce the risk of the most common and deadly form of ovarian cancer by nearly 80%, according to a new study published today in JAMA Network Open by researchers at the University of British Columbia (UBC).

The strategy, known as opportunistic salpingectomy (OS), involves proactively removing a person’s fallopian tubes when they are already undergoing a routine gynecological surgery such as hysterectomy or tubal ligation, commonly called “having one’s tubes tied.”

The Canadian province of British Columbia (B.C.) became the first jurisdiction in the world to offer OS in 2010, after a team of researchers from UBC, BC Cancer and Vancouver Coastal Health designed the approach following the discovery that most ovarian cancers originate in the fallopian tubes rather than the ovaries. OS leaves a person’s ovaries intact, preserving important hormone production, so there are minimal side effects from the added procedure.

The new study, led by a B.C.-based international collaboration called the Ovarian Cancer Observatory, provides the clearest evidence yet that the Canadian innovation saves lives.

“This study clearly demonstrates that removing the fallopian tubes as an add-on during routine surgery can help prevent the most lethal type of ovarian cancer,” said co-senior author Dr. Gillian Hanley, an associate professor of obstetrics and gynecology at UBC. “It shows how this relatively simple change in surgical practice can have a profound and life-saving impact.”

New hope against a deadly cancer

Ovarian cancer is the most lethal gynecological cancer. Approximately 3,100 Canadians are diagnosed with the disease each year and about 2,000 will die from it.

There is currently no reliable screening test for ovarian cancer, meaning that most cases are diagnosed at advanced stages when treatment options are limited and survival rates are low.

The OS approach was initially developed and named by Dr. Dianne Miller, an associate professor emerita at UBC and gynecologic oncologist with Vancouver Coastal Health and BC Cancer. She co-founded B.C.’s multidisciplinary ovarian cancer research team, OVCARE.

“If there is one thing better than curing cancer, it’s never getting the cancer in the first place,” said Dr. Miller.

The new study is the first to quantify how much OS reduces the risk of serous ovarian cancer—the most common and deadly subtype of the disease. It builds on previous research demonstrating that OS is safe, does not reduce the age of menopause onset, and is cost-effective for health systems.

The study analyzed population-based health data for more than 85,000 people who underwent gynecological surgeries in B.C. between 2008 and 2020. The researchers compared rates of serous ovarian cancer between those who had OS and those who had similar surgeries but did not undergo the procedure.

Overall, people who had OS were 78% less likely to develop serous ovarian cancer. In the rare cases where ovarian cancer occurred after OS, those cancers were found to be less biologically aggressive. The findings were validated by data collected from pathology laboratories around the world, which suggested a similar effect.

From B.C. innovation to global impact

Since its introduction in B.C. in 2010, OS has been widely adopted, with approximately 80% of hysterectomies and tubal ligation procedures in the province now including fallopian tube removal.

Globally, professional medical organizations in 24 countries now recommend OS as an ovarian cancer prevention strategy, including the Society of Obstetricians and Gynaecologists of Canada, which issued guidance in 2015.

“This is the culmination of more than a decade of work that started here in B.C.,” said co-senior author Dr. David Huntsman, professor of pathology and laboratory medicine and obstetrics and gynecology at UBC and a distinguished scientist at BC Cancer. “The impact of OS that we report is even greater than we expected.”

The researchers say expanding global adoption of OS could prevent thousands of ovarian cancer cases worldwide each year.

Extending OS to other abdominal and pelvic surgeries where appropriate could further increase the number of people who could benefit from the prevention strategy. B.C. recently became the first province to expand OS to routine surgeries performed by general and urologic surgeons through a project supported by the Government of B.C. and Doctors of BC.

“Our hope is that more clinicians will adopt this proven approach, which has the potential to save countless lives,” said Dr. Huntsman. “Not offering this surgical add-on may leave patients unnecessarily vulnerable to this cancer.”