Compact CRISPR system unlocks targeted in-body gene editing, with up to 90% efficiency

A research team has discovered a compact CRISPR gene-editing system that could enable targeted delivery inside the human body—a key step toward broader clinical use. Researchers identified a naturally occurring enzyme, Al3Cas12f, that is small enough to fit into adeno-associated virus vectors, a leading targeted delivery method for gene therapies. They then engineered an enhanced version that dramatically improved gene-editing performance in human cells.

The advance addresses a major limitation in CRISPR technology. Commonly used gene-editing proteins are too large for targeted delivery systems, restricting clinical applications to cells modified outside the body, such as blood and bone marrow.

“Smart delivery of gene editing systems is a powerful notion with broad clinical implications, and this basic science finding takes us a significant step toward that future,” said Erica Brown, Ph.D., acting director of NIH’s National Institute of General Medical Sciences (NIGMS).

Using imaging and machine learning tools, researchers at the University of Texas at Austin analyzed the enzyme’s structure. They found it forms a more stable and tightly connected complex than other enzymes of a similar size, allowing it to function more effectively in human cells.

“The expanded interface means the enzyme is much more stable. Compared to the others we looked at, Al3Cas12f basically comes preassembled and ready to go shortly after its pieces are produced,” said David Taylor, Ph.D., a molecular bioscience professor at UT Austin and corresponding author of the paper published in Nature Structural & Molecular Biology.

The team then engineered a variant, known as Al3Cas12f RKK, which improved editing efficiency from less than 10% to more than 80% across tested targets. In a commonly edited region of the genome, efficiency reached 90%.

Of the many variants the team produced, Al3Cas12f RKK stood above the rest. The team introduced instructions for RKK directly into a line of human cells originally isolated from a patient with leukemia. Mutations in several of the genes they aimed to edit were associated with diseases such as cancer, atherosclerosis, and amyotrophic lateral sclerosis (ALS).

The authors expect to build on their encouraging results. They next plan to conduct tests of the nuclease’s performance when packaged into AAV vectors, which, if successful, could bring gene editing therapy for many diseases much closer to reality.

Designing better membrane proteins by embracing imperfection

Scientists at the VIB–VUB Center for Structural Biology have uncovered a counterintuitive principle that could reshape how membrane proteins are designed from scratch: Sometimes, making a protein less stable helps it fold correctly. In their study published in the Proceedings of the National Academy of Sciences, the researchers demonstrate that introducing carefully placed “imperfections,” a strategy known as negative design, enables synthetic membrane proteins to fold and assemble efficiently in artificial membranes.

Membrane protein stability

Membrane proteins are essential for life and biotechnology, acting as gateways, sensors, and drug targets. Yet designing them from scratch remains notoriously difficult. Unlike soluble proteins, they must navigate a complex folding process while inserting into lipid membranes, and many designs fail during this step.

Traditional protein design focuses on maximizing the stability of the final folded structure. But the new study shows that, for transmembrane β-barrel proteins, this approach can backfire.

Using a cell-free protein synthesis system combined with synthetic lipid vesicles, the team found that highly optimized designs often misfold and aggregate instead of inserting into membranes.

“Designing for maximum stability alone can actually trap these proteins in the wrong state,” says first author and Ph.D. student Giacomo Pedrelli (VIB-VUB). “They become too eager to fold too early, which leads to aggregation in water before they ever reach the membrane.”

The power of ‘negative design’

To overcome this, the researchers introduced subtle destabilizing features to disrupt premature folding. This negative design strategy reduced aggregation and significantly improved membrane insertion and assembly. Remarkably, these changes did not substantially compromise the final stability of the proteins. Instead, they helped guide the folding pathway, ensuring the protein reached the membrane in a foldable state.

The study also revealed that a protein language model (ESM3), trained on evolutionary data, outperformed traditional physics-based methods in identifying beneficial negative design mutations. While conventional tools predicted these mutations would destabilize the protein, the AI model successfully pinpointed changes that improved assembly in membranes.

The ability to reliably design transmembrane β-barrels opens exciting possibilities. These proteins can form nanopores—tiny channels with applications in biosensing, molecular detection, and next-generation sequencing technologies.

“This work shows that we need to think beyond static structures,” says Prof. Anastassia Vorobieva (VIB-VUB). “By designing not just the final state, but taking into consideration the entire folding journey, we can unlock new possibilities for engineering functional membrane proteins.”

This negative design approach for designing membrane proteins could accelerate the development of synthetic proteins for biotechnology, medicine, and nanotechnology.

Obesity can derail vaccine response, forcing lung T cells to defend instead

New findings reveal that obesity significantly impaired the quality and longevity of antibody responses to a Pseudomonas aeruginosa vaccine in a mouse model. The impaired antibody production was due to defects in germinal centers, a transient part of the immune system where specialized immune cells, called B cells, produce antibodies and build memory against pathogens.

Researchers say the findings, which are published in The Journal of Immunology, provide an important reason why traditional vaccines, which rely on high antibody production, tend to underperform in people with obesity.

“We hope these findings shift the focus of vaccine design and lead to more effective, tailored vaccines for the millions of people living with obesity who are at higher risk for severe respiratory infections,” said Wendy L. Picking, Ph.D., Professor in the Department of Pathobiology and Integrative Biomedical Sciences at the University of Missouri and lead author of the study.

Unexpected strength of lung T cells

Though the antibody response was decreased, the vaccine did generate a strong response from lung tissue-resident memory T cells. These specialized cells live permanently in the lungs and do not circulate through the bloodstream.

In response to the P. aeruginosa vaccine, resident memory T cells provided early, critical protection against infection that was not observed in mice fed a normal or low-fat diet. This suggests that the tissue-resident memory T cells could be compensating for antibody deficiencies.

“Instead of just trying to boost blood antibody levels, we should intentionally design vaccines that prioritize tissue-resident immunity, ensuring protection directly where pathogens like Pseudomonas enter the body,” shared Dr. Picking.

Growing urgency for better vaccines

P. aeruginosa is a leading cause of severe pneumonia for people with obesity, and emerging antibiotic resistance increasingly makes the infection difficult to treat, highlighting the need for effective vaccines.

To date, no other studies have examined the effectiveness of vaccines targeting gram-negative bacterial pathogens, like P. aeruginosa, in people with obesity. Understanding the relationship between obesity and the immune system addresses a significant gap in current vaccine research.

Next steps in vaccine development

The researchers plan to build on these findings by identifying the specific molecular signals that enable the lung tissue-resident memory T cells to become activated despite the chronic inflammation associated with obesity. This could allow researchers to optimize vaccine formulations to further boost these resident memory cells.

Ultimately, the researchers seek to create a vaccine that ensures robust protection for all individuals, regardless of their metabolic health.

Fighting malaria more effectively with climate data

In many parts of East Africa, small pools of water that form after heavy rainfall are ideal breeding sites for the Anopheles mosquitoes that transmit malaria. Researchers at the Karlsruhe Institute of Technology (KIT) have analyzed how such environmental conditions affect the effectiveness of mosquito nets. They combined high-resolution climate and hydrology models with malaria data from Kenya to enable better assessments of when and where the nets are especially effective at preventing infections. Their results have been published in Scientific Reports.

More than 600,000 people die of malaria in sub-Saharan Africa every year. How widespread the disease becomes depends not only on medical care and preventive measures but also on environmental factors such as rainfall, temperature, and especially the formation of temporary bodies of water.

“Such pools of water determine where Anopheles mosquitoes breed and increase the risk of infections,” said Professor Harald Kunstmann of the Institute of Meteorology and Climate Research – Atmospheric Environmental Research (IMK-IFU) at KIT’s Campus Alpin in Garmisch-Partenkirchen. “Thanks to today’s high-resolution environmental models, we know exactly when and where that occurs.”

With his team, Kunstmann has investigated whether and how such data can be used to maximize the effectiveness of countermeasures. “One of the simplest tools for fighting malaria is mosquito nets that protect people from mosquito bites at night,” said Dr. Diarra Dieng of IMK-IFU, who was a major contributor to the project. “We wanted to find out how much they actually reduce transmission, and where their use has the greatest impact.”

From rain to infection: A modeling chain

The researchers combined various model types for their study, with climate models providing temperature and precipitation data and hydrological simulations showing where water can accumulate to form potential breeding sites. Based on these data, an epidemiological model predicts the resulting spread of malaria. The analysis was based in part on malaria data from Kenya.

“Our approach is the first to consider the entire chain, from atmospheric processes to the formation of breeding sites to disease transmission, enabling us to make the first experimental determination of how effective mosquito nets really are at reducing infections,” Dieng said.
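
The logic of such a modeling chain can be illustrated with a deliberately simplified sketch. This is not the KIT model; it is a toy calculation in the spirit of classic Ross–Macdonald transmission theory, and every parameter value (rainfall saturation, biting rate, net efficacy, net coverage) is hypothetical.

```python
# Toy chain: rainfall -> breeding-site availability -> transmission.
# All parameter values below are illustrative, not from the study.

def breeding_site_fraction(rainfall_mm, saturation_mm=120.0):
    """Toy hydrology step: fraction of potential pools filled,
    saturating as monthly rainfall increases."""
    return min(rainfall_mm / saturation_mm, 1.0)

def transmission_rate(rainfall_mm, net_coverage=0.0,
                      base_biting_rate=0.3, net_efficacy=0.8):
    """Toy epidemiology step: mosquito density scales with breeding
    sites; bed nets reduce the effective human biting rate."""
    mosquito_density = breeding_site_fraction(rainfall_mm)
    effective_biting = base_biting_rate * (1.0 - net_efficacy * net_coverage)
    return mosquito_density * effective_biting

# Compare a wet month with and without systematic net use
without_nets = transmission_rate(100.0, net_coverage=0.0)
with_nets = transmission_rate(100.0, net_coverage=0.6)
reduction = 1.0 - with_nets / without_nets
print(f"relative reduction: {reduction:.0%}")  # 48% under these toy values
```

Even this crude sketch shows why net effectiveness is environment-dependent: the absolute number of infectious bites prevented scales with the breeding-site term, which rainfall and temperature control.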

The researchers quantified the extent of changes in malaria transmission and incidence under different environmental conditions with and without mosquito nets. They were able to show that systematic use of mosquito nets significantly reduced the number of infectious insect bites, causing malaria incidence to decrease by around 40% on average, and in some regions by over 50%.

They also showed the extent to which trends were influenced by local environmental factors. Temperature, precipitation, and the availability of temporary breeding sites determine when and where mosquitoes can breed most successfully, which in turn determines the effectiveness of preventive measures.

Planning targeted preventive measures

The study shows how climate data can be used for practical health care decisions. High-resolution environmental data make it possible to assess malaria risk with much greater geographical precision and to estimate the expected benefits of preventive measures. Health programs could use this information to identify regions where targeted intervention would be especially effective and where additional measures might be needed.

“For the first time, we have data that show what really helps,” Dieng said. “If we understand the relationships between environmental conditions and preventive measures, we can put limited resources to better use.”

Inquiry-based biomimicry course inspires students to design solutions by learning from nature

Research and innovation in Texas A&M University’s biomedical engineering department often center on clinical impact for patients. Beyond the lab, however, some faculty are finding breakthroughs in the classroom.

Dr. Charles Patrick, professor of practice, published findings in the journal Biomedical Engineering Education. The study details his success in implementing a scaffolded, inquiry-based learning model in the classroom. Patrick found learning outcomes improved through an approach that allowed students to practice design throughout the semester before a summative final design project.

“Students were highly engaged and it’s been well published that the more engaged a student is, the more they learn,” Patrick said. “They worked in small groups, which helped develop their teamwork and communication skills. We also measured their imagination competency at the beginning of the semester using validated surveys. This increased when measured at the end of the semester.”

Patrick tested the approach in a course titled “Biomimicry, Biomimetics, and Bioinspired Approaches to Medical Device and Technology Design.” The aim is for students to use nature as a model to solve engineering problems.

“This class is focused on teaching students how to not necessarily start with a blank sheet of paper, but to look at nature and see how it has optimized or influenced some aspect of medical design,” he said. “We can make an exact copy of nature, or emulate it, or just be inspired by it.”

Biomimicry is responsible for some of the world’s most recognizable inventions. For example, bur seeds inspired Velcro, and humpback whale flippers inspired wind turbine blades.

Texas A&M biomedical engineering researchers Drs. Taylor Ware and Abhisek Jain each found inspiration in nature. Ware developed self-assembling polymers inspired by fire ants, while Jain created vessel-chips by mimicking human microvasculature.

“Nature has already optimized the energy and mechanics of processes while we’re still struggling to make medical devices efficient,” Patrick said. “We are looking to nature to teach us how to optimize engineering.”

The course is one of three design frameworks the department offers students to build competency and gain experience before their final capstone class, where they work with companies to refine or design a real biomedical device.

“This course is a nature-inspired framework,” Patrick said. “Another is with NASA—a space engineering design framework—and the third is the department’s core biodesign framework. Students can choose to learn all three different types of design frameworks.”

To build familiarity with design principles in biomimicry, Patrick used three lessons to increase competency before a final project. The first assignment used LEGO sets as a low-risk way for students to familiarize themselves with the process through a method they already understood. He took inspiration from other universities’ use of LEGO Serious Play as a learning method, as well as from his own children at home.

For the assignment, the students are tasked with grasping a yellow sphere. Patrick gives them a curated list of “critters” to choose from—like a hawk—to ideate how they might grasp the object. The students then decide how they’ll emulate or copy that function.

“I wanted something they’re familiar with that still stoked creativity,” Patrick said. “There are enough different parts that they could mimic what they see in nature. It lowers activation energy due to their familiarity. They get to be creative, but still learn new concepts at the same time.”

After the LEGO project, students used virtual reality tools to redesign surgical instruments. Finally, they took a trip to The Gardens at Texas A&M University to draw inspiration for their final project.

“Whether it’s a bee, a certain flower or plant, or an animal they see, that’s their inspiration for their final design project,” Patrick said. “Every time I’ve done it, the students say this is the first time they’ve ever actually gotten outside of the classroom to do an assignment. They enjoy being able to think differently and look at things from another perspective.”

Patrick hopes the course leaves students with a newfound confidence in using their imagination in engineering.

“Something our education system does well in K-12 is making sure that students use their imagination and creativity,” Patrick said. “When students get to college, for some reason, we often stop that. The greatest thing is a blank sheet of paper and a pencil, unlimited imagination and brainstorming. This class helps them do that.”

Hackers meet their match: New DNA encryption protects engineered cells from within

Engineered cells are high-value genetic assets central to many fields, including biotechnology, medicine, aging, and stem cell research, with the global market projected to reach $8 trillion by 2035. Yet the only ways to keep the cells safe are strong locks and watchful guards.

In Science Advances, a team of U.S. researchers present a new approach to genetically securing precious biological material. They created a genetic combination lock in which the locking or encryption process scrambled the DNA of a cell so that its important instructions were non-functional and couldn’t be easily read or used.

The unlocking, or decryption, process involves adding a series of chemicals in a precise order over time—like entering a password—to activate recombinases, which then unscramble the DNA to its original, functional form.

The researchers conducted an ethical hacking exercise on the test lock and found that random guessing yielded a 0.2% success rate, remarkably close to the theoretical target of 0.1%.

Turning the assets into locks

The U.S. Centers for Disease Control and Prevention (CDC) and Department of Homeland Security have reported an uptick in the theft and smuggling of high-value biological materials, including specially engineered cells. In recent years, there has also been a record rise in unauthorized shipments and attempts at industrial espionage. In the wrong hands, these materials could be misused to create bioweapons or deliberately harm the environment.

Currently, valuable cells are primarily protected by physical measures such as locks, cameras, and guards. Once these barriers are breached, there is little left to prevent the cells from being stolen and misused.

In this study, researchers used a cybersecurity-inspired approach to protect cells at the DNA level by using the cell’s own biological security system. They developed a scenario-based simulation using a designing group (blue team) and a decrypting group (red team).

First, the designing group (blue team) scrambled the DNA by rearranging and flipping genetic instructions so the cell could no longer read them correctly. They started with a functional genetic unit that includes a promoter (the ON switch) and the gene of interest. They then broke this unit into separate parts, arranged them in the wrong order, and flipped some segments backwards.

To make sure these can be unscrambled later, they placed special DNA sequences called recombinase attachment sites around them.

For decryption, the team used a precise sequence of chemicals to trigger the cell’s machinery to physically rearrange the scrambled DNA and restore it to its functional state. They created a biological keypad with nine distinct chemicals, each acting as a one-digit input.

By using the same chemicals in pairs to form two-digit inputs, where two chemicals must be present simultaneously to activate a sensor, they expanded the keypad to 45 possible chemical inputs without introducing any new chemicals. They also added safety penalties—if someone tampers with the system, toxins are released—making it extremely unlikely for an unauthorized person to access the cells.

Then the red team, which had been kept out of the encryption design process, stepped in as ethical hackers, tasked with trying to break into the system and access the hidden genetic information. In their first attempt, they uncovered 10 different chemical combinations that partially unlocked the cells, revealing weak spots in the design.

After the developers patched these flaws, the hackers tried again. This time, only the exact passcode worked, showing that the odds of an unauthorized person guessing it had dropped to just two in 990, or 0.2%.
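
The keypad arithmetic reported above can be checked directly: nine single-chemical inputs plus the 36 unordered pairs of those chemicals give 45 distinct inputs, and a two-input passcode drawn from those 45 (consistent with the 990 figure) leaves odds of 2 in 990. The chemical names below are placeholders, and the assumption that the passcode is an unordered pair of inputs is an inference from the 990 figure, not stated explicitly in the article.

```python
from itertools import combinations

chemicals = [f"chem{i}" for i in range(1, 10)]  # nine distinct chemicals

# Single-chemical inputs plus unordered two-chemical inputs
single_inputs = list(chemicals)
paired_inputs = list(combinations(chemicals, 2))
total_inputs = len(single_inputs) + len(paired_inputs)
print(total_inputs)  # 9 + 36 = 45

# If the passcode is an unordered pair of those 45 inputs
# (consistent with the study's 990 figure), the guess space is:
all_inputs = single_inputs + paired_inputs
passcode_space = len(list(combinations(all_inputs, 2)))
print(passcode_space)               # C(45, 2) = 990
print(f"{2 / passcode_space:.1%}")  # 2 in 990 ~ 0.2%
```

The design choice is notable: pairing existing chemicals expands the input alphabet fivefold without adding a single new reagent to the protocol.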

The researchers note that the strong performance of this biological lock signals a shift in biological security, in which genetic material is protected by safety algorithms built into the DNA itself, making the assets their own protectors.

This study designed the system around engineered E. coli cells, but further research is needed to determine whether it can be applied to other organisms and scaled to protect multiple genes or assets within a single cell.

AI and drones can select the most resilient wheat

Making wheat more resilient to climate change without compromising yields has become an urgent priority for the agricultural sector. Now, a study led by a research team from the University of Barcelona and the Agrotecnio research center has identified an innovative way to address this challenge: combining advanced technology and artificial intelligence to select the best varieties of this crop.

The study, published in the journal Plant Phenomics, suggests a shift in perspective: it is necessary to focus not only on yield, but also on wheat’s ability to maintain consistent harvests despite changing weather conditions. The findings indicate that this combination of productivity and stability is key to ensuring safe harvests under variable environmental conditions.

The authors of the study are researchers Jara Jauregui, José Luis Araus and Shawn Carlisle Kefauver, from the Department of Evolutionary Biology, Ecology and Environmental Sciences at the UB’s Faculty of Biology and Agrotecnio; Nieves Aparicio and Sara Álvarez, from the Agro-technological Institute of Castilla y León (ITACyL); and María Teresa Nieto, from the National Institute for Agricultural and Food Research and Technology (INIA-CSIC).

Drones for monitoring wheat crops

The team analyzed 64 varieties of durum wheat grown under two different Mediterranean conditions: irrigated and rain-fed. The aim was to identify which genotypes combine high yields with a stable performance across variable environments, with differences in temperature and water availability.

One of the most surprising findings is that the selected varieties are not those that retain their green leaves the longest until the end of the season, but rather those that grow vigorously at the start and mature slightly earlier.

In contrast, the rejected lines showed low initial vigor and retained their green leaves for longer, which does not guarantee a better yield.

As part of the project, the team used ground sensors and drones equipped with RGB, multispectral and thermal cameras, enabling them to monitor crop development throughout the entire growing cycle. This technology provides key information about the wheat before harvesting takes place, reducing both the costs and the time required for analysis.

Using all this data, the team trained artificial intelligence models capable of predicting both the yield and the stability of production for the different varieties with a high degree of accuracy.

This strategy could be a very useful tool for plant breeding programs and could help develop wheat varieties that are equipped to meet the challenges of climate change.

Greener doesn’t always mean better

The researchers first analyzed, separately, the yield and stability traits of durum wheat. They found that the genotypes with the highest yields are characterized by high initial vigor and sustained greenness during the rapid growth phases up to the end of the growing season. In contrast, the most stable genotypes exhibit lower initial vigor, slower growth and a shorter cycle, enabling them to make better use of the resources available for grain production. To identify a balance between these compensatory mechanisms, the experts developed a combined selection method that balances competitive yield with good stability.
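
The kind of yield-plus-stability trade-off described here can be sketched with a minimal selection index. This is not the study’s method: the genotype names and yields below are invented, and measuring stability as the coefficient of variation (CV) across environments is one common convention, assumed here for illustration.

```python
# Hypothetical example: rank genotypes by mean yield penalized by
# relative variability (CV) across trial environments.
from statistics import mean, stdev

# Plot yields (t/ha) for each genotype across four trial environments
trials = {
    "G1": [5.8, 5.6, 5.7, 5.9],   # high yield, very stable
    "G2": [6.5, 3.9, 6.8, 4.2],   # high yield, unstable
    "G3": [4.1, 4.0, 4.2, 4.1],   # low yield, stable
    "G4": [6.0, 5.5, 6.2, 5.8],   # high yield, fairly stable
}

def score(yields, weight=0.5):
    """Combined index: reward mean yield, penalize relative variability."""
    cv = stdev(yields) / mean(yields)
    return mean(yields) - weight * cv * mean(yields)

ranked = sorted(trials, key=lambda g: score(trials[g]), reverse=True)
print(ranked[0])  # prints "G4": high yield with only modest variability
```

Under this toy index, the top pick is not the single highest-yielding line in any one environment but the one that keeps yield high while staying consistent, which mirrors the balance the study argues for.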

The study concludes that vigorous early growth combined with early maturation is a key factor in achieving more consistent yields under variable environmental conditions, helping wheat cope better with drought and high temperatures.

Why experts say now is the time to vaccinate US dairy cattle against bird flu

Bird flu—specifically H5N1—is no longer just a poultry problem in Asia. What started as a major United States outbreak, first in wildlife, then in poultry, and later in dairy cattle, is raising new concerns about food security, the economy, the health of farm workers, and the potential for future human outbreaks.

In a commentary published in The Journal of Infectious Diseases, Dr. Gregory Gray, a professor in the Division of Infectious Disease and Department of Microbiology and Immunology at The University of Texas Medical Branch (UTMB), writes that vaccinating dairy cattle could be one of the most important steps the U.S. takes to get ahead of this evolving threat.

“This virus has changed and now seems to have become entrenched or ‘enzootic’ in North American wildlife,” Gray said. “We used to think of H5N1 as a bird problem in Asia. Now it’s clearly something bigger and here in our own backyard, and we need to respond accordingly. At first, people thought this was a one-off event, but it spread—and it’s still spreading.”

The current wave of H5N1 began sweeping through U.S. poultry flocks in 2022, leading to the loss of more than 190 million birds. By 2024, the virus had made an unexpected jump into dairy cattle. So far, more than 1,000 dairy herds across at least 19 states have been affected, contributing to an estimated $14 billion hit to the U.S. economy, including roughly $4 billion in losses to the dairy sector alone.

Viruses like H5N1 evolve over time. The more they spread and the more species they infect, the more opportunities they have to change. Dairy cattle may now be acting as what researchers describe as a kind of “training ground” for the virus to adapt to mammals, including humans.

“Every H5N1 infection in animals or humans is like a roll of the dice,” Gray said. “Most of the time, nothing major happens. But the more chances the virus gets, the greater the risk that the virus will become more dangerous to animals or humans.”

So far, human cases in the U.S. have been rare and mostly mild. But there have been at least 71 confirmed infections and two deaths, and public health experts are watching closely. People who work on or live near farms and those who consume raw (unpasteurized) milk may face higher exposure risks.

The idea behind vaccinating cattle is straightforward: If you can reduce how much virus is circulating in dairy herds, you reduce the chances of it spreading to other animals, to other farms, and to people.

“Think of it as turning down the volume on the virus,” Gray said. “You may not eliminate it entirely, but you make it much harder for it to cause serious problems.”

Vaccination could help protect cattle from illness, reduce virus levels in milk and on farms, slow or stop spread between herds, and lower the risk of spillover into humans and other species. It could also help protect poultry farms, which are highly vulnerable and often located near dairy operations.

There’s good reason to think vaccines could work well in cattle. Studies show that cows can develop strong, lasting immunity after infection. In some cases, animals remained protected for more than a year and did not shed virus when reexposed.

“That’s exactly what you want to see,” Gray said. “It tells us the immune system in cattle can handle this virus and that vaccines have a real shot at working.”

Early vaccine trials are also promising, with some candidates producing protective immune responses that last for months. Even better, the dairy industry already has the infrastructure to make vaccination practical. Routine vaccinations and detailed herd records are standard practice on most farms.

“This isn’t starting from scratch,” Gray noted. “We already have systems in place that could support a vaccine rollout.”

The idea of vaccinating animals against bird flu isn’t new. Countries like Mexico and China have used vaccines in poultry for years. While vaccination didn’t eliminate the virus, it significantly reduced illness and helped control outbreaks.

“Vaccines don’t have to be perfect to be useful,” Gray said. “If they reduce disease and transmission, that’s a win. We’ve been trying to control this with the tools we have, but it’s becoming clear those tools aren’t enough on their own. The longer we wait, the harder this gets. Right now, we have a chance to get ahead of it.

“This is about staying one step ahead,” Gray said. “We have the science. Now it’s about deciding to use it.”

AI-designed proteins built from scratch can recognize specific compounds

Professor Gyu Rie Lee of the KAIST Department of Biological Sciences successfully designed artificial proteins that selectively recognize specific compounds using AI, in joint research with Professor David Baker. The research, published in the journal Nature Communications, is characterized by using AI to design proteins that recognize specific compounds from scratch (de novo) and implementing them as functional biosensors.

While the conventional approach mainly involved searching for natural proteins or modifying some of their functions, this research is highly significant in that it “custom-built” proteins with desired functions through AI-based design and even completed experimental verification.

In particular, the research team successfully designed a protein that selectively recognizes the stress hormone cortisol and implemented an AI-designed biosensor based on it. This is evaluated as a case that extends beyond protein design to actual measurable sensor technology, solving the long-standing challenge of small-molecule recognition in the field of protein design.

These research results have applications in fields including disease diagnosis, new drug development and environmental monitoring. The technology can precisely detect biomarkers in the blood to diagnose diseases early and contribute to the development of targeted therapies through the design of proteins that selectively recognize specific molecules.

Furthermore, the work is expected to enable customized biosensor technologies, such as sensors that detect environmental pollutants for real-time monitoring of air and water quality.

Designing new proteins (de novo proteins) that recognize compounds has long been considered a challenge in the field of protein design because it requires precise calculations at the atomic level. The research team developed an AI model that precisely reflects protein-ligand interactions and successfully designed binding proteins using it.

As a result, artificial binding proteins were designed for six types of compounds, including metabolites and small-molecule drugs, and their functions were verified through experiments. In particular, a cortisol biosensor was developed by designing a chemically induced dimer based on a new protein that binds cortisol.

A provisional patent for the relevant design technology has been filed in the United States.

Professor Gyu Rie Lee stated, “This research experimentally proves that AI can be used to design proteins that precisely recognize specific compounds,” and added, “We plan to expand this into protein design technology that can be utilized in various fields such as disease diagnosis, new drug development, and environmental monitoring in the future.”

Professor Gyu Rie Lee of the KAIST Department of Biological Sciences is the study's first author, and Professor David Baker is the corresponding author.

Director Do-Heon Lee, a mentor professor of the AI-CRED Innovative Drug Research Group, said, “This achievement is a meaningful result of cooperation between InnoCORE researchers and a global scholar.

“We will further strengthen our research capabilities through active collaboration with postdoctoral researchers recruited through the InnoCORE project, continuing to produce innovative results in AI drug development and the bio field.”

The KAIST InnoCORE Research Group aims to accelerate AI-based scientific and technological innovation and promote global joint research by supporting top-tier domestic and international postdoctoral researchers so they can devote themselves to developing AI convergence technology in a cutting-edge collective research environment.

Unlocking the hidden metabolism of algae to advance the promise of renewable fuels and sustainable biomass

Researchers at the Donald Danforth Plant Science Center have solved a long-standing mystery of how a model green microalga reorganizes its central metabolism to supercharge growth when given access to both light and a carbon source—a finding with broad implications for developing algae as a sustainable source of renewable fuels, bioproducts, and biomass. Their study is published in the Proceedings of the National Academy of Sciences.

Algae are among Earth’s most productive life forms. Along with other photosynthetic microbes, they capture roughly half of all carbon absorbed from the atmosphere globally each year. Unlike land crops, algae can grow on land unsuitable for food production and generate oil, protein, and other valuable compounds at rates 10 to 50 times greater than most terrestrial plants. Yet translating that productivity into reliable, scalable bioproduct yields has proven difficult—largely because the metabolism governing how algae partition carbon under different growth conditions was poorly understood.

Using isotope-assisted metabolic flux analysis (MFA)—a technique that measures how carbon moves through cellular pathways—the research team led by Somnath Koley, previously in the Allen lab at the Danforth Center, compared algae grown on light alone versus algae grown on both light and acetate, a condition known as mixotrophic growth.
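The core idea behind flux analysis can be illustrated with a minimal sketch: at metabolic steady state, each internal metabolite is produced as fast as it is consumed, so the stoichiometric matrix S and the flux vector v satisfy S·v = 0, and unmeasured fluxes can be inferred from measured ones. The toy network and flux values below are hypothetical illustrations, not data from the study, and this simplified balance omits the ¹³C isotope-labeling fits that the actual MFA technique uses.

```python
# Minimal sketch of steady-state flux balancing on a hypothetical toy network:
# uptake flux v1 brings metabolite A into the cell; v2 and v3 consume A.
# At steady state A neither accumulates nor depletes, so S @ v = 0.
import numpy as np

S = np.array([[1.0, -1.0, -1.0]])  # one balance row for A: v1 - v2 - v3 = 0

# Suppose v1 (uptake) and v3 are measured; estimate the unmeasured v2 by
# solving the combined constraint system in the least-squares sense.
A_eq = np.vstack([
    S,                    # steady-state balance on A
    [1.0, 0.0, 0.0],      # measurement: v1 = 10 (illustrative value)
    [0.0, 0.0, 1.0],      # measurement: v3 = 4  (illustrative value)
])
b = np.array([0.0, 10.0, 4.0])

v, *_ = np.linalg.lstsq(A_eq, b, rcond=None)
print(v)  # estimated fluxes [v1, v2, v3]; balance implies v2 = v1 - v3
```

Real isotope-assisted MFA extends this by simulating how ¹³C labels propagate through each pathway and fitting the fluxes to measured labeling patterns, which is what lets it distinguish pathways that a simple mass balance cannot.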

Rather than simply combining the two energy sources, the cells fundamentally rewired their metabolism: they activated highly efficient biochemical pathways to conserve carbon, suppressed a separate carbon-costly process, and strategically scaled back photosynthesis in ways that lowered the burden of protein production. The resulting growth greatly exceeded the additive effect of light-only plus acetate-only growth, an outcome that prior computational models had failed to predict.

“What metabolic flux analysis reveals is the actual operating strategy of the cell—not a snapshot of gene or protein levels from -omics data, but the real rates at which carbon is moving through each pathway,” said Doug K. Allen, Ph.D., principal investigator at the Danforth Center and one of the corresponding authors. “Those real-world flux constraints are what you have to work with if you want to rationally engineer algae for higher yields.”

“Without flux analysis, we couldn’t have resolved the long-standing paradox of how acetate affects algae growth,” said James G. Umen, Ph.D., principal investigator at the Danforth Center and another corresponding author. “This study shows that metabolism is fundamentally different—and far more efficient—when light and acetate are both present, and that insight is critical for anyone trying to engineer algae for higher productivity.”

“Doug and Jim’s teams have provided something rare and valuable: a quantitative map of how algae actually manage carbon. That’s the kind of foundational insight that accelerates the path from discovery to real-world solutions—whether that’s a more sustainable fuel, a new biomass crop, or a bioproduct that reduces our dependence on fossil fuels,” said Giles Oldroyd, Ph.D., President of the Danforth Center.