AI-powered forecasts sharpen early warning for destructive crop pest

What if farmers could see a pest outbreak coming before the insect ever had a chance to damage their crop? New research from Texas A&M AgriLife Research indicates that artificial intelligence can predict outbreaks much more accurately than traditional methods. The tool could dramatically improve how and when insect pest risks are identified and controlled.

In their study recently published in Ecological Informatics, scientists in the Texas A&M College of Agriculture and Life Sciences Department of Entomology used machine learning models to forecast populations of western flower thrips with notable accuracy, offering producers an early warning when pest pressure is building.

The research was led by Kiran Gadhave, Ph.D., AgriLife Research entomologist and assistant professor in the Department of Entomology at the Texas A&M AgriLife High Plains Research and Extension Center in Canyon. “If we can see pest risk building even a week earlier, that changes everything,” Gadhave said. “Accurately predicting risks sooner shifts management from reacting to damage to staying ahead of it.”

Postdoctoral researcher Arinder Arora, Ph.D., in the Department of Entomology, and Nolan Anderson, Ph.D., AgriLife Research plant pathologist in the Texas A&M Department of Plant Pathology and Microbiology, both at the Texas A&M AgriLife center in Canyon, contributed to the study.

Thousands of traps, millions of data points

Western flower thrips, tiny insects that feed on plants and spread damaging viruses, can trigger significant losses in vegetable and commodity crops once populations begin to surge. By the time producers notice the damage, the outbreak is often underway.

Beyond direct feeding damage, thrips are considered a “supervector” because even small populations can trigger large yield losses once virus transmission begins.

Traditional forecasting methods for insect pests like thrips in production settings rely on simple parameters, such as temperature, humidity and the number of pests present. But those approaches often fall short in accurately assessing pests’ threat potential.

The team analyzed data from nearly 1,700 yellow sticky traps deployed weekly in both open fields and high tunnel production systems for tomatoes and peppers at the Texas A&M AgriLife Research Station at Bushland. Those counts were combined with up to 16 environmental variables, including temperature, humidity, wind speed, wind direction and rainfall, as well as the size of the “parent population” recorded 14 days earlier.

Machine learning models proved highly accurate in predicting pest population development, forecasting thrips populations with nearly 88% accuracy in open fields and about 85% accuracy in high tunnels.
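As a rough illustration of the inputs described above (weekly trap counts, weather variables, and the “parent population” recorded 14 days earlier), here is a minimal Python sketch of how such lagged feature rows might be assembled; the numbers and variable names are invented, not taken from the study.

```python
# Hypothetical sketch (invented data, not from the study): pair each weekly
# trap count with that week's weather and the "parent population" recorded
# two weekly samples (14 days) earlier.

def make_features(counts, weather, lag=2):
    """Return (features, target) rows for weeks with a full lag history."""
    rows = []
    for t in range(lag, len(counts)):
        features = {"parent_population": counts[t - lag], **weather[t]}
        rows.append((features, counts[t]))
    return rows

counts = [3, 5, 9, 14, 30, 55]                      # weekly thrips per trap
weather = [{"temp_c": 24 + i, "rh_pct": 40 + i} for i in range(6)]

rows = make_features(counts, weather)
print(len(rows))                        # 4 usable weeks after the 2-week lag
print(rows[0][0]["parent_population"])  # 3
```

Rows like these would then feed a regression or classification model; the study's actual models and the full set of 16 variables are not specified in the article.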

“AI represents a powerful addition to our modeling because it allows us to analyze many more environmental and biological variables simultaneously and uncover patterns we simply could not see before,” Gadhave said. “This study showed that we can produce very accurate localized forecasts for pest development.”

From reactive to predictive pest management

Gadhave said accuracy dropped sharply in models that applied parameters across both open field and high tunnel systems at the same location. This finding highlights how microclimates function as distinct pest ecosystems, even when fields are side by side.

“What stood out was how quickly models broke down across the different systems,” he said. “Even neighboring fields behaved like different ecosystems, which tells us pest dynamics are fundamentally shaped by microclimate.”

The team also found that one outbreak parameter stood out in both open field and high tunnel systems: parent population size.

If thrips were already present two weeks earlier, the risk of a severe outbreak increased substantially. Temperature ranked next, with wind and humidity shaping how populations spread and build.

This use of AI modeling to forecast localized pest population dynamics shows the potential for its application across various crops, pests and regional microclimates, Gadhave said. Advanced pest and disease forecasting tools could significantly alter how producers monitor and protect their crops.

“This is proof that AI-enabled tools for agriculture aren’t futuristic,” Gadhave said. “They’re already here, and AgriLife Research is well positioned to lead their development and application in the field where they can benefit producers.”

Fertilizer can be made from local resources instead of fossil fuels

The prices of mineral fertilizers are rising. The Fraunhofer Institute for Interfacial Engineering and Biotechnology IGB is working on alternative production methods: Researchers have developed various processes and demonstrated them on a pilot scale to recover nutrients from locally available waste streams. Fertilizers ready for immediate use can be obtained from digestion residues, manure, and wastewater, as the institute will show at IFAT in Munich in early May. This circular approach strengthens supply security and protects water bodies and the climate.

The war in Iran is not only leading to higher costs for gasoline and kerosene. Fertilizer prices have also risen by up to 30%, which will affect food prices in the foreseeable future. This is because not only gas and oil but also nearly 30% of all nitrogen- and phosphorus-containing mineral fertilizers traded worldwide are shipped through the Strait of Hormuz. Nitrogen-containing synthetic fertilizers such as ammonia, and the urea produced from it, are manufactured in the Gulf States using natural gas. In this process, molecular atmospheric nitrogen (N₂) is converted into ammonia (NH₃) under high pressure and at high temperatures.

The Fraunhofer Institute for Interfacial Engineering and Biotechnology IGB will show at IFAT in Munich from May 4 to 7 that fertilizers can also be produced without fossil fuels. Researchers are investigating how nitrogen and phosphorus can be recovered from nutrient-rich waste streams. Various methods have been developed in a wide range of projects to recycle essential mineral salts from liquid manure, digestate, or wastewater into fertilizers that can be used directly.

Locally available waste and residual materials as reliable sources of nutrients

Particularly high concentrations of nutrients are found in agricultural waste—liquid manure from livestock farming and digestate from biogas plants, but also in municipal wastewater. “In livestock farming alone, nitrogen and phosphorus are excreted across the EU each year in quantities sufficient to meet Europe’s demand for mineral fertilizers,” explains Dr. Brigitte Kempter-Regel, responsible for Business Development in the Greentech Solutions division at Fraunhofer IGB.

However, in many places, manure and digestate—which are actually valuable farm fertilizers—can no longer be applied to fields. Especially in areas with intensive livestock farming or a high density of biogas plants, there is a risk of overfertilizing the soil and thereby further polluting groundwater and surface water. And because these materials are 80–90% water, transporting them over long distances is far from practical. One solution is to recover nutrients such as nitrogen and phosphorus.

In wastewater treatment plants, these nutrients are already removed to comply with regulatory limits and prevent the eutrophication of water bodies. However, this is usually not a recovery process. Rather, valuable phosphates are precipitated as non-bioavailable aluminum or iron salts, while ammonium nitrogen is biologically converted via nitrate into molecular nitrogen, which escapes into the air. To achieve this, the tanks must be well aerated—requiring enormous amounts of energy. In addition, these processes often release nitrous oxide, whose greenhouse gas effect is 265 times stronger than that of CO₂.

Processing of manure and digestate into fertilizer, peat substitute, and irrigation water

One approach that benefits agricultural producers and wastewater treatment plants as well as the environment and the climate is the recovery of nutrients in a form that can be used directly as fertilizer. To this end, Fraunhofer IGB, together with its partners in the BioEcoSIM project, has implemented a multi-stage process in a pilot plant that can be used to process manure and digestate into ammonium fertilizer, phosphorus fertilizer, and organic soil conditioners.

The processing procedure itself begins with acidification of the manure or digestate to completely dissolve phosphorus in the aqueous phase. Through a multi-stage filtration process, the substrate is then separated into a liquid and a solid fraction. From the liquid phase, which contains the dissolved inorganic nutrients, phosphorus is first precipitated in the form of phosphate salts. In a second step, the dissolved nitrogen is recovered and separated as an ammonium sulfate solution via membrane absorption.

The dewatered solid fraction can either be composted or dried. “When processed into compact organic soil conditioners, the product can compensate for the loss of organic soil matter and be used as a substitute for peat,” emphasizes Kempter-Regel. The resulting water can also be reused, for example for irrigation or as rinse water.

To additionally generate energy and thereby improve economic efficiency, a biogas plant can be integrated upstream of the process. This allows even odor-intensive substances in the raw manure to be metabolized and thus removed.

“In a current project, we have established a mathematical model for the BioEcoSIM process with an upstream biogas plant and calibrated it using measurement data for the various process steps,” explains Michael Bohn, who is further developing the nutrient recovery process at Fraunhofer IGB.

This allows the economic efficiency of the process to be predicted under various conditions. Higher prices for synthetic fertilizers, for example, have an immediate effect.

Gene circuits reshape DNA folding and affect how genes are expressed, study finds

When a gene is turned on in a cell, it creates a ripple effect along the DNA strand, changing the physical structure of the strand. A new study by MIT researchers, appearing in Science, shows that these ripples can stimulate or suppress neighboring genes. These effects, which result from the winding or unwinding of neighboring DNA, are determined by the order of genes along a strand of DNA. Genes upstream of the active gene are usually turned up, while those downstream are inhibited.

The new findings offer guidance that could make it easier to control the output of synthetic gene circuits. By altering the relative ordering and arrangement of genes (gene syntax), researchers could create circuits that synergize to maximize their output, or that alternate the output of two different genes.

“This is really exciting because we can coordinate gene expression in ways that just weren’t possible before,” says Katie Galloway, an assistant professor of chemical engineering at MIT. “Syntax will be really useful for dynamic circuits. Now we have the ability to select not only the biochemistry of circuits, but also the physical design to support dynamics.”

Galloway is the senior author of the study. MIT postdoc Christopher Johnstone, Ph.D., is the paper’s lead author. Other authors include MIT graduate student Kasey Love, members of the lab of Brandon DeKosky, an MIT associate professor of chemical engineering, and researchers from Peter Zandstra’s lab at the University of British Columbia and the labs of Christine Mummery and Richard Davis at Leiden University Medical Center in the Netherlands.

Gene syntax

When a gene is copied into messenger RNA (transcribed), the double-stranded DNA helix must be unwound so that an enzyme called RNA polymerase can access the DNA and start copying it. That unwinding leads to physical changes in the structure of the DNA strand.

Upstream of the gene, DNA becomes looser, while downstream, it becomes more tightly wound. These changes affect RNA polymerase’s ability to access the DNA: Upstream of an active gene, it’s easier for the enzyme to attach; downstream, it’s more difficult.

In a study published in 2022, Galloway and Johnstone performed computational modeling that explored how these biophysical changes might influence gene expression. They studied three different arrangements, or types of syntax: tandem, divergent, and convergent.

Most synthetic gene circuits are designed in a tandem arrangement, with one gene followed by another downstream. In a divergent arrangement, neighboring genes are transcribed in opposite directions (away from each other), and in convergent syntax, they are transcribed toward each other.

The modeling suggested that the divergent arrangement was most likely to produce circuits where both genes are expressed at a high level. Tandem arrangements were predicted to result in the downstream gene being suppressed by the upstream gene. In the new study, the researchers wanted to see if they could observe these predicted phenomena in human cells.
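The predicted couplings can be caricatured in a few lines of code. This toy sketch only encodes the qualitative “twin-domain” logic described above (transcription loosens DNA upstream and overwinds it downstream); it is an illustration, not the authors’ computational model, and the +1/−1 “effects” are invented placeholders.

```python
# Toy caricature of twin-domain supercoiling coupling (not the authors'
# model): transcription loosens DNA upstream (+1, easier initiation) and
# overwinds it downstream (-1, harder initiation).

BOOST, HINDER = +1, -1

def coupling(syntax):
    """Return (effect on gene A, effect on gene B) for a two-gene circuit."""
    if syntax == "divergent":    # transcribed away from each other: each
        return (BOOST, BOOST)    # gene lies upstream of the other
    if syntax == "tandem":       # A then B: B lies downstream of A
        return (BOOST, HINDER)
    if syntax == "convergent":   # transcribed toward each other: each
        return (HINDER, HINDER)  # gene lies downstream of the other
    raise ValueError(f"unknown syntax: {syntax}")

for s in ("divergent", "tandem", "convergent"):
    print(s, coupling(s))
```

The divergent case, where each gene sits upstream of its neighbor, is the only arrangement in which both effects are positive, matching the modeling prediction quoted above.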

“Normally, we think about gene circuits and pieces of DNA as these lines that we draw, but they’re polymers that have physical characteristics,” Galloway says. “The thing that we were trying to solve in this paper was: When you put two genes on the same piece of DNA, how does their physical interaction become coupled?”

The researchers engineered circuits that each contained two genes—in either a tandem, divergent, or convergent configuration—into human cell lines and human induced pluripotent stem cells.

The results confirmed what their modeling had predicted: In divergent circuits, expression of both genes was amplified. In tandem circuits, turning on the upstream gene suppressed the expression of the downstream gene.

These effects produced as much as a 25-fold increase or decrease in gene expression, and they could be seen at distances of up to 2,000 base pairs between genes.

Using a high-resolution genome mapping technique called Region Capture Micro-C, the researchers were also able to analyze how the DNA structure changed when nearby genes were being transcribed.

As predicted, they found that the DNA regions downstream from an active gene formed tightly twisted structures known as plectonemes, similar to the tangles seen in a twisted telephone cord. These structures make it harder for RNA polymerase to bind to DNA.

To engineer these cells, the researchers used a new system they developed with the LUMC team called STRAIGHT-IN Dual, which allows them to efficiently insert two genes into the same DNA strand at both alleles. This system is being reported in a second paper, published today in Nature Biomedical Engineering.

Precise control

The new findings could help guide the design of synthetic gene circuits, which are typically controlled by biochemical interactions with activator or repressor molecules. Now, circuit designers can also perform biophysical manipulations to enhance or repress gene expression.

“Everyone thinks about the components they need, and the biochemical properties they need to build a circuit,” Galloway says. “Now, we have added the physical construction of those components, which is going to change how those biochemical units are interpreted.”

As a demonstration of one potential application, the researchers built synthetic circuits containing the genes for two segments of a novel antibody discovered by the DeKosky lab, used to treat yellow fever, and incorporated them into human cells. As they expected, the divergent syntax produced larger quantities of the yellow fever antibody.

Galloway’s lab has also used this approach to optimize the output of synthetic gene circuits they previously reported that could be used to deliver gene therapy or to reprogram adult cells into other cell types.

This strategy could also be used to build a variety of other types of dynamic synthetic circuits, such as toggle switches, oscillators, or pulse generators, for any application that requires precise control over gene expression.

“If you want coordinated expression, a divergent circuit is great. If you want something that’s either/or, you can imagine using a convergent or tandem circuit, so when one turns on, the other turns off, and you can alternate pulses,” Galloway says. “Now that we understand the syntax, I think this will pave the way for us to program dynamic behaviors.”

CRISPR speed patterns can identify multiple viruses and variants simultaneously

As the spread of infectious diseases accelerates, technologies that can accurately distinguish multiple viruses in a single test are becoming increasingly important. KAIST and an international research team have developed a new diagnostic technology that simultaneously identifies various viruses and variants by controlling the “speed” of gene scissors.

This study was published in Nature Biomedical Engineering.

This technology is expected to transform responses to emerging infectious diseases, as it can detect multiple infections at once while reducing the complexity of testing procedures.

A research team led by Professor Sungmin Son from the Department of Bio and Brain Engineering, in collaboration with researchers from the University of California, Berkeley (UC Berkeley) and the Gladstone Institutes, has developed a new ribonucleic acid (RNA) diagnostic technology that can distinguish multiple viruses and variants simultaneously by utilizing the reaction speed of gene scissors.

The tool used by the research team is a CRISPR-based protein called Cas13. Gene scissors are proteins that locate and cut specific genetic material, becoming activated when they recognize their target. Cas13 specifically targets RNA. When it finds its target, it becomes activated and cuts surrounding RNA, generating a fluorescent signal.

Existing technologies require the use of different gene scissors or various fluorescent colors to detect multiple viruses simultaneously, making the system complex and difficult to apply in real-world settings.

The research team took a different approach. They focused on the fact that when gene scissors bind to their target, the speed of “cutting” varies depending on the type of virus.

By observing at the single-molecule level within tiny droplets, they confirmed that unique reaction speed patterns emerge depending on the combination of guide RNA and target RNA. Guide RNA is an RNA molecule that provides “positional information,” guiding the gene scissors to their target.

Based on this, the research team developed a “kinetic barcoding” technology that uses differences in reaction speed like a barcode. This method interprets reaction speeds as signal patterns to distinguish different viruses. Through this technology, it became possible to simultaneously identify multiple viruses and variants using only a single type of gene scissors.

In addition, by adjusting the design of guide RNA, the cutting speed of gene scissors can be tuned, enabling scalable and simultaneous detection of a wide range of viruses.

The testing process has also been greatly simplified. In conventional methods, detecting RNA viruses requires a “reverse transcription” process that converts RNA into DNA, but this technology enables direct detection of RNA as it is. Reverse transcription is a step that increases testing time and complicates procedures.

When tested on actual clinical samples, the technology successfully distinguished various respiratory viruses and SARS-CoV-2 variants in a single reaction.

Professor Sungmin Son stated, “This study goes beyond simply determining whether a virus is present, and is the first case to use the reaction speed of gene scissors as a new form of diagnostic information,” adding, “It will become a next-generation platform capable of diagnosing various infectious diseases at once in the field.”

Water molecules found to actively drive gene transcription process

Researchers have uncovered a previously hidden layer of complexity in how genes are activated, showing that water molecules play a direct and essential role in one of the most fundamental processes in biology: DNA transcription.

Using state-of-the-art cryo-electron microscopy that can zoom down to smaller than the width of a single atom, the researchers visualized the inner workings of RNA polymerase II—the enzyme responsible for reading DNA and synthesizing RNA. The technology allowed the researchers to visualize individual water molecules and metal ions within the enzyme at an unprecedented level of detail.

The results, published in Molecular Cell, provide a fundamentally new understanding of how genetic information is read and expressed, offering insights that could inform future research in molecular biology, drug development and disease mechanisms.

RNA polymerase II is central to gene expression, carrying out the first step in converting genetic information into functional molecules. While scientists have long understood the major structural components involved in transcription, the precise biochemical mechanism has remained unclear.

Water’s active role in transcription

By capturing multiple high-resolution snapshots of the enzyme in action, researchers identified hundreds to more than a thousand individual water molecules positioned adjacent to the enzyme. Many of these were located at critical functional sites, forming intricate networks that connect the enzyme, DNA and incoming RNA building blocks.

The study reveals that these water molecules are not passive bystanders, but active participants in the chemistry of transcription. Specifically, they help facilitate proton transfer—an essential step in the chemical reaction that adds new nucleotides to a growing RNA strand.

Water molecules also contribute to recognizing the correct molecular substrates and stabilizing key structural elements of the enzyme during the transcription process.

Strikingly, these waters are evolutionarily conserved from bacteria to yeast—and potentially humans as well—challenging the traditional “protein-centered” view of gene expression and showing that water is an integral part of the transcription machinery. This finding could fundamentally change how scientists think about how genes are read.

The study was led by Dong Wang, Ph.D., professor at the Skaggs School of Pharmacy and Pharmaceutical Sciences at UC San Diego.

Autumn leaves transformed into biodegradable mulch film can curb farm plastic pollution

Fallen leaves, which are discarded every year, have been transformed into a resource that can replace waste plastics, a major nuisance in rural areas. A research team at the Korea Advanced Institute of Science and Technology (KAIST) has developed biodegradable agricultural vinyl made from fallen leaves, presenting a new way to solve the problem of conventional plastic vinyl, which is a cause of soil pollution. The study is published in the journal Green Chemistry.

The team, led by Professor Jaewook Myung of the Department of Civil and Environmental Engineering, developed an eco-friendly agricultural mulch film (an agricultural vinyl that covers the soil to suppress weeds and maintain moisture) that decomposes in the ground using fallen leaves collected from the campus and near the Gapcheon River in Daejeon. This research is significant in that it converted fallen leaves, which are non-edible biomass (plant resources not used for food) that were discarded as useless, into high-value functional materials.

Mulch films, widely used in agricultural fields, are essential materials for suppressing weed growth and maintaining soil moisture. However, most films currently used are made of polyethylene (PE, a representative petroleum-based plastic), making them difficult to collect after use. Residuals left in the soil turn into microplastics (plastic particles so small they are invisible to the naked eye), causing environmental pollution.

To extract the key components from fallen leaves, the research team used a hydrated deep eutectic solvent (DES, a special eco-friendly solvent with low toxicity) that mixes citric acid and choline chloride.

Through this, they extracted nanocellulose (plant-derived nanofibers with high strength and eco-friendliness) obtainable from plant cell walls and combined it with polyvinyl alcohol (PVA, a water-soluble and naturally degradable polymer material) to produce a composite film. The eco-friendliness was further enhanced by performing all manufacturing processes based on water instead of harmful organic solvents.

The “fallen leaf film” developed in this way showed sufficient performance even in actual agricultural environments. It effectively blocked ultraviolet rays (UVA and UVB) and exhibited moisturizing performance that suppressed soil moisture loss to a level of about 5% for 14 days. In addition, ryegrass grown using this film showed better growth status than cases where no film was used.

Biodegradation performance was also confirmed. As a result of testing under soil conditions, the developed film decomposed by 34.4% in about 115 days, showing a faster decomposition rate than conventional biodegradable films. Furthermore, it was confirmed that plant toxicity (harmful effects on plant germination or growth) did not occur during the decomposition process, thus not affecting the germination and early growth of ryegrass and bok choy.
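For a rough sense of scale only: if the reported soil biodegradation (34.4% mass loss in about 115 days) followed simple first-order decay, an assumption of this back-of-envelope calculation rather than anything the study claims, the implied rate constant and half-life can be computed directly.

```python
# Back-of-envelope only: first-order kinetics is assumed here, not claimed
# by the study. k solves 1 - exp(-k * days) = fraction_lost.
import math

fraction_lost, days = 0.344, 115
k = -math.log(1 - fraction_lost) / days     # per-day rate constant
half_life = math.log(2) / k
print(round(k, 5), round(half_life))        # ~0.00367 /day, ~189 days
```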

Professor Jaewook Myung said, “This research is meaningful in that it went beyond simply processing fallen leaves and converted them into functional materials that can protect the agricultural environment. Through the use of fallen leaves that do not compete with food resources and water-based processes, it can be utilized as a sustainable alternative technology for agricultural plastics.”

Revolving doors and efficient engines: How proteins escape a molecular tangle

Trying to untangle a knot in a mess of strings can be frustrating and time-consuming. But not so for molecular machines—molecules that convert chemical energy into mechanical work and motion. Machines from the AAA+ family, which exist in the cells of all living organisms from bacteria to humans, can, among their many functions, recognize misfolded protein chains and swiftly unravel them.

Cracking how protein machines work

Researchers in the laboratory of Prof. Gilad Haran at the Weizmann Institute of Science have deciphered this sophisticated mechanism, which is both fast and remarkably efficient. Their findings, recently published in Nature Communications, reveal how cells perform quality control on their proteins, and may help explain why this control fails in diseases such as neurodegeneration and cancer. They may also provide inspiration for the development of highly efficient artificial molecular machines.

Over the past decade, scientists have succeeded in imaging the three-dimensional structures of the tiny AAA+ machines by freezing them and examining them under electron microscopes. They found that each machine consists of six protein subunits arranged in a ring, forming a central channel. When a protein chain in the cell becomes entangled or misfolded, these machines come to the rescue, unraveling the chain by threading it through the channel.

But what force pulls the chain through? Until now, it was unclear how a tiny molecular machine converts chemical energy within the cell into an effective mechanical pulling action. The prevailing hypothesis proposed a “hand-over-hand” mechanism: In each cycle, the machine would use a burst of energy to thrust one “arm” (a subunit) forward, grasp the protein chain and pull it through, repeating the cycle until the entire chain had passed through. However, this model did not align with several biophysical observations reported in the scientific literature.

Translocation of casein in both the forward and backward directions is observed in three-color experiments. Credit: Nature Communications (2026). DOI: 10.1038/s41467-026-68478-1

Watching proteins move in real time

To address this question, the researchers—led by Dr. Remi Casier from Haran’s lab in Weizmann’s Chemical and Biological Physics Department—developed a method that allowed them to monitor, in real time rather than through frozen snapshots, the passage of a protein chain through the molecular machine. They used fluorescent sensors attached to the milk protein casein and to the AAA+ machine that processes it. A green sensor was attached to the casein, an orange sensor to the machine’s entrance and a red sensor to its exit.

When the sensors were far apart, only the green fluorescence was visible; but as the protein moved through the channel, it transferred energy to the orange or red sensor. By measuring the intensity of each color, the researchers could determine precisely where the protein was at any given moment. To ensure repeated encounters between the protein and the machine, the researchers confined them within a tiny lipid bubble (a liposome) that prevented them from drifting out, while allowing the entrance of ATP molecules, the “fuel” used by most molecular machines.

“The labeled protein segment shot through the channel at tremendous speed, within just a few milliseconds—despite the fact that it takes the machine more than half a second to break down a single ATP molecule and extract energy from it,” says Haran. “This revealed just how energy-efficient the machine is—and made the ‘hand-over-hand’ model, which relies on energy bursts and big leaps, less plausible. We had to rethink the entire mechanism.”

To better understand the role of ATP in the machine’s activity, the researchers performed two experiments. In the first, they replaced ATP with similarly structured but largely inactive molecules and saw that movement within the channel became directionless. In the second experiment, they gradually reduced ATP concentration without eliminating it entirely. They observed a dramatic drop in the number of threading events, but to their surprise, the speed of each threading hardly changed.

A revolving door–style Brownian motor

“We discovered that the machine uses energy to initiate the threading process and maintain directional motion, but not to forcibly pull the chain or accelerate its movement,” explains Haran. “We propose that the molecular machine operates like a revolving door. When the protein enters, it can attempt to move in any direction. But the machine is structured so that in the presence of ATP, only movement in one direction results in forward motion, while attempts to move in the opposite direction are blocked.”

Because proteins are naturally in constant random motion, this mechanism—known as a Brownian motor, named after Robert Brown, who was the first to observe the random motion of small particles under a microscope—is highly energy-efficient.

“Based on these findings and previous research, we can now speculate in detail about what happens inside the molecular machine,” Haran adds. “Loops in the channel wall protrude into its interior and, like the wings of a revolving door, determine the preferred direction of movement. The machine uses energy to ensure that these loops oscillate in the correct direction.”
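The revolving-door picture can be sketched as a simple rectified random walk. This is a caricature of a Brownian ratchet, not the authors’ quantitative model: steps are thermal and unbiased, and the “fuel” does no pulling, it only blocks backward moves. The channel length and step counts are arbitrary.

```python
# Minimal Brownian-ratchet sketch (illustrative only): unbiased thermal
# steps, with fuel rectifying direction rather than adding force.
import random

def thread(length=20, fuel=True, max_steps=10_000, seed=1):
    """Walk a chain through a channel of `length` sites; return (pos, steps)."""
    rng = random.Random(seed)
    pos, steps = 0, 0
    while 0 <= pos < length and steps < max_steps:
        step = rng.choice((-1, 1))          # thermal motion: unbiased
        if fuel and step == -1:
            step = 0                        # ratchet blocks backward moves
        pos += step
        steps += 1
    return pos, steps

print(thread(fuel=True))    # directed: reaches the exit (pos == 20)
print(thread(fuel=False))   # undirected: may exit where it entered (pos == -1)
```

Note that with fuel the walk always completes, yet its speed is set by the thermal step rate, not by the fuel, echoing the observation that lowering ATP reduced how often threading started but barely changed how fast each threading ran.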

In the final stage of the study, the researchers focused on failure events, in which threading through the channel was not completed. “These events lasted a relatively long time,” says Haran. “We found that in their course, the protein moved back and forth within the channel until it mistakenly exited from the same end where it had entered. This indicates that there are no large energy fluctuations or powerful forces inside the channel, but rather a subtle motion-guiding mechanism that is occasionally prone to error.”

“In this new study, we were able to glimpse the inner workings of an important molecular machine that has been operating in cells for billions of years,” says Haran. “In many disease processes, including neurodegeneration and cancer, the quality control of cellular proteins fails, leading to the accumulation of misfolded proteins. Understanding the control mechanisms is crucial for discovering why this happens and how it might be prevented.

“Moreover, AAA+ machines perform many roles beyond quality control: They transport proteins and genetic material and move them across membranes, and we hypothesize that the Brownian mechanism we identified drives these processes as well.”

What this could mean for technology

In 2016, the Nobel Prize in Chemistry was awarded for the development of artificial molecular machines, such as a tiny elevator, an artificial muscle and nano-cars. The new Weizmann Institute findings may enable engineers to improve the design of such machines.

“The energy efficiency of the Brownian motor could allow a major leap forward in the development of artificial molecular machines,” says Haran. “In the future, such machines may carry out practical tasks and be integrated into engines and computers.”

AI drug target platform pairs prediction with benchmarking to improve early discovery

Insilico Medicine, a clinical-stage biotechnology company powered by generative artificial intelligence (AI), today announced advancements to its unified AI framework for drug target discovery, integrating its previously introduced Target Identification Pro (TargetPro) and Target Identification Benchmark (TargetBench 1.0) into a validated system designed to improve the accuracy, reliability, and scalability of early-stage drug development.

Published in Scientific Reports, the framework demonstrates strong performance across benchmarking and real-world discovery workflows, reinforcing its role as a standardized foundation for AI-driven target identification.

Drug discovery has long been constrained by high failure rates, with nearly 90% of clinical candidates failing—often due to weak or poorly validated biological targets. Insilico’s integrated TargetPro–TargetBench framework addresses this challenge by combining disease-specific predictive modeling with rigorous, standardized evaluation.

A new standard for AI-driven target discovery

TargetPro is a disease-specific machine learning model spanning 38 diseases across oncology, metabolic, immune-related, fibrotic, and neurological categories. The model integrates 22 distinct omics- and text-based scores from Insilico’s AI target discovery platform, PandaOmics, to identify targets with a high likelihood of clinical success.

In performance evaluations, TargetPro substantially outperformed individual omics- and text-based approaches, demonstrating the power of multimodal data integration. The evaluations also revealed that certain data types and analytic models were more predictive for some diseases than others, underscoring the value of disease-specific target identification models.

TargetBench 1.0: Establishing a benchmarking standard

Complementing TargetPro, TargetBench 1.0 provides a comprehensive benchmarking system for evaluating target discovery models, including large language models (LLMs). Unbiased comparison of these models has previously been difficult because each relies on different data types and sources and optimizes for different result characteristics.

To fill this “benchmarking gap” and evaluate models on both confidence and novelty, the platform assesses performance by the ability to recover known clinical targets and to prioritize high-quality novel candidates.

TargetPro demonstrated an overall precision-at-top-K of 71.6% in retrieving known clinical targets across all disease categories, a 1.7- to 5.5-fold improvement over leading LLM-based approaches. Together, TargetPro and TargetBench address a critical industry gap: not only generating high-quality targets, but also systematically validating them against standardized, reproducible benchmarks.
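Precision-at-top-K, the metric cited above, is simple to compute: it is the fraction of a model's top-K ranked candidates that are already-known targets. A minimal sketch (the ranked list and known-target set below are hypothetical examples, not data from the study):

```python
def precision_at_k(ranked_targets, known_positives, k):
    """Fraction of the top-k ranked candidates that appear in the
    set of known (e.g., clinically validated) targets."""
    top_k = ranked_targets[:k]
    hits = sum(1 for t in top_k if t in known_positives)
    return hits / k

# Hypothetical example: 5 ranked gene candidates, 3 known targets
ranked = ["EGFR", "GENE_X", "KRAS", "GENE_Y", "TP53"]
known = {"EGFR", "KRAS", "TP53"}
print(precision_at_k(ranked, known, k=5))  # 3 of 5 hits -> 0.6
```

Because it rewards models for placing validated targets near the top of the list, the metric directly measures the "recover known clinical targets" half of the benchmark.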

Driving actionable, translationally novel targets

Beyond predictive performance, the integrated framework emphasizes translational readiness—ensuring that nominated targets are actionable in downstream experimental and clinical workflows. Among TargetPro’s predicted novel candidates:

  • 95.7% have available 3D protein crystal structures, enabling downstream drug design
  • 86.5% are classified as druggable with supporting clinical evidence
  • 46% are associated with approved drugs in other indications, highlighting repurposing opportunities

These results demonstrate the system’s ability to generate actionable intelligence—prioritized, biologically grounded targets with strong potential for successful development.

Toward scalable and standardized AI in drug discovery

The TargetPro–TargetBench framework represents a critical step toward more scalable and standardized AI-driven drug discovery. By integrating predictive modeling with rigorous benchmarking, Insilico is helping establish a common framework for evaluating and deploying AI systems across pharmaceutical R&D.

The latest versions, TargetPro 2.0 and TargetBench 2.0, expand disease coverage from 38 to 100 indications across 10 therapeutic areas, including new coverage of cardiovascular, ophthalmological, reproductive, and mental health disorders. TargetBench 2.0 also introduces additional benchmarking dimensions, such as mechanism-of-action clarity, commercial potential, and average precision, to further strengthen the platform’s utility for drug discovery decision-making.

The approach also aligns with Insilico’s broader efforts to advance reliable scientific AI, emphasizing the importance of standardized evaluation to accelerate adoption and improve outcomes across the drug discovery pipeline.

“Most failures in drug discovery begin with weak target selection,” said Alex Zhavoronkov, Ph.D., Founder and CEO of Insilico Medicine. “By combining disease-specific modeling with rigorous benchmarking, we are moving from predictive AI to actionable intelligence—enabling scientists to prioritize targets with greater confidence and speed. This framework brings us closer to more reliable, scalable, and efficient therapeutic discovery.”

Egg-scanning AI may let hatcheries sort life, death and sex before chicks emerge

Eggs and poultry provide important sources of protein globally, driving a major industry with large economic impacts. Challenges to hatchery operations include embryo mortality, fertility, sex determination, and eggshell characteristics. These features have a substantial impact on production, but they are difficult and time-consuming to estimate.

A University of Illinois Urbana-Champaign research team has conducted multiple studies using near infrared (NIR) and hyperspectral imagery (HSI) to evaluate chicken eggs, potentially leading to more efficient, safe, and humane production methods. They discuss their findings in a series of publications.

Their most recent study published in British Poultry Science uses HSI and machine learning to predict chick embryo mortality. Previous research has shown that embryo mortality rates in hatcheries can reach more than 10%, which impacts economic viability, production efficiency, and animal welfare.

“If there is a genetic disorder or other inherent issue, some eggs don’t produce healthy chicks, and the embryo dies. This poses a health hazard, as dead embryos can harbor bacteria. If we can detect and remove them early in the incubation period, we can avoid biosecurity issues,” said lead author Md. Wadud Ahmed, who was a doctoral student in the Department of Agricultural and Biological Engineering (ABE), part of the College of Agricultural, Consumer and Environmental Sciences and The Grainger College of Engineering at U. of I., when the research was conducted.

Hatcheries can test for mortality by shining a bright light through the egg, a process known as candling, but this method requires time and resources.

The researchers obtained 300 chicken eggs from the U. of I. poultry farm and placed them in a commercial incubator. They used a hyperspectral camera system to acquire images before incubation and after four days in the incubator.

After incubation, the researchers could identify spectral wavelength patterns from the images of dead and alive embryos. They used those images to train machine learning models that could classify each egg’s status early in incubation; the best-performing model reached up to 97% accuracy on day 4.
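A rough sketch of this kind of spectral-classification workflow (not the authors' actual pipeline; the "spectra," band values, and nearest-centroid classifier below are simplified stand-ins for the study's hyperspectral features and models):

```python
import random

def train_centroids(spectra, labels):
    """Nearest-centroid classifier over spectral feature vectors:
    average the spectra observed for each class (e.g. alive/dead)."""
    sums, counts = {}, {}
    for x, y in zip(spectra, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    """Assign the class whose centroid is closest (squared distance)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist(centroids[y], x))

# Synthetic 4-band "spectra": dead-embryo eggs shifted in some bands
rng = random.Random(0)
make = lambda base: [b + rng.gauss(0, 0.02) for b in base]
alive = [make([0.8, 0.6, 0.4, 0.2]) for _ in range(20)]
dead = [make([0.7, 0.65, 0.5, 0.25]) for _ in range(20)]
model = train_centroids(alive + dead, ["alive"] * 20 + ["dead"] * 20)
print(predict(model, make([0.8, 0.6, 0.4, 0.2])))  # likely "alive"
```

The same train-on-labeled-spectra, predict-on-new-eggs pattern underlies the sex-determination study described next, with the reference label changed from embryo viability to chick sex.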

In another study published in Food Control, they focused on determining the sex of the embryo. Currently, hatcheries cull male chicks after incubation, but early identification could prevent this.

“Male chicks are considered a byproduct because they don’t lay eggs and they are not economically feasible for meat production. Around 6 billion male chicks are culled annually worldwide, which raises serious animal welfare, economic, and biosecurity issues for the hatchery. If we can identify the embryos early, we can avoid the culling of males and use the eggs for table eggs or in food production,” Ahmed said.

Some European countries have banned the culling of male chicks, and the U.S. poultry industry is searching for technology solutions that would enable early identification of male embryos.

For this study, the researchers also obtained eggs from the U. of I. poultry farm and placed them in a commercial incubator, acquiring HSI images before and during incubation.

“For each egg, we have the hyperspectral images and we have the reference parameter, which is whether the egg produced a male or female chick. With this information, we can create a library of images and reference parameters, and we can use machine learning and explainable AI to process this information and train the model,” Ahmed said. “Then you can select an unknown egg; the system scans the egg, and the model reads the pattern based on previous experience to predict if the egg will yield a male or female chick.”

The researchers obtained 75% accuracy at day 0 (early incubation) in classifying male and female embryos.

In additional studies, they looked at other egg characteristics, including fertility, shell strength, shell thickness, and yolk ratio.

“Conventional testing methods are destructive; for example, to measure the shell strength, you need to break the eggs. Our primary focus is to develop non-destructive, cost-effective methods. With NIR and HSI, we do not need to destroy the eggs. We just need to scan them and the machine learning model will determine the desired parameter,” said Mohammed Kamruzzaman, assistant professor in ABE and corresponding author on the papers.

While regular cameras record light in three channels (red, green, and blue) to capture visible images, NIR captures bands beyond visible light to detect chemical composition. HSI records hundreds of bands across the light spectrum to yield molecular information.

To determine shell characteristics, the researchers used NIR spectroscopy, which is less expensive than HSI, but does not capture the complex molecular information required for sex and mortality determination.

If these techniques are to be implemented by the hatchery industry, the process needs to be automated, Kamruzzaman noted.

“We are working on developing a system with a robotic arm that can separate the eggs. For example, after the machine learning model identifies an egg as male or female, the arm can remove the male eggs,” he said.

“NIR and HSI technology have applications in agriculture, food, environment, and biomedicine. It’s new for the poultry industry, but the results we obtained in our research are very promising, so I think implementing it could be very useful for the industry’s processing or farm side.”

The researchers have published their NIR datasets on shell strength, shell thickness, and yolk ratio, making them freely available for other researchers to use. They plan to publish their HSI image data sets as well.

New technology helps flat-faced dogs breathe easy

Australian scientists have developed an injectable therapy that helps clear blocked airways in flat-faced dogs. Melbourne-based biotechnology company Snoretox and RMIT University have shown early success with Snoretox-1, the first therapy developed from a new technology platform.

The collaboration tested the therapy on bulldogs with breathing difficulties caused by a common condition in flat-faced dogs that restricts airflow, known as brachycephalic obstructive airway syndrome (BOAS).

Almost half of all Pugs, French and British Bulldogs are affected, impacting their ability to breathe, eat, exercise and sleep, according to Snoretox Managing Director and RMIT Adjunct Professor Tony Sasse.

“Decades of selective breeding for the popular flat-faced appearance have unfortunately led to serious breathing problems,” he said.

“In severe cases, the condition has been shown to shorten a dog’s life by up to four years.”

Successful early results in affected dogs

The early-stage trial involved six bulldogs with severe symptoms that struggled to complete a three-minute walk. After receiving the patented Snoretox-1 treatment, they completed the walk far more easily, with noticeably reduced breathing noise and effort.

The first published results of the study in The Veterinary Journal show how all six dogs displayed visible improvements and were able to complete a brisk walk that was previously difficult.

The main treatment options currently available are surgery to widen the nostrils and remove excess throat tissue, along with weight-management strategies, but outcomes vary.

“Research shows that up to 60% of affected dogs still experience breathing problems after surgery, and 7% do not survive the procedure,” Sasse said.

Sasse said the bulldog trial results suggested a possible combination with, or alternative to, surgery.

“We also observed improvements in dogs that had not responded well to previous surgery,” he said.

“Further research and regulatory approvals are required before the treatment can be offered more widely, but these positive results provide an early indication that we are on the right path.”

How Snoretox-1 works

Snoretox-1 is an injectable treatment that uses a modified tetanus toxin to improve the muscle tone in the floor of the dog’s mouth, helping keep the airway open.

The technology has been in development for over 15 years in collaboration with RMIT School of Science biotechnologist Professor Peter Smooker.

“In short-snouted breeds, the soft tissue in the upper airway hasn’t adapted to the shorter skull. This leaves excess tissue crowded into a smaller space, where it can obstruct airflow,” he said.

“The therapy strengthens the muscles at the front of the airway, helping support the throat and maintain airflow during breathing.”

The treatment combines a targeting component with a tiny dose of the therapeutic agent to safely increase muscle tone in the airway. The technology may also have future applications in other conditions involving weak muscle tone, although the current focus is on veterinary use.

“This product has platform potential for a range of conditions in both animals and humans,” Sasse said.

“From a regulatory approvals perspective, it makes sense to start with these animal applications, but we are keeping the bigger picture of wider applications in animals and humans in mind.”

Impact-focused research

RMIT Deputy Vice-Chancellor Research and Innovation Distinguished Professor Calum Drummond AO said the technology was an example of RMIT’s commitment to research impact.

“This project is focused on making a real difference to animals, with the potential for broader impact in the future,” he said.

Professor Russell Conduit, who is part of the team from RMIT’s School of Health and Biomedical Sciences, said the findings also point to future applications beyond veterinary care, such as obstructive sleep apnea, incontinence and pelvic floor disorders in humans.

“This is exciting evidence to support human drug trials for conditions involving poor muscle tone,” Conduit said.