One of the biggest challenges for surgeons is the visual and subjective assessment of tissue, for example, to distinguish between malignant and healthy areas or to spare critical structures. Spectral imaging is considered a means of overcoming the limitations of visual perception. Modern hyperspectral cameras capture more information than the human eye – for example, about blood flow or the oxygen content of tissue.
Artificial intelligence (AI) is required to convert high-dimensional spectral data into clinically usable information. However, for AI algorithms to make meaningful use of this data, they need large amounts of “annotated” image data – data sets that medical professionals have enriched with additional information, such as tissue labels, to make them usable for machine learning and other data-driven analyses. Such data sets are often difficult or impossible to obtain from patients for ethical, legal, and practical reasons.
At the same time, animal experiments often yield large, standardized image data sets in which specific tissue changes have been deliberately induced and examined. This is exactly where xeno-learning comes in: the method uses animal image data to prepare the AI for typical changes such as circulatory disorders – and then transfers this knowledge to humans.
Knowledge transfer across species boundaries
“The challenge was that the tissue signatures of humans, pigs, and rats differ considerably in absolute terms,” explains principal investigator Lena Maier-Hein, department head at the DKFZ and director at the NCT Heidelberg. “But we were able to learn various pathophysiological mechanisms from the animal model and transfer them to humans through a novel training of neural networks,” says Maier-Hein, who also heads a research group at the Surgical Clinic at Heidelberg University Hospital.
In the study, the researchers analyzed over 13,000 hyperspectral images of humans, pigs, and rats – a data set that is unique to date. The results showed that conventional AI models trained on animal data fail when applied to humans. The new xeno-learning approach overcomes this hurdle: instead of learning the absolute spectral signatures, the AI learns the patterns of change associated with certain pathological conditions – and can successfully apply this knowledge to human tissue.
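The core idea described above – learning how a pathology *changes* tissue spectra in one species rather than what the spectra look like in absolute terms, and applying that change pattern to another species – can be illustrated with a toy numerical sketch. All names, array shapes, and the simple additive-change model below are illustrative assumptions for intuition only, not the method published in the paper:

```python
import numpy as np

# Toy sketch: animal and human spectra differ strongly in absolute
# terms, but a pathology (e.g. ischemia) induces a similar *relative*
# change. We estimate that change from animal data and apply it to
# human baseline spectra. Purely illustrative, not the published model.

N_BANDS = 100  # hypothetical number of hyperspectral bands per pixel

def pathology_shift(healthy, pathological):
    """Mean spectral change induced by a pathology, estimated from
    animal recordings (arrays of shape [n_pixels, N_BANDS])."""
    return pathological.mean(axis=0) - healthy.mean(axis=0)

def transfer_to_human(human_healthy, shift):
    """Synthesize 'pathological' human spectra by applying the
    animal-derived change pattern to human baseline spectra."""
    return human_healthy + shift

rng = np.random.default_rng(0)

# Simulated pig spectra: ischemia modeled as a uniform reflectance drop.
pig_healthy = rng.normal(1.0, 0.05, size=(500, N_BANDS))
pig_ischemic = pig_healthy - 0.2
shift = pathology_shift(pig_healthy, pig_ischemic)

# Human tissue sits at a different absolute level, so a model trained
# on pig spectra directly would not match – but the change transfers.
human_healthy = rng.normal(1.5, 0.05, size=(300, N_BANDS))
human_synthetic = transfer_to_human(human_healthy, shift)
```

The synthetic human spectra keep the human baseline but inherit the animal-derived change pattern; such samples could, under this simplified assumption, augment scarce human training data for pathological conditions.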
The researchers see great potential for application in surgery: “Xeno-learning enables the use of spectral imaging even where human data is lacking,” says Jan Sellner, one of the two lead authors of the study. “This is an important step toward making surgical procedures safer and more precise in the future,” explains Alexander Studier-Fischer from the Department of Urology at Mannheim University Hospital, who led the clinical aspects of the project.
To ensure that their new approach can be introduced into the operating room as soon as possible, the DKFZ researchers have made the program code and pre-trained models available to other scientists.
The work was largely funded by the European Research Council (ERC Consolidator Grant NEURAL SPICING) and the Helmholtz graduate school HIDSS4Health.
Jan Sellner*, Alexander Studier-Fischer*, Ahmad Bin Qasim, Silvia Seidlitz, Nicholas Schreck, Minu Tizabi, Manuel Wiesenfarth, Annette Kopp-Schneider, Janne Heinecke, Jule Brandt, Samuel Knödler, Caelan Max Haney, Gabriel Salg, Berkin Özdemir, Maximilian Dietrich, Maurice Stephan Michel, Felix Nickel, Karl-Friedrich Kowalewski, Lena Maier-Hein: Xeno-learning: knowledge transfer across species in deep learning-based spectral image analysis.
*shared first authorship
Nature Biomedical Engineering 2026, DOI: 10.1038/s41551-025-01585-4