Transfer learning using generative adversarial networks for tissue-specific virtual staining
Abbas, Hyder (2025-05-31)
The publication is subject to copyright regulations. The work may be read and printed for personal use. Use for commercial purposes is prohibited.
open access
The permanent address of the publication is:
https://urn.fi/URN:NBN:fi-fe2025061064513
Abstract
Recent advances in biomedical imaging have highlighted the potential of generative adversarial networks (GANs) to improve histopathological analysis by staining tissue images computationally, a process known as "virtual staining". Conventional chemical staining, although essential to diagnostics, is both time-consuming and expensive, creating a need for more efficient, high-quality alternatives. To address this need, this thesis explores transfer learning of GAN models for virtual staining of histopathological images, aiming to improve model accuracy and reduce the computational demands of training a model from scratch. Transfer learning reuses the weights of a pre-trained model to mitigate data scarcity and training time, enabling GAN models to produce high-quality stained representations from raw tissue images across diverse tissue types.
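The core idea of reusing pre-trained weights can be illustrated with a deliberately tiny NumPy sketch, not the thesis's DensePix2Pix setup: a "pre-trained" feature layer is frozen and only the output head is fine-tuned on toy target-domain data. All names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pre-trained" weights, e.g. learned on a large source domain.
pretrained = {"hidden": rng.normal(size=(8, 8)), "head": rng.normal(size=(8, 1))}

# Toy target-domain data (stand-in for a new tissue type), linear in the
# frozen features so the fine-tuned head can actually fit it.
X = rng.normal(size=(64, 8))
y = np.tanh(X @ pretrained["hidden"]) @ rng.normal(size=(8, 1))

# Transfer learning: keep the feature extractor frozen, fine-tune the head.
W_hidden = pretrained["hidden"]        # frozen layer, copied as-is
W_head = pretrained["head"].copy()     # trainable layer

H = np.tanh(X @ W_hidden)              # features from the frozen layer
for _ in range(200):                   # gradient descent on the head only
    err = H @ W_head - y
    W_head -= 0.01 * H.T @ err / len(X)

mse_before = float(np.mean((H @ pretrained["head"] - y) ** 2))
mse_after = float(np.mean((H @ W_head - y) ** 2))
```

Only the head's gradients are computed, which is why fine-tuning needs far less data and compute than training every layer from scratch.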
The primary objectives of this research are to investigate the feasibility of transfer learning for generating virtual stains across tissue types and to assess its impact on training efficiency and image quality. A DensePix2Pix GAN model serves as the foundation of the approach, modified to support transfer learning on a dataset of histopathological images from four tissue types: skin, kidney, spleen, and intestine. Model performance is evaluated using the Pearson Correlation Coefficient Ratio (PCCR), Structural Similarity Index (SSIM), Peak Signal-to-Noise Ratio (PSNR), and Mean Squared Error (MSE). The research addresses critical questions on the trade-offs between training from scratch and transfer learning, computational resource efficiency, and the generalization of transfer learning across tissue types. The findings contribute to the field of medical imaging by demonstrating that transfer learning with GANs can substantially reduce the need for large datasets and extensive computational resources, offering an accessible, scalable framework for virtual staining across multiple histopathological domains.
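The evaluation metrics named above can be sketched in plain NumPy. Two hedges: the SSIM here is a simplified single-window (global) variant, whereas the standard metric averages this quantity over local sliding windows, and PCCR is assumed for illustration to reduce to the plain Pearson correlation coefficient.

```python
import numpy as np

def pearson(real, fake):
    """Pearson correlation coefficient between two images."""
    return float(np.corrcoef(real.ravel(), fake.ravel())[0, 1])

def mse(real, fake):
    """Mean squared error between two images of identical shape."""
    return float(np.mean((real - fake) ** 2))

def psnr(real, fake, data_range=1.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    e = mse(real, fake)
    return float("inf") if e == 0 else 10.0 * np.log10(data_range ** 2 / e)

def ssim_global(real, fake, data_range=1.0):
    """Single-window (global) SSIM over the whole image."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mu_x, mu_y = real.mean(), fake.mean()
    var_x, var_y = real.var(), fake.var()
    cov = ((real - mu_x) * (fake - mu_y)).mean()
    return float(((2 * mu_x * mu_y + c1) * (2 * cov + c2))
                 / ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)))
```

For a perfect virtual stain (prediction identical to the ground truth), MSE is 0, PSNR is infinite, and Pearson correlation and SSIM are both 1; any staining error moves each metric away from its ideal value.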