7 March 2022 Improving the histological realism of ultraviolet photoacoustic remote sensing microscopy images using a deep learning-based generative adversarial network
Abstract
Following resection of cancerous tissue, specimens are excised from the surgical margins and examined post-operatively for the presence of residual cancer cells. Hematoxylin and eosin (H&E) staining is the gold standard of histopathological assessment. Ultraviolet photoacoustic remote sensing (UV-PARS) microscopy, combined with scattering microscopy, provides virtual contrast for nuclei and cytoplasm similar to that of H&E staining. A generative adversarial network (GAN) deep learning approach, specifically a CycleGAN, was used to perform style transfer and improve the histological realism of UV-PARS-generated images. Post-CycleGAN images are easier for a pathologist to examine and can be input into existing machine learning pipelines built for H&E-stained images.
Conference Presentation
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Matthew T. Martell, Ewan A. McAlister, Nathaniel J. M. Haven, Sveta Silverman, Lashan Peiris, Jean Deschenes, Xingyu Li, and Roger J. Zemp "Improving the histological realism of ultraviolet photoacoustic remote sensing microscopy images using a deep learning-based generative adversarial network", Proc. SPIE PC11960, Photons Plus Ultrasound: Imaging and Sensing 2022, PC1196013 (7 March 2022); https://doi.org/10.1117/12.2610461
KEYWORDS: Microscopy, Photoacoustic spectroscopy, Remote sensing, Ultraviolet radiation, Cancer, Oncology, Scattering
