FA4SANS-GAN: A Novel Machine Learning Generative Adversarial Network to Further Understand Ophthalmic Changes in Spaceflight Associated Neuro-Ocular Syndrome (SANS)

Sharif Amit Kamran, Khondker Fariha Hossain, Joshua Ong, Ethan Waisberg, Nasif Zaman, Salah A. Baker, Andrew G. Lee, Alireza Tavakkoli

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Purpose: To provide an automated system for synthesizing fluorescein angiography (FA) images from color fundus photographs, thereby averting the risks associated with fluorescein dye, and to extend its future application to detecting spaceflight associated neuro-ocular syndrome (SANS) during spaceflight, where resources are limited. Design: Development and validation of a novel conditional generative adversarial network (GAN) trained on a limited amount of paired FA and color fundus images from diabetic retinopathy and control cases. Participants: Paired color fundus and FA images for unique patients were collected from a publicly available study. Methods: FA4SANS-GAN was trained to generate FA images from color fundus photographs using 2 multiscale generators coupled with 2 patch-GAN discriminators. Eight hundred fifty color fundus and FA images, obtained by augmenting images from 17 unique patients, were used for training. The model was evaluated on 56 fluorescein images collected from 14 unique patients and compared with 3 other GAN architectures trained on the same data set. Furthermore, we tested the models' robustness against acquisition noise and their ability to retain structural information when artificially created biological markers were introduced. Main Outcome Measures: For GAN synthesis, Fréchet Inception Distance (FID) and Kernel Inception Distance (KID); for statistical significance, two 1-sided tests (TOST) based on Welch's t test. Results: On test FA images, the mean FID for FA4SANS-GAN was 39.8 (standard deviation, 9.9), better than GANgio's mean of 43.2 (standard deviation, 13.7), Pix2PixHD's mean of 57.3 (standard deviation, 11.5), and Pix2Pix's mean of 67.5 (standard deviation, 11.7). Similarly, for KID, FA4SANS-GAN achieved a mean of 0.00278 (standard deviation, 0.00167), better than the other 3 models' mean KIDs of 0.00303 (standard deviation, 0.00216), 0.00609 (standard deviation, 0.00238), and 0.00784 (standard deviation, 0.00218), respectively.
For the TOST measurement, FA4SANS-GAN was statistically significant versus GANgio (P = 0.006), versus Pix2PixHD (P < 0.00001), and versus Pix2Pix (P < 0.00001). Conclusions: Our study has shown FA4SANS-GAN to be statistically significant on 2 GAN synthesis metrics. Moreover, it is robust against acquisition noise and retains clear biological markers better than the other 3 GAN architectures. Deployment of this model could be crucial aboard the International Space Station for detecting SANS. Financial Disclosure(s): The authors have no proprietary or commercial interest in any materials discussed in this article.
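The TOST procedure named in the abstract combines two one-sided Welch's t tests against an equivalence margin. The sketch below illustrates the general technique only; the margin values, sample data, and function name are hypothetical and are not taken from the study:

```python
import numpy as np
from scipy import stats

def tost_welch(a, b, low, high):
    """Two one-sided tests (TOST) for equivalence using Welch's t test.

    Tests whether the mean difference (a - b) lies within [low, high].
    Returns the overall TOST p value (the larger of the two one-sided
    p values); a small value supports equivalence within the margin.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    va, vb = a.var(ddof=1), b.var(ddof=1)
    na, nb = len(a), len(b)
    diff = a.mean() - b.mean()
    se = np.sqrt(va / na + vb / nb)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1)
    )
    # One-sided test against each margin
    t_low = (diff - low) / se    # H0: diff <= low
    t_high = (diff - high) / se  # H0: diff >= high
    p_low = 1.0 - stats.t.cdf(t_low, df)
    p_high = stats.t.cdf(t_high, df)
    return max(p_low, p_high)
```

In practice the two samples would be the per-image metric scores (e.g., FID or KID values) of two competing models, and the margin would encode the smallest difference considered meaningful.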

Original language: English (US)
Article number: 100493
Journal: Ophthalmology Science
Volume: 4
Issue number: 4
DOIs
State: Published - Jul 1, 2024

Keywords

  • Astronaut
  • Generative adversarial networks
  • Ophthalmic imaging
  • Space medicine
  • Spaceflight-associated neuro-ocular syndrome

ASJC Scopus subject areas

  • Ophthalmology
