TY - JOUR
T1 - RFormer
T2 - Transformer-Based Generative Adversarial Network for Real Fundus Image Restoration on a New Clinical Benchmark
AU - Deng, Zhuo
AU - Cai, Yuanhao
AU - Chen, Lu
AU - Gong, Zheng
AU - Bao, Qiqi
AU - Yao, Xue
AU - Fang, Dong
AU - Yang, Wenming
AU - Zhang, Shaochong
AU - Ma, Lan
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022/9/1
Y1 - 2022/9/1
N2 - Ophthalmologists use fundus images to screen for and diagnose eye diseases. However, differences in equipment and among ophthalmologists introduce large variations in fundus image quality. Low-quality (LQ) degraded fundus images easily lead to uncertainty in clinical screening and generally increase the risk of misdiagnosis. Thus, real fundus image restoration is worth studying. Unfortunately, no real clinical benchmark has been established for this task so far. In this paper, we investigate the real clinical fundus image restoration problem. First, we establish a clinical dataset, Real Fundus (RF), comprising 120 low- and high-quality (HQ) image pairs. We then propose a novel Transformer-based Generative Adversarial Network (RFormer) to restore the real degradation of clinical fundus images. The key component of our network is the Window-based Self-Attention Block (WSAB), which captures non-local self-similarity and long-range dependencies. To produce more visually pleasing results, a Transformer-based discriminator is introduced. Extensive experiments on our clinical benchmark show that the proposed RFormer significantly outperforms state-of-the-art (SOTA) methods. In addition, experiments on downstream tasks such as vessel segmentation and optic disc/cup detection demonstrate that RFormer benefits clinical fundus image analysis and applications.
AB - Ophthalmologists use fundus images to screen for and diagnose eye diseases. However, differences in equipment and among ophthalmologists introduce large variations in fundus image quality. Low-quality (LQ) degraded fundus images easily lead to uncertainty in clinical screening and generally increase the risk of misdiagnosis. Thus, real fundus image restoration is worth studying. Unfortunately, no real clinical benchmark has been established for this task so far. In this paper, we investigate the real clinical fundus image restoration problem. First, we establish a clinical dataset, Real Fundus (RF), comprising 120 low- and high-quality (HQ) image pairs. We then propose a novel Transformer-based Generative Adversarial Network (RFormer) to restore the real degradation of clinical fundus images. The key component of our network is the Window-based Self-Attention Block (WSAB), which captures non-local self-similarity and long-range dependencies. To produce more visually pleasing results, a Transformer-based discriminator is introduced. Extensive experiments on our clinical benchmark show that the proposed RFormer significantly outperforms state-of-the-art (SOTA) methods. In addition, experiments on downstream tasks such as vessel segmentation and optic disc/cup detection demonstrate that RFormer benefits clinical fundus image analysis and applications.
KW - Real fundus image restoration
KW - Generative adversarial network
KW - Self-attention
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=85133784910&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85133784910&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2022.3187103
DO - 10.1109/JBHI.2022.3187103
M3 - Article
C2 - 35767498
AN - SCOPUS:85133784910
SN - 2168-2194
VL - 26
SP - 4645
EP - 4655
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 9
ER -