TY - JOUR
T1 - A modular and scalable computational framework for interactive immersion into imaging data with a holographic augmented reality interface
AU - Velazco-Garcia, Jose D.
AU - Shah, Dipan J.
AU - Leiss, Ernst L.
AU - Tsekos, Nikolaos V.
N1 - Funding Information:
This work was supported by the National Science Foundation awards CNS-1646566, DGE-1746046, and DGE-1433817. All opinions, findings, conclusions or recommendations expressed in this work are those of the authors and do not necessarily reflect the views of our sponsors.
Publisher Copyright:
© 2020
Copyright:
Copyright 2020 Elsevier B.V. All rights reserved.
PY - 2021/1
Y1 - 2021/1
N2 - Background and objective: Modern imaging scanners produce an ever-growing body of 3D/4D multimodal data requiring image analytics and visualization of fused images, segmentations, and information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has shown potential. This work describes a framework (FI3D) for interactive immersion into data, integration of image processing and analytics, and rendering and fusion with an AR interface. Methods: FI3D was designed and endowed with modules that communicate with peripherals, including imaging scanners and HMDs, and that provide computational power for data acquisition and processing. The core of FI3D is deployed on a dedicated computational unit that performs the computationally demanding processes in real time, while the HMD serves as a display output peripheral and, through gestures and voice commands, as an input peripheral. FI3D offers dedicated modules for user-defined processing and analysis, which users can customize and optimize for a particular workflow while incorporating current or future libraries. Results: The FI3D framework was used to develop a workflow for processing, rendering, and visualization of CINE MRI cardiac sets. In this version, the data were loaded from a remote database, and the endocardium and epicardium of the left ventricle (LV) were segmented with a machine learning model and transmitted to a HoloLens HMD for 4D visualization. Performance results show that the system maintains an image stream of one image per second at a resolution of 512 × 512. It can also modify visual properties of the holograms at 1 update per 16 milliseconds (62.5 Hz) while providing enough resources for the segmentation and surface reconstruction tasks without hindering the HMD.
Conclusions: We provide a system design and framework to serve as a foundation for medical applications that benefit from AR visualization, removing several technical challenges from the development pipeline.
KW - Augmented reality
KW - Computational framework
KW - Human-cyber interface
KW - Interactive immersion
KW - Medical imaging
KW - Rendering and visualization
UR - http://www.scopus.com/inward/record.url?scp=85092219783&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85092219783&partnerID=8YFLogxK
U2 - 10.1016/j.cmpb.2020.105779
DO - 10.1016/j.cmpb.2020.105779
M3 - Article
C2 - 33045556
AN - SCOPUS:85092219783
VL - 198
JO - Computer Methods and Programs in Biomedicine
JF - Computer Methods and Programs in Biomedicine
SN - 0169-2607
M1 - 105779
ER -