Biomedical Imaging and Instrumentation
Sohail Hassan (he/him/his)
Undergraduate Research Assistant
University of Texas at Dallas
Little Elm, Texas, United States
Xinyuan Zhang
Research Assistant
University of Texas at Dallas, United States
Jie Yuan
Research Associate
University of Texas at Dallas, United States
Yichen Ding
Assistant Professor
University of Texas at Dallas, United States
Multiplexed imaging has emerged as a prominent technology for unravelling intricate structures across scales, from the molecular to the organ level. While conventional data interpretation on 2D displays has proven effective for visualizing and analyzing 3D and 4D data, the substantial growth in data volume and complexity demands a more efficient approach to assessing volumetric, multiscale data derived from molecular and optical imaging techniques. We propose virtual reality (VR) for the study of such images because its immersive stereoscopic vision enables in-depth analysis and visualization of the data as if it were viewed in real life [1]. In addition, VR allows us to manipulate 3D and dynamic models with user-defined controls. With our platform, we seek to demonstrate the feasibility of VR for answering fundamental questions about tissue morphology and function in model organisms.
To enable immersive visualization and interactive analysis of imaging data, we have developed a framework that converts raw image stacks into 3D surface meshes (see Figure 1A). These meshes can be imported into VR and manipulated with our custom functions. In the first step of the pipeline, raw data are labeled via image segmentation, the pixelwise classification of selected regions of interest. This allows us to extract the relevant structures from images, such as the lungs from a chest CT or cardiomyocytes from a zebrafish heart. Dynamic 4D datasets, which require tracking regions across consecutive time frames, undergo additional post-processing. Once segmentation and processing are complete, the labeled data are converted into a surface mesh. We have built a VR environment in Unity, a 3D development engine, with coded functions that support both qualitative and quantitative analysis of the 3D surface models. These functions are designed to expand current capabilities for image data analysis, allowing greater flexibility in navigation, region selection, and measurement.
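As a concrete illustration of the segmentation-to-mesh step, the Python sketch below thresholds a grayscale image stack and extracts a surface with marching cubes. It is a minimal stand-in for our pipeline, not our exact implementation: the Otsu threshold substitutes for our segmentation, and the file names and OBJ export are illustrative assumptions.

```python
import numpy as np
import tifffile
from skimage import filters, measure

def stack_to_mesh(tiff_path, obj_path):
    # Load the raw image stack as a 3D volume (z, y, x).
    volume = tifffile.imread(tiff_path).astype(np.float32)

    # Pixelwise classification: Otsu thresholding stands in here
    # for the segmentation step that labels regions of interest.
    mask = volume > filters.threshold_otsu(volume)

    # Keep the largest connected component to suppress small artifacts.
    labels = measure.label(mask)
    counts = np.bincount(labels.ravel())
    counts[0] = 0  # ignore background
    mask = labels == counts.argmax()

    # Convert the labeled volume into a triangular surface mesh.
    verts, faces, _, _ = measure.marching_cubes(mask.astype(np.uint8), level=0.5)

    # Write a Wavefront OBJ file that Unity can import directly.
    with open(obj_path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for face in faces:
            f.write(f"f {face[0] + 1} {face[1] + 1} {face[2] + 1}\n")

stack_to_mesh("chest_ct.tif", "lungs.obj")  # hypothetical file names
```

In practice, each labeled structure (e.g., an individual cardiomyocyte) would be meshed separately so that it remains independently selectable in the VR environment.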
Overall, we have developed a platform that renders 3D/4D images into an exploratory, interactive environment. The models generated by our workflow support user-defined operations in an immersive setting. By releasing our VR program as open source, we hope to provide an easy-to-use tool that advances biological investigation and fundamental research.
Our framework offers additional perspectives for visualizing medical data, permitting physicians and students to carry out key functions such as adjusting the transparency of a CT lung model to focus on the pulmonary vessels, or measuring a pulmonary vessel branch to quantify the extent of the branching vasculature (see Figure 1B) [2]. In addition, users can apply our slicing operation to cut meshes into sub-components, for example bisecting a CT cardiac model into superior and inferior halves to investigate the inner ventricular and atrial structures. This level of visual and analytical capability addresses current limitations in medical image analysis: rather than asking physicians to mentally “fuse” image slices into a single representation, we provide immersive 3D vision and functions that cannot be performed on conventional monitors [3].
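The geometry underlying a plane-based bisection can be sketched as follows. This is a simplified illustration, not our Unity implementation: triangles straddling the cutting plane are dropped rather than re-triangulated, and the cut surface is left uncapped.

```python
import numpy as np

def bisect_mesh(verts, faces, plane_origin, plane_normal):
    """Split a triangle mesh into components above/below a cutting plane.

    verts: (V, 3) float array of vertex positions.
    faces: (F, 3) int array of vertex indices per triangle.
    Returns the face sets on each side of the plane.
    """
    n = np.asarray(plane_normal, dtype=float)
    # Signed distance of each vertex from the plane.
    side = (np.asarray(verts, dtype=float) - plane_origin) @ n
    above = side[faces].min(axis=1) >= 0  # all three vertices above
    below = side[faces].max(axis=1) <= 0  # all three vertices below
    return faces[above], faces[below]

# e.g., bisect a cardiac mesh into superior and inferior halves with a
# plane through the model centroid, normal along the long axis:
# sup_faces, inf_faces = bisect_mesh(verts, faces,
#                                    verts.mean(axis=0), [0, 0, 1])
```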
For 4D data, our platform allows qualitative assessment of regional cardiac contraction through dynamic display of individual cardiomyocytes in the zebrafish heart (see Figure 1C) [4]. With our manipulation tools, we can select one or two cells at a time and analyze them by displaying their trajectories and measuring their volume, surface area, velocity, and the relative distance between them. Experiments with our data in VR showed that the average velocity of atrial cells was higher than that of ventricular cells, and that the relative distance between cells varied with their location in the heart. Enabling researchers to explore their data at this level of detail increases the potential for translatable research. Our results demonstrate the feasibility of VR for gaining insight into cardiac dynamics, with promise for addressing problems such as the prevalence of heart disease. By supporting better data interpretation and analysis, our approach is also applicable to challenges in medical education, surgical planning, and biological experiments.
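A minimal sketch of the per-cell kinematic measurements, assuming tracked cell centroids have already been extracted from the 4D segmentation; the array shapes, units, and function names are illustrative assumptions.

```python
import numpy as np

def mean_speed(centroids, dt):
    """Average speed of one cell from its trajectory of 3D centroids.

    centroids: (T, 3) array of positions over T frames (e.g., microns).
    dt: time between frames (e.g., seconds).
    """
    steps = np.diff(np.asarray(centroids, dtype=float), axis=0)
    speeds = np.linalg.norm(steps, axis=1) / dt  # instantaneous speed
    return speeds.mean()

def relative_distance(c1, c2):
    """Frame-by-frame distance between two tracked cells."""
    return np.linalg.norm(np.asarray(c1) - np.asarray(c2), axis=1)

# e.g., compare an atrial and a ventricular cell over one cardiac cycle:
# v_atrial = mean_speed(atrial_centroids, dt=0.01)
# v_ventr  = mean_speed(ventricular_centroids, dt=0.01)
# d = relative_distance(atrial_centroids, ventricular_centroids)
```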
[1] Yuan, J., Hassan, S. S., Wu, J., Koger, C. R., Packard, R. R. S., Shi, F., Fei, B., & Ding, Y. (2023). Extended reality for biomedicine. Nature Reviews Methods Primers, 3, 15. https://doi.org/10.1038/s43586-023-00208-z
[2] Koger, C. R., Hassan, S. S., Yuan, J., & Ding, Y. (2022). Virtual reality for interactive medical analysis. Frontiers in Virtual Reality, 3, 782854. https://doi.org/10.3389/frvir.2022.782854
[3] Krupinski, E. A. (2010). Current perspectives in medical image perception. Attention, Perception & Psychophysics, 72(5), 1205–1217. https://doi.org/10.3758/APP.72.5.1205
[4] Zhang, X., Almasian, M., Hassan, S. S., Jotheesh, R., Kadam, V. A., Polk, A. R., Saberigarakani, A., Rahat, A., Yuan, J., Lee, J., Carroll, K., & Ding, Y. (2023). 4D light-sheet imaging and interactive analysis of cardiac contractility in zebrafish larvae. APL Bioengineering, 7(2), 026112. https://doi.org/10.1063/5.0153214