Device Technologies and Biomedical Robotics
Revolutionizing Robot-Assisted Femur Fracture Surgery Training: A Virtual Reality Haptic Simulator
Hoang Huy Nguyen
Student
Rowan University
Camden, New Jersey, United States
Jacob Logar
Undergraduate Student
Rowan University, United States
Fayez Alruwaili
PhD Student
Rowan University, New Jersey, United States
Mohammad Abedin-Nasab
Professor
Rowan University, United States
Surgical standards and training are traditionally achieved through numerous hours of practice and failure on imitation models, under the supervision of a limited number of highly specialized surgeons [1]. This makes it time-consuming and costly for future specialized surgeons to attain the experience needed for operative proficiency. Virtual reality (VR) simulators allow residents and skilled surgeons alike to learn new, complex surgical procedures when it best fits their schedules, and grant the opportunity to perfect their skills through failure at low risk [2]. Through this more flexible training regimen, improvements such as shortened surgical times, greater tool manipulability, and greater accuracy have been translated into the operating room [3]. Today's long-bone femur fracture surgery carries many complications, including high malalignment rates, high exposure to X-ray radiation, high fracture reduction forces, and extended operative times [4]–[8]. In response to these complications, our group has presented a surgical system, called Robossis, that aids in eliminating them [4]–[8]. As such, surgical accuracy, efficacy, and confidence can be increased using the novel robotic system; however, these gains are limited by the training and familiarity of the surgeon(s) and operating staff. For a novel robotic surgery in particular, a VR simulator system is a necessity to overcome the challenges of training time and cost for the surgeon(s) and the operating staff [2]. In this work, we propose to develop a VR environment for robot-assisted femur fracture surgery using the novel Robossis system.
The architecture of the designed VR environment includes the haptic Sigma-7 controller, a rendering and velocity-restriction pipeline, the control algorithms, and the Gazebo simulator, which houses the VR environment and the Robossis surgical robot. The haptic Sigma-7 serves as the interface between the user and the VR environment: it couples the motion of the user's hand to the Robossis surgical robot and returns haptic feedback. Specifically, we apply a force restriction that keeps the motion of the user's hand within the joint limits of the Robossis surgical robot. We also restrict the speed of the Robossis surgical robot by dynamically scaling its motion against a capped, user-defined maximum desired speed. The kinematics of the Robossis surgical robot are interfaced with the Gazebo simulator to drive the robot within the VR environment, and the Gazebo environment reproduces the surgical operating room based on the layout of a previous cadaver experiment with the system. Position control drives the motion of the Robossis surgical robot's active joints (Θi) and linear actuators (di). These algorithms are implemented as custom model plugins for the Gazebo simulator that interface the haptic controller with the motion control of the Robossis surgical robot, as sketched in the examples below.
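The velocity restriction can be illustrated with a short standalone example. Since the implementation is not published here, the following C++ fragment is a minimal sketch under our own assumptions: a simple 3-D position command, a fixed control timestep, and the illustrative function name capCommandSpeed.

```cpp
#include <array>
#include <cmath>

// Minimal 3-D position type for this sketch (not from the Robossis code base).
using Vec3 = std::array<double, 3>;

// Dynamically scale a commanded target so the implied Cartesian speed never
// exceeds maxSpeed (m/s). If the raw command would move faster than the cap,
// the step is shrunk toward the current position; otherwise it passes through.
Vec3 capCommandSpeed(const Vec3 &current, const Vec3 &target,
                     double maxSpeed, double dt)
{
    const Vec3 step{target[0] - current[0],
                    target[1] - current[1],
                    target[2] - current[2]};
    const double dist = std::sqrt(step[0] * step[0] +
                                  step[1] * step[1] +
                                  step[2] * step[2]);
    const double maxStep = maxSpeed * dt;   // farthest allowed travel this tick
    if (dist <= maxStep)
        return target;                      // already within the speed cap
    const double s = maxStep / dist;        // uniform scale factor < 1
    return Vec3{current[0] + s * step[0],
                current[1] + s * step[1],
                current[2] + s * step[2]};
}
```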
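The force restriction on the user's hand can similarly be pictured as a virtual spring on each joint limit. This is an assumption about the form of the haptic rendering rather than the published Robossis implementation, and the mapping of the resulting joint-space forces back onto the Sigma-7 end-effector (e.g., through the robot Jacobian) is omitted.

```cpp
// One actuated joint of the virtual robot: its mapped command and travel
// limits (illustrative fields; the real Robossis joint limits differ).
struct JointState {
    double value;  // commanded joint value (m for d_i, rad for Theta_i)
    double min;    // lower joint limit
    double max;    // upper joint limit
};

// Haptic rendering of a joint limit as a virtual spring: when the mapped
// command exceeds a limit, return a restoring force proportional to the
// penetration depth, so the user feels a push back into the valid range.
double jointLimitForce(const JointState &j, double stiffness)
{
    if (j.value < j.min)
        return stiffness * (j.min - j.value);  // push back up into range
    if (j.value > j.max)
        return stiffness * (j.max - j.value);  // push back down into range
    return 0.0;                                // inside limits: no force
}
```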
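Finally, the custom model plugin and per-joint position control could be wired up along the following lines. The ModelPlugin, JointController::SetPositionPID, and SetPositionTarget APIs are standard in Gazebo classic; the plugin name RobossisPlugin, the PID gains, and the placeholder joint target are illustrative assumptions, not the actual Robossis plugin.

```cpp
#include <functional>
#include <gazebo/common/common.hh>
#include <gazebo/gazebo.hh>
#include <gazebo/physics/physics.hh>

namespace gazebo {

// Custom model plugin that drives the virtual robot's joints with PID
// position controllers, in the spirit of the plugins described above.
class RobossisPlugin : public ModelPlugin {
 public:
  void Load(physics::ModelPtr model, sdf::ElementPtr /*sdf*/) override {
    model_ = model;
    // Attach a PID position controller to every joint (placeholder gains,
    // not the tuned Robossis values).
    for (const auto &joint : model_->GetJoints()) {
      model_->GetJointController()->SetPositionPID(
          joint->GetScopedName(), common::PID(100.0, 0.1, 10.0));
    }
    // Run OnUpdate once per physics step.
    updateConn_ = event::Events::ConnectWorldUpdateBegin(
        std::bind(&RobossisPlugin::OnUpdate, this));
  }

 private:
  void OnUpdate() {
    // Placeholder target: the real plugin would map the Sigma-7 pose through
    // the Robossis inverse kinematics to per-joint targets Theta_i and d_i.
    model_->GetJointController()->SetPositionTarget(
        model_->GetJoints().front()->GetScopedName(), 0.1);
  }

  physics::ModelPtr model_;
  event::ConnectionPtr updateConn_;
};

GZ_REGISTER_MODEL_PLUGIN(RobossisPlugin)

}  // namespace gazebo
```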
The VR environment was designed to mimic a realistic operating room, complete with the Robossis surgical robot, a surgical bed, a patient in the supine position with an exposed fracture, and the haptic Sigma-7 controller (Figure 1). The haptic Sigma-7 served as the interface between the user and the VR environment, enabling users to manipulate the Robossis surgical robot and experience haptic feedback. Simulation testing was conducted to determine the deviation of the Robossis surgical robot from the motion of the user's hand through the haptic controller. The results showed that the user's hand motion was accurately translated into the robot's movement, with maximum deviations of ~0.01 m for the linear actuator lengths and ~0.6 deg for the active joint rotations over the entire simulation (Figure 2). Despite some peaks in the deviation, the PID controller consistently drove the error back to a minimum. In addition, the haptic rendering algorithm restricted the user's hand motion to the joint-space limits of the Robossis robot, preventing potential joint damage during real-life movement. The VR simulation environment provided a spatial and visual feel of the operating room, helping surgeons translate their techniques to real-world settings. Realistic integration of the environment and system models helped surgeons become more comfortable and gain confidence in the robot-environment interaction, and the kinematic matching between the haptic controller and the virtual Robossis system enabled real-time evaluation during surgical training. Future improvements to the VR simulation include connecting it to a VR headset for a 3-dimensional worldview and providing live bone references during virtual training. As the project progresses, feedback from surgeons will be sought to continually improve and refine the VR training model for the benefit of future Robossis-implemented procedures.
[1] M. P. Rogers, A. J. DeSantis, H. Janjua, T. M. Barry, and P. C. Kuo, “The future surgical training paradigm: Virtual reality and machine learning in surgical education,” Surgery, vol. 169, no. 5, pp. 1250–1252, 2021.
[2] A. J. Lungu, W. Swinkels, L. Claesen, P. Tu, J. Egger, and X. Chen, “A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery,” Expert Review of Medical Devices, vol. 18, no. 1, pp. 47–62, 2021.
[3] M. P. Fried et al., “From virtual reality to the operating room: the endoscopic sinus surgery simulator experiment,” Otolaryngology—Head and Neck Surgery, vol. 142, no. 2, pp. 202–207, 2010.
[4] M. S. Saeedi-Hosseiny, F. Alruwaili, S. McMillan, I. Iordachita, and M. H. Abedin-Nasab, “A Surgical Robotic System for Long-Bone Fracture Alignment: Prototyping and Cadaver Study,” IEEE Trans. Med. Robot. Bionics, pp. 1–1, 2021, doi: 10.1109/TMRB.2021.3129277.
[5] M. S. Saeedi-Hosseiny, F. Alruwaili, A. S. Patel, S. McMillan, I. I. Iordachita, and M. H. Abedin-Nasab, “Spatial Detection of the Shafts of Fractured Femur for Image-Guided Robotic Surgery,” in Proc. IEEE Engineering in Medicine & Biology Society (EMBC), 2021.
[6] F. Alruwaili, M. S. Saeedi-Hosseiny, M. Clancy, S. McMillan, I. I. Iordachita, and M. H. Abedin-Nasab, “Experimental Evaluation of a 3-Armed 6-DOF Parallel Robot for Femur Fracture Surgery,” J. Med. Robot. Res., vol. 07, no. 04, p. 2241009, Feb. 2022, doi: 10.1142/s2424905x22410094.
[7] F. Alruwaili, M. S. Saeedi-Hosseiny, L. Guzman, S. McMillan, I. I. Iordachita, and M. H. Abedin-Nasab, “A 3-Armed 6-DOF Parallel Robot for Femur Fracture Reduction: Trajectory and Force Testing,” in 2022 International Symposium on Medical Robotics (ISMR), IEEE, 2022, pp. 1–6.
[8] M. H. Abedinnasab, F. Farahmand, B. Tarvirdizadeh, H. Zohoor, and J. Gallardo-Alvarado, “Kinematic effects of number of legs in 6-DOF UPS parallel mechanisms,” Robotica, vol. 35, no. 12, pp. 2257–2277, 2017.