Collision detection for haptic interactions in virtual surgery
Yogendra Bhasin, The University of Texas at Arlington, United States
The lack of depth perception and the constrained movement of the instruments used in laparoscopic surgery limit a surgeon's operational effectiveness. For surgical simulation, developing deformable organ models and collision detection algorithms that combine high computational speed with accuracy is crucial for faithful haptic rendering and visual realism. This work addresses the problem with a hybrid approach to collision detection between the instruments and deformable convex or concave organs. The proposed method uses an OpenGL-based graphics-hardware test and exploits geometric coherence between successive time steps of the simulation to provide accurate and efficient collision points at haptic rates. Multi-threading and fast search techniques reduce the computational time. The prototype implementation has been tested extensively and achieves smooth, realistic haptic interaction, allowing users to accurately feel the force arising from multiple collision points. This is expected to yield the precise haptic perception and visual realism that surgical simulators require.
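The abstract's mention of exploiting geometric coherence between successive time steps can be illustrated with a minimal sketch. This is not the thesis's actual implementation; it shows the general idea that, at haptic update rates (~1 kHz), a probe moves very little per frame, so a proximity query can restart from the previous frame's answer and typically terminate in a few steps. The function name and the 2-D convex-polygon setting are illustrative assumptions.

```python
# Illustrative sketch (assumed, not from the thesis): temporal coherence in a
# closest-vertex query. Instead of scanning every vertex each frame, we
# hill-climb from the previous frame's closest vertex along the vertex ring.
import math

def closest_vertex_coherent(vertices, point, start_idx=0):
    """Find the vertex of a convex polygon closest to `point`.

    vertices : list of (x, y) tuples in ring order (convex polygon)
    point    : (x, y) probe position
    start_idx: cached result from the previous frame (the coherence hint)
    """
    def dist2(i):
        dx = vertices[i][0] - point[0]
        dy = vertices[i][1] - point[1]
        return dx * dx + dy * dy

    n = len(vertices)
    i = start_idx
    while True:
        left, right = (i - 1) % n, (i + 1) % n
        best = min((i, left, right), key=dist2)
        if best == i:
            # On a convex vertex ring the distance sequence is unimodal,
            # so a local minimum is also the global minimum.
            return i
        i = best

# A regular octagon; the probe moves only slightly between "frames",
# so the warm-started query converges almost immediately.
octagon = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
           for k in range(8)]
idx = closest_vertex_coherent(octagon, (1.1, 0.05))        # cold start
idx = closest_vertex_coherent(octagon, (1.05, 0.3), idx)   # warm start
```

With a warm start the loop usually exits after one or two neighbor checks, which is what makes coherence-based queries attractive inside a 1 kHz haptic loop; a cold scan would cost O(n) per frame on every deformable mesh.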
Bhasin, Y. Collision detection for haptic interactions in virtual surgery. Master's thesis, The University of Texas at Arlington. Retrieved March 19, 2019 from https://www.learntechlib.org/p/120594/.
Citation reproduced with permission of ProQuest LLC.