Gesture-Based Interaction for Volume Visualization
Traditional tactile interaction devices such as trackers or the mouse can cause problems when used in the operating room. Trackers can be expensive, difficult to set up and calibrate, and prone to errors due to line-of-sight requirements or electromagnetic interference. The mouse is ubiquitous, but as a 2D interface it can be cumbersome for the 3D interaction tasks required in volume visualization. A problem common to all of these interfaces is that they require sterilization, a process that is time consuming, costly, and can potentially cause complications.
The use of gestures can replace these interaction methods and eliminate these problems. Furthermore, intuitive methods like these can reduce the training time needed to perform the interaction tasks important for medical volume visualization. We are developing such methods for common tasks including data exploration, editing, and rotation. Our user studies showed that, compared to the mouse, novice users with a gesture-based interface can perform tasks such as matching rotations and locating internal structures in a volumetric dataset more effectively and more quickly.
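As an illustration of one way gesture-driven rotation can be realized (not necessarily the method used in the interface described above), successive tracked hand positions can drive a virtual trackball: each 2D position, normalized to roughly [-1, 1], is projected onto a sphere, and the rotation carrying one projected point to the next is applied to the volume. A minimal sketch, assuming hypothetical normalized hand positions from a tracker:

```python
import numpy as np

def sphere_point(x, y, r=1.0):
    """Project a normalized 2D point onto a virtual trackball sphere."""
    d2 = x * x + y * y
    if d2 <= r * r / 2.0:
        z = np.sqrt(r * r - d2)          # point lies on the sphere
    else:
        z = (r * r / 2.0) / np.sqrt(d2)  # outside: use hyperbolic sheet
    return np.array([x, y, z])

def trackball_rotation(p0, p1):
    """Quaternion (w, x, y, z) rotating trackball point p0 to p1."""
    a = sphere_point(*p0)
    b = sphere_point(*p1)
    axis = np.cross(a, b)
    norm = np.linalg.norm(axis)
    if norm < 1e-9:                      # no movement: identity rotation
        return np.array([1.0, 0.0, 0.0, 0.0])
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
    axis = axis / norm
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
```

In use, each new hand sample would be paired with the previous one (e.g. `trackball_rotation((0.0, 0.0), (0.3, 0.0))`) and the resulting quaternion composed onto the volume's current orientation, giving continuous rotation that follows the hand.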