Display technologies used in the project: (left) holographic display; (middle) holographic display (close-up); (right) stereo-mirror display from SenseGraphics AB.
Our vision is a new interaction paradigm that gives the user an unprecedented ability to touch and manipulate high-contrast, high-resolution, three-dimensional (3D) virtual objects suspended in space, using a glove that provides realistic whole-hand haptic feedback closely resembling interaction with real objects.
The system will consist of two main components. The first is a whole-hand haptic system comprising a glove mounted on a robot arm, which provides force feedback during manipulation of an object, and a tactile array subsystem that renders textures to the fingertips. The second is a three-dimensional display based on a holographic optical element (HOE) that permits glasses-free interaction with a virtual object: the user reaches into the object with the gloved hand. We also use the haptic glove with a more traditional 3D stereo display comprising a semi-transparent mirror and stereo glasses. Our driving application is maxillofacial surgery, giving a surgeon the ability to plan complex surgical procedures by moving bone and bone-replacement pieces into a desired position and then stretching the soft tissue to match the structural changes.
The glove prototype comprises an exoskeleton, made in plastic by rapid prototyping, and a piezoelectric motor from PiezoMotor AB that controls the grip force using a force sensor in a feedback loop. The exoskeleton combined with the piezoelectric motor creates the stiffness and counter-forces we expect when a hand interacts with a (virtual) object. The prototype provides six degrees of freedom (DOF) of hand movement and one DOF of gripping with the thumb and index finger. The six-DOF movement is accomplished with a commercial haptic arm, the SensAble Phantom Premium, while the gripping exoskeleton "glove" is developed within this project.
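The grip-force feedback loop described above can be sketched as a simple proportional controller: the force sensor reading is compared with the force commanded by the simulation, and the error drives the piezo motor. This is a minimal illustrative sketch, not the project's actual controller; the gain, limits, and interfaces are assumptions.

```python
def grip_force_step(target_force, measured_force, gain=0.5, max_step=1.0):
    """Return a motor velocity command (arbitrary units) from the
    grip-force error in newtons. Proportional control, clamped to
    an assumed motor velocity limit."""
    error = target_force - measured_force
    command = gain * error
    # Clamp to the motor's velocity limit so large errors do not
    # produce unsafe motor commands.
    return max(-max_step, min(max_step, command))
```

In a real device this step would run at the haptic update rate (typically around 1 kHz), reading the force sensor and writing the motor command each cycle.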
Video: Two usage examples of Snap-to-fit, a haptic 6-DOF alignment tool for virtual assembly. In the first example, we use Snap-to-fit as a guide to virtually find the proper position of a fractured and displaced zygomatic bone, loaded from volumetric CT data. In the second example, we virtually assemble a prehistoric spear tip, fractured into two pieces.
Feeling object elasticity is important for identifying objects in the world around us, and it is particularly important in surgery for the manipulation of soft tissue.
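A common way to render elasticity haptically is a penalty-based spring model: the reaction force grows in proportion to how far the finger has penetrated the virtual surface (Hooke's law). The sketch below is a simplified illustration of that general technique, not the project's rendering code; the stiffness value is an assumption.

```python
def elastic_force(penetration_depth, stiffness):
    """Penalty-based contact force in newtons.

    penetration_depth: how far the fingertip has pressed into the
    virtual surface, in meters (<= 0 means no contact).
    stiffness: spring constant of the material, in N/m.
    """
    if penetration_depth <= 0.0:
        return 0.0  # no contact, no force
    return stiffness * penetration_depth
```

Softer tissue is rendered simply by lowering the stiffness, so a compliant ball and rigid bone can use the same contact model with different constants.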
Video: Squeezing a virtual ball with a haptic glove
Gripping objects with two fingers allows manipulation that is not feasible, or at least very cumbersome, with a single point of contact. Using two fingers, the user may lift and manipulate the object in 3D. A physics simulation that includes weight, inertia, and gravity adds to the realism.
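The basic physics behind two-finger lifting can be illustrated with a friction check: two opposing fingertips each press with the grip force, and the friction at both contacts must support the object's weight before it can be lifted. This is a hypothetical simplification (Coulomb friction, assumed friction coefficient), not the project's simulation.

```python
def can_lift(grip_force, mass, mu=0.8, g=9.81):
    """True if a two-finger pinch can hold the object against gravity.

    grip_force: normal force at each fingertip, in newtons.
    mass: object mass in kg.
    mu: assumed Coulomb friction coefficient at the contacts.
    """
    # Two contacts, each contributing mu * normal_force of friction.
    friction_capacity = 2.0 * mu * grip_force
    weight = mass * g
    return friction_capacity >= weight
```

A simulation using this rule lets a virtual box slip out of the grasp when the user relaxes the grip, which is exactly the behavior that makes a physics-based grip feel realistic.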
Video: Lifting a virtual box with a haptic glove
Co-located haptics will gain importance when more advanced haptic interfaces, such as high-fidelity whole-hand devices, become available. We have investigated the pros and cons of physically co-located versus non-co-located haptics on our two displays, using two accuracy tasks with spatial accuracy as the dependent variable and one manipulation task with completion time as the dependent variable.
Co-location experiments: (left) accuracy task: trace spiral; (middle) dynamic task: push cube through labyrinth; (right) accuracy task: find corners of cube.
The holographic optical element (HOE) acts as a projection screen for a number of digital projectors, and each projector image can be viewed within a narrow angle representing one view of a 3D object. With multiple, properly spaced projectors, each eye receives a separate view and the user perceives a 3D model suspended in space above the holographic plate, with no obstacle between the viewer and the image, allowing the user to reach into the 3D model. Increasing the number of projectors extends the field of view and allows viewing of the object from different angles. The 3D model can be manipulated in real time using the graphics capabilities of modern computer systems.
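The requirement that each eye falls into a different projector view puts an upper bound on the angular spacing between adjacent views. A rough geometric sketch, assuming a typical interocular distance and a hypothetical viewing distance (neither figure comes from the project):

```python
import math

def max_view_spacing_deg(eye_separation=0.065, viewing_distance=0.6):
    """Largest angular spacing (degrees) between adjacent projector
    views that still places the viewer's two eyes in different views.

    eye_separation: interocular distance in meters (typical ~65 mm).
    viewing_distance: distance from the HOE to the viewer, in meters
    (an assumed value for illustration).
    """
    # Angle subtended by the two eyes as seen from the display plane.
    return math.degrees(math.atan2(eye_separation, viewing_distance))
```

With these assumed numbers the views must be spaced no more than roughly six degrees apart, which is why a wide field of view requires many projectors.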
The project steering committee, in addition to the project leader and assistant project leader, comprises Håkan Lanshammar, Head of the Department of Information Technology, Uppsala University; Lars Mattsson, Head of Industrial Metrology and Optics, KTH; Stefan Johansson, PiezoMotor AB and Department of Engineering Sciences, Uppsala University; and Jan-Michael Hirsch, Department of Surgical Sciences, Oral and Maxillofacial Surgery, Uppsala University.
Feeling is believing
Visualization and haptics for interactive medical image analysis
Orbit segmentation for cranio-maxillofacial surgery planning
The Visualization Program by the Knowledge Foundation, VINNOVA, SSF, ISA, and Vårdalsstiftelsen.