Ultrahaptics – it’s magic in the air
Press release issued: 7 October 2013
A team from the University of Bristol’s Interaction and Graphics (BIG) research group have developed a solution that not only allows people to feel what is on an interactive surface, but also receive invisible information before they touch it.
Multi-touch surfaces offer easy interaction in public spaces: people can simply walk up and use them. However, people cannot feel what they have touched. A team from the University of Bristol’s Interaction and Graphics (BIG) research group have developed a solution that not only allows people to feel what is on the screen, but also to receive invisible information before they touch it.
The research paper, to be presented at the ACM Symposium on User Interface Software and Technology (UIST) 2013 by Tom Carter from the Department of Computer Science, will unveil Ultrahaptics, a system designed to provide multipoint, mid-air haptic feedback above a touch surface.
Ultrahaptics uses the principle of acoustic radiation force, in which a phased array of ultrasonic transducers exerts forces on a target in mid-air. Haptic sensations are projected through a screen and directly onto the user’s hands.
The use of ultrasonic vibrations is a new technique for delivering tactile sensations to the user. A series of ultrasonic transducers emit very high-frequency sound waves. When all of the sound waves arrive at the same location at the same time, they interfere constructively and create a sensation that can be felt on the skin.
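The focusing described above works by delaying each transducer's signal so that every wave arrives at the chosen point in phase. The following is a minimal illustrative sketch of that phase calculation, not the authors' implementation; the function and variable names are hypothetical, and the 40 kHz operating frequency is taken from the paper.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
FREQUENCY = 40_000.0     # 40 kHz, the frequency used by Ultrahaptics
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducers, focal_point):
    """Return a phase offset (radians) for each transducer so that
    all emitted waves arrive at focal_point in phase, producing a
    focal point of high acoustic pressure in mid-air."""
    fx, fy, fz = focal_point
    phases = []
    for (tx, ty, tz) in transducers:
        # Path length from this transducer to the focal point
        d = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # Compensate for the travel delay: phase = -2*pi*d / wavelength
        phases.append((-2 * math.pi * d / WAVELENGTH) % (2 * math.pi))
    return phases

# Example: a 2x2 cm symmetric array focusing 20 cm above its centre.
# By symmetry every path length is equal, so all phases come out identical.
array = [(-0.01, -0.01, 0.0), (0.01, -0.01, 0.0),
         (-0.01, 0.01, 0.0), (0.01, 0.01, 0.0)]
print(focus_phases(array, (0.0, 0.0, 0.2)))
```

Steering the focal point, or creating several simultaneous points, amounts to recomputing these phase offsets per target; the real system must also solve for amplitudes and handle multiple foci at once.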
By carrying out technical evaluations, the team have shown that the system is capable of creating individual points of feedback well above the perception threshold of the human hand. The researchers have also established the necessary properties of a display surface that is transparent to 40 kHz ultrasound.
The results from two user studies have demonstrated that feedback points with different tactile properties can be distinguished at smaller separations than points with identical properties. The researchers also found that, with training, users are able to identify different tactile properties.
Finally, the research team explored three new areas of interaction possibilities that Ultrahaptics can provide: mid-air gestures, tactile information layers and visually restricted displays, and created an application for each.
Tom Carter, PhD student in the Department of Computer Science’s BIG research group, said: “Current systems with integrated interactive surfaces allow users to walk up and use them with bare hands. Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.
“To achieve this, we have designed a system with an ultrasound transducer array positioned beneath an acoustically transparent display. This arrangement allows the projection of focused ultrasound through the interactive surface and directly onto the users’ bare hands. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localised feedback associated with their actions.”
A video demonstrating Ultrahaptics is available on YouTube.
Paper: Ultrahaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces, Thomas Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, Sriram Subramanian, UIST 2013, 8-11 October, St Andrews, UK.
This work is supported by the European Research Council (Starting Grant Agreement 278576) under the Seventh Framework Programme and by EPSRC (EP/J004448/1) through its responsive mode funding scheme.
About the Bristol Interaction and Graphics group
The Bristol Interaction and Graphics (BIG) group, based in the University of Bristol’s Department of Computer Science, is united by a common interest in creative interdisciplinarity. BIG acts as a hub for collaboration between social scientists, artists, scientists and engineers to combine efficient and aesthetic design. The group is particularly interested in areas which couple the design of devices with deployment and evaluation in public settings. Members of the group have expertise in research areas spanning human-computer interaction, visual and tactile perception, imaging, visualisation and computer-supported collaboration.
About the ACM Symposium on User Interface Software and Technology (UIST)
The 26th ACM UIST symposium will take place from Tuesday 8 to Friday 11 October 2013 at St Andrews, UK.
The ACM Symposium on User Interface Software and Technology (UIST) is the premier forum for innovations in human-computer interfaces. Sponsored by ACM special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together people from diverse areas including graphical and web user interfaces, tangible and ubiquitous computing, virtual and augmented reality, multimedia, new input and output devices, and CSCW.