Dr Nick Bryan-Kinns, Reader in Interaction Design in the School of Electronic Engineering and Computer Science, tells us about his latest research into how sighted and visually impaired people use touchscreen devices that they can’t see. The paper describing this work won the Best Short Paper prize at the Human Computer Interaction Conference 2014.
Touchscreens are increasingly ubiquitous in our lives, not just in the smartphones and tablets that we use almost unthinkingly, but also when we pay for our shopping at a self-service checkout, when we buy a train ticket, and in any number of other situations.
For many people, and in many situations, touchscreens are the easiest way to interact with a device. But what about when we can’t easily look at the device, or when the user is visually impaired? We are interested in exploring the ways in which auditory and tactile cues can help to solve these problems, and how they affect the user experience.
The team at QMUL has previously developed non-visual access to diagrams, including tube maps, circuit diagrams and mind maps, allowing sighted and visually impaired colleagues to work together more effectively. Our current research is looking at developing audio production software that is accessible to visually impaired musicians – potentially opening up whole new careers to people previously denied the opportunity by the limits of technology.
In a recent preliminary study we wanted to find out how sighted users’ experiences of navigating menus on a touchscreen that they couldn’t see were affected by audio-only feedback compared with combined audio and tactile feedback. Using a Samsung Galaxy Note device, participants in our study were asked to navigate a series of menus using touch-based gestures and tapping, much as they would when using a touchscreen device in the real world. In the audio-only test the device read out the menu labels to the user; in the audio-tactile test there was an additional vibration whenever the user touched a menu item.
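To give a flavour of how these two feedback conditions might be wired up, here is a minimal sketch using the standard Android TextToSpeech and Vibrator APIs. The class, its names and the vibration duration are purely illustrative assumptions on our part, not the code used in the study.

```kotlin
import android.content.Context
import android.os.Vibrator
import android.speech.tts.TextToSpeech

// Illustrative only: speaks a menu label when it is touched and, in the
// audio-tactile condition, adds a short vibration pulse on top of the speech.
class MenuFeedback(context: Context, private val tactileEnabled: Boolean) {

    private val tts = TextToSpeech(context) { /* initialisation status ignored in this sketch */ }
    private val vibrator =
        context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    // Call this whenever the user's finger lands on a menu item.
    fun onMenuItemTouched(label: String) {
        // Audio feedback: read the menu label aloud.
        tts.speak(label, TextToSpeech.QUEUE_FLUSH, null, "menu-label")

        // Tactile feedback: a brief vibration accompanying the speech.
        if (tactileEnabled) {
            vibrator.vibrate(50L) // 50 ms pulse; the duration is an arbitrary choice
        }
    }
}
```

In this sketch the same object handles both conditions, with a single flag switching the vibration on or off, so the audio behaviour is identical across the two tests.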
Interestingly, and in contrast to similar tests conducted elsewhere, the addition of tactile feedback seems to have slowed users down in our test. We’re not yet sure why this is, and the slower completion times weren’t reflected in the participants’ own assessments: they rated the effort required equally for both tests. We hope that further, wider studies might shed some light on this. This type of subjective ‘cognitive load’ measurement is often overlooked in other tests, and the contrast in usability between different methods would be an interesting topic for future study, particularly when users have other things, often other screens, competing for their attention.
This was a limited preliminary study, but it threw up some thought-provoking questions about when different types of feedback might be most useful to users. How can we improve the user experience of sighted people by making use of their other senses? How can we best configure touchscreens in public spaces so that visually impaired users can use them unhindered? And what applications can we find for touchscreens that allow greater collaboration between sighted and visually impaired colleagues?
Dr Nick Bryan-Kinns is Reader in Interaction Design, based in the School of Electronic Engineering and Computer Science at Queen Mary University of London.