Most electronic devices, from desktop computers and mobile phones to PDAs and in-vehicle "infotainment" systems, can be navigated using a menu structure. In cases where the user cannot look at or cannot see the interface, an auditory menu can provide access to the device's functions. Basic auditory menus involve text-to-speech (TTS) synthesis of the menu items. Our research is taking this type of interface to the next generation of interaction. Advanced auditory menus use a range of novel techniques, including spearcons, spindexes, and auditory scrollbars, to provide fast, efficient, error-free, and enjoyable interaction with an auditory or multimodal menu system.
Program of Research
We are studying advances to the TTS itself, as well as added (non-speech) auditory cues before, during, or after the TTS phrases. We evaluate the effectiveness, efficiency, learnability, and subjective impressions of all these enhancements, with a range of enhancement designs, in a range of use situations, on a range of devices, and with a range of user classes.
A spearcon is a brief non-speech sound that is created by speeding up the TTS phrase in particular ways, even to the point where the spearcon is no longer recognizable as a particular word. Often a spearcon is prepended to each TTS menu item, leading to faster, more accurate, and more enjoyable navigation through a menu. Spearcons are most effective in short and medium-length menus, but also improve navigation in long menus.
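The compression step can be illustrated with a minimal sketch. This is not the lab's actual spearcon-generation procedure (which uses pitch-preserving, often nonlinear time compression of TTS audio); it is a naive linear resampling of a raw sample list, shown only to make the "sped-up phrase" idea concrete. The `compress` function and the synthetic tone standing in for a TTS phrase are illustrative assumptions.

```python
import math

def compress(samples, factor):
    """Naively time-compress audio by reading samples at `factor` x speed.
    Note: this sample-skipping approach also raises pitch; real spearcon
    generation compresses time while preserving intelligibility cues."""
    n = int(len(samples) / factor)
    return [samples[min(int(i * factor), len(samples) - 1)] for i in range(n)]

# Synthetic stand-in for a one-second TTS phrase: a 440 Hz tone at 8 kHz.
rate = 8000
phrase = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
spearcon = compress(phrase, 2.5)
print(len(phrase), len(spearcon))  # the compressed clip is 1/2.5 the length
```

In use, such a clip would be prepended to (or substituted for) the full TTS rendering of each menu item.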
A spindex is a set of brief speech sounds that are used to speed up navigation through a long menu, such as a mobile phone contact list, or the song list on an MP3 player or iPod. They are analogous to the notches or tabs that facilitate flipping to the right part of a large reference book such as a dictionary.
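The core idea can be sketched as a lookup from each menu item to its initial-sound cue. This sketch uses the letter itself as a stand-in for the recorded "A", "B", "C" speech clips, and the single catch-all bucket for non-letter items is an illustrative assumption, not the published design.

```python
def spindex_cue(item):
    """Return the spindex cue for a menu item: a brief speech sound for
    its initial character (the letter here stands in for a short
    recorded clip of that letter's name)."""
    first = item.strip()[0].upper()
    return first if first.isalpha() else "#"  # one bucket for non-letters

# During fast scrolling, only the short cue for each passing item plays;
# the full TTS phrase plays once the user slows down or stops.
contacts = ["Adams", "Baker", "baker street", "Chen", "42nd St Deli"]
print([spindex_cue(c) for c in contacts])  # ['A', 'B', 'B', 'C', '#']
```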
An auditory scrollbar is a set of sounds that help the user know how long an auditory menu is, and also which item in that list is currently selected. This can improve navigation, and also help listeners develop a mental model of the menu structure.
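One simple way to convey position, sketched below, is to map the selected item's index to the pitch of a brief tone, so the listener hears roughly where they are in the list. The specific frequency range and the logarithmic mapping are illustrative assumptions, not the design evaluated in the papers.

```python
def scrollbar_pitch(index, length, low=220.0, high=880.0):
    """Map a menu position to a cue-tone frequency (low = top of the
    list, high = bottom).  Exponential spacing gives equal *perceived*
    pitch steps, since pitch perception is roughly logarithmic."""
    if length <= 1:
        return low
    frac = index / (length - 1)          # 0.0 at the top, 1.0 at the bottom
    return low * (high / low) ** frac    # exponential interpolation

# A 10-item menu spans two octaves; adjacent items differ by a fixed ratio.
print([round(scrollbar_pitch(i, 10)) for i in range(10)])
```

Playing such a tone alongside each item's TTS also conveys menu length indirectly: shorter menus take larger pitch steps per item.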
In addition to these enhancement techniques, we are developing and evaluating many others.
Publications Relating to the Research
(See the Publications page for all Sonification Lab publications.)
Walker, B. N., & Kogan, A. (2009). Spearcons enhance performance and preference for auditory menus on a mobile phone. Invited paper in Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction (UAHCI) at HCI International 2009, San Diego, CA, USA, (19-24 July). pp. TBD. <PDF>
Jeon, M., & Walker, B. N. (2009). "Spindex": Accelerated Initial Speech Sounds Improve Navigation Performance in Auditory Menus. Proceedings of the Annual Meeting of the Human Factors and Ergonomics Society (HFES2009), San Antonio, TX (19-23 October). pp. TBD. <PDF>
Palladino, D., & Walker, B. N. (2008). Efficiency of spearcon-enhanced navigation of one-dimensional electronic menus. Proceedings of the International Conference on Auditory Display (ICAD 2008), Paris, France (24-27 June). <PDF>
Palladino, D., & Walker, B. N. (2008). Navigation efficiency of two dimensional auditory menus using spearcon enhancements. Proceedings of the Annual Meeting of the Human Factors and Ergonomics Society (HFES2008), New York, NY (22-26 September). pp. 1262-1266. <PDF>
Yalla, P., & Walker, B. N. (2008). Advanced auditory menus: Design and evaluation of auditory scroll bars. Proceedings of the Tenth International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS2008), Halifax, Canada (13-15 October, 2008). pp. 105-112. DOI: 10.1145/1414471.1414492 <PDF>
Palladino, D., & Walker, B. N. (2007). Learning rates for auditory menus enhanced with spearcons versus earcons. Proceedings of the International Conference on Auditory Display (ICAD 2007), Montreal, Canada (26-29 June). pp. 274-279. <PDF>
Yalla, P., & Walker, B. N. (2007). Advanced Auditory Menus. Georgia Institute of Technology GVU Center Technical Report # GIT-GVU-07-12. October. <PDF>
Walker, B. N., Nance, A., & Lindsay, J. (2006). Spearcons: Speech-based Earcons Improve Navigation Performance in Auditory Menus. Proceedings of the International Conference on Auditory Display (ICAD 2006), London, England (20-24 June). pp. 63-68. <PDF>
This research has been supported, in part, by grants from the US Department of Education through the National Institute on Disability and Rehabilitation Research (NIDRR) via the Wireless RERC, and the National Science Foundation (NSF). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the funding agencies or sponsors.