Showing posts with label HCI. Show all posts

Friday, 30 September 2016

Design Considerations for Haptic and Auditory Map Interfaces

Rice, M., Jacobson, R.D., Golledge, R.G., and Jones, D. (2005) Design Considerations for Haptic and Auditory Map Interfaces. Cartography and Geographic Information Science, 32 (4), 381-391. http://dx.doi.org/10.1559/152304005775194656


Abstract

Communicating spatial information to the blind and visually impaired using maps and graphics presents many difficulties. Past research has offered advice to cartographers on topics such as tactile areal, point, and line symbolization; on perceptual problems related to dense linear features on tactile maps; and on the relationship between categorical data, measurement theory, and tactile discrimination. With this previous work as a foundation, we describe our research efforts with haptic and auditory maps - the Haptic Soundscapes Project. Haptic Soundscapes maps allow blind and visually-impaired individuals to feel map features through force feedback devices and hear auditory cues that add both redundant and complementary information. Recent experimental work by the authors has led to several recommended practices for cartographic data simplification, object size discrimination, shape identification, and general interface navigation. The authors also present haptic and auditory mapping examples to illustrate design ideas, algorithms, and technical requirements. Future prospects for automated haptic and auditory map creation are discussed and presented in the context of the past work in generating maps for the blind and visually impaired from cartographic data.

[VIEW PDF]

Thursday, 16 May 2013

Representing Spatial Information Through Multimodal Interfaces: Overview and preliminary results in non-visual interfaces

Jacobson, R.D. (2002) Representing Spatial Information Through Multimodal Interfaces: Overview and preliminary results in non-visual interfaces. 6th International Conference on Information Visualization: Symposium on Spatial/Geographic Data Visualization, IEEE Proceedings, London, 10-12 July, 2002, 730-734.

Abstract

The research discussed here is a component of a larger study to explore the accessibility and usability of spatial data presented through multiple sensory modalities, including haptic, auditory, and visual interfaces. Geographical Information Systems (GIS) and other computer-based tools for spatial display predominantly use vision to communicate information to the user, as sight is the spatial sense par excellence. Ongoing research is exploring the fundamental concepts and techniques necessary to navigate through multimodal interfaces, which are user, task, domain, and interface specific. This highlights the necessity for both a conceptual/theoretical schema and extensive usability studies. Preliminary results presented here, exploring feature recognition and shape tracing in non-visual environments, indicate that multimodal interfaces have a great deal of potential for facilitating access to spatial data for blind and visually impaired persons. The research is undertaken with the wider goals of increasing information accessibility and promoting “universal access”.

[VIEW PDF]

Friday, 3 May 2013

Haptic Soundscapes

Towards making maps, diagrams and graphs accessible to visually impaired people 

The aim of this research project is to develop and evaluate haptic soundscapes. These allow people with little or no vision to interact with maps, diagrams, and graphs displayed via dissemination media, such as the World Wide Web, through sound, touch, and force feedback. Although of principal utility for people with severe visual impairments, it is anticipated that this interface will allow informative educational resources for children and people with learning difficulties to be developed and accessed through the Internet. The research project offers a simple yet innovative solution to accessing spatial data without the need for vision. It builds upon previous work carried out in various departments at UCSB, and fosters interdisciplinary links and cooperation between usually unconnected research groups. The research hopes to further knowledge and understanding in this emerging field and also to offer practical results that will have an impact on people's lives. We strongly believe that the development of the project will lead to continued external funding, and it is our hope that this project will act as a springboard to further research in which UCSB will be a key component.

Further development, usability testing, and expansion
 
The Haptic Soundscapes project has developed a set of audio-tactile mapping tools to help blind people access spatial information and to aid research in multi-modal spatial cognition. These tools offer blind people access to the geographic world they cannot otherwise fully experience, creating opportunities for orientation, navigation, and education. Spatial knowledge from maps, charts, and graphs is obtained through display and interaction with sound, touch, and force-feedback devices. Individuals can use the audio-tactile mapping tools to explore an unknown environment or create an audio-tactile map from images displayed on a computer screen. These audio-tactile maps can be disseminated over the Internet or used in educational settings. Next year, several objectives are planned for the Haptic Soundscapes project. These include cognitive experiments to assess a user’s ability to navigate within a scene, between adjacent scenes, and between scenes of different scales using the audio-tactile mapping tools. We will also expand the capability of the audio-tactile mapping system to include text-to-speech synthesis and real-time multi-dimensional sound representation. Several off-campus funding proposals will be submitted. Finally, we will showcase the tools developed in the course of this project by expanding our campus demonstrator - an interactive, navigable audio-tactile map of the UCSB campus.
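The core interaction described above - a pointer moving over map features and triggering auditory cues - can be sketched in a few lines. The following is an illustrative Python sketch, not the project's actual implementation; the region names, coordinates, and cue parameters are invented for the example.

```python
# Illustrative sketch of an audio-tactile map lookup: the position of a
# pointer (e.g. a force-feedback stylus) is matched against named map
# regions, and each region carries a spoken label (for text-to-speech)
# and a distinguishing auditory cue pitch. All values here are hypothetical.

from dataclasses import dataclass

@dataclass
class Region:
    name: str       # spoken label, passed to a text-to-speech engine
    bounds: tuple   # (x_min, y_min, x_max, y_max) in screen coordinates
    pitch_hz: float # pitch of the non-speech auditory cue for this region

def cue_for_position(x, y, regions):
    """Return (label, pitch) for the region under the pointer, or None."""
    for r in regions:
        x0, y0, x1, y1 = r.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return (r.name, r.pitch_hz)
    return None  # silence in the gaps between features

# Hypothetical two-region campus map
campus = [
    Region("Library", (0, 0, 100, 50), 440.0),
    Region("Lagoon", (0, 60, 100, 120), 220.0),
]

print(cue_for_position(40, 30, campus))  # pointer over the Library region
print(cue_for_position(40, 55, campus))  # pointer in the gap between regions
```

A real system would run this lookup continuously as the pointer moves, sounding the cue (and optionally speaking the label) whenever the region under the pointer changes, and layering force feedback on region boundaries.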