Showing posts with label sound. Show all posts

Friday, 17 May 2013

Haptic Soundscapes: Developing novel multi-sensory tools to promote access to geographic information

Jacobson, R.D. (2004) Haptic Soundscapes: Developing novel multi-sensory tools to promote access to geographic information. In: Janelle, D., Warf, B., and Hansen, K. (eds.) WorldMinds: Geographical Perspectives on 100 Problems. Kluwer: Dordrecht, pp. 99-103.


This essay explores the critical need for developing new tools to promote access to geographic information, which throughout history has conventionally been represented by maps. This problem is especially acute for vision-impaired individuals. The need for new tools to access map-like information is driven by the changing nature of maps, from static paper-based products to digital representations that are interactive, dynamic, and distributed across the Internet. This revolution in the content, display, and availability of geographic representations generates both a significant problem and an opportunity. The problem is that for people without sight there is a wealth of information that is inaccessible due to the visual nature of computer displays. At the same time, the digital nature of geographic information provides an opportunity to make that information accessible to non-visual users by presenting it through different sensory modalities in computer interfaces, such as speech, touch, sound, and haptics (computer-generated devices that allow users to interact with and feel information).


Thursday, 16 May 2013

Exploratory user study of haptic and auditory display for multimodal information systems

Jeong, W. and Jacobson, R.D. (2002) Exploratory user study of haptic and auditory display for multimodal information systems. In: McLaughlin, M.L., Hespanha, J.P., and Sukhatme, G.S. (eds.) Touch in Virtual Environments: Haptics and the Design of Interactive Systems. IMSC Series in Multimedia, Prentice Hall: New York, pp. 194-204.


Since the inception of virtual reality (VR) environments, interaction has been predominantly visual, especially in conveying spatial information. However, in many situations vision is not sufficient or not available. For example, over-reliance on visual display denies visually impaired people access to the information. Even for the general population, a visual display is not optimal for conveying information in low-light or no-light conditions. Recently, a number of researchers have tried to add other modalities, such as sound or haptics, to overcome the limitations of visual display.


Friday, 3 May 2013

Haptic Soundscapes

Towards making maps, diagrams and graphs accessible to visually impaired people 

The aim of this research project is to develop and evaluate haptic soundscapes. These allow people with little or no vision to interact with maps, diagrams, and graphs displayed via dissemination media, such as the World Wide Web, through sound, touch, and force feedback. Although of principal utility for people with severe visual impairments, it is anticipated that this interface will allow informative educational resources to be developed for, and accessed over the Internet by, children and people with learning difficulties. The research project offers a simple yet innovative solution to accessing spatial data without the need for vision. It builds upon previous work carried out in various departments at UCSB, and fosters interdisciplinary links and cooperation between usually unconnected research groups. The research aims to further knowledge and understanding in this emerging field and to offer practical results that will have an impact on people's lives. It is strongly felt that the development of the project will lead to continued external funding, and it is our hope that this project will act as a springboard to further research in which UCSB will be a key component.

Further development, usability testing, and expansion
The Haptic Soundscapes project has developed a set of audio-tactile mapping tools to help blind people access spatial information and to aid research in multi-modal spatial cognition. These tools offer blind people access to a geographic world they cannot otherwise fully experience, creating opportunities for orientation, navigation, and education. Spatial knowledge from maps, charts, and graphs is obtained through display and interaction with sound, touch, and force-feedback devices. Individuals can use the audio-tactile mapping tools to explore an unknown environment or to create an audio-tactile map from images displayed on a computer screen. These audio-tactile maps can be disseminated over the Internet or used in educational settings. Next year, several objectives are planned for the Haptic Soundscapes project. These include cognitive experiments to assess a user's ability to navigate within a scene, between adjacent scenes, and between scenes of different scales using the audio-tactile mapping tools. We will also expand the capability of the audio-tactile mapping system to include text-to-speech synthesis and real-time multi-dimensional sound representation. Several off-campus funding proposals will be submitted. Finally, we will showcase the tools developed in the course of this project by expanding our campus demonstrator - an interactive, navigable audio-tactile map of the UCSB campus.
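To give a flavour of how an audio-tactile map can render spatial information in sound, the sketch below shows one minimal sonification scheme in Python. This is an illustrative assumption, not the project's actual implementation: the map legend, grid, and the choice of mapping horizontal position to stereo pan and map-cell category to pitch are all hypothetical.

```python
# Hypothetical legend: each terrain category gets a distinct tone pitch (Hz),
# so a listener exploring the map can distinguish features by ear.
LEGEND_FREQ = {"water": 220.0, "park": 330.0, "road": 440.0, "building": 550.0}

# A tiny example map: rows of terrain-category labels.
MAP_GRID = [
    ["water", "water", "park",     "road"],
    ["water", "park",  "park",     "road"],
    ["park",  "road",  "building", "building"],
]

def sonify(col, row, grid=MAP_GRID):
    """Return (frequency_hz, pan) for a cursor at (col, row).

    pan runs from -1.0 (hard left) to +1.0 (hard right), proportional to
    the horizontal position on the map, so sound placement in the stereo
    field mirrors spatial position on the map.
    """
    category = grid[row][col]
    n_cols = len(grid[row])
    pan = -1.0 + 2.0 * col / (n_cols - 1)  # leftmost -> -1.0, rightmost -> +1.0
    return LEGEND_FREQ[category], pan
```

For example, a cursor at the top-left water cell, `sonify(0, 0)`, yields a low 220 Hz tone panned hard left, while the top-right road cell, `sonify(3, 0)`, yields 440 Hz panned hard right. A real system would feed these parameters to an audio engine and combine them with touch and force-feedback cues.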