Showing posts with label tactile.

Saturday, 18 May 2013

Multimodal Interfaces for Representing and Accessing Geospatial Information

Golledge, R.G., Rice, M., and Jacobson, R.D. (2006) Multimodal Interfaces for Representing and Accessing Geospatial Information. In: Rana, S. and Sharma, J. (eds.) Frontiers of Geographic Information Technology. Springer-Verlag: Berlin & New York, pp 181-208.

 Abstract

Multimodal interfaces have a great potential impact on our daily lives and on the education of students in all grades. In particular, they offer significant benefits for people who are disabled. The use of tactile, haptic, and auditory interfaces has the potential to make technology more universally accessible and, to this extent, will mitigate the rapidly expanding digital divide between those who are able to use computers to access the Internet and web page information (i.e., those who are computer literate) and those who are not.
Information technology transformations are affecting how we communicate, how we store and access information, how we become healthier and receive more medical care, how we learn at different stages of our development, how business is conducted, how work is undertaken in order to produce income, how things are built or designed, how data is stored and managed, and how research is conducted. With the increasing emphasis on visualization as the main interface medium for computer-based services, an ethical problem emerges regarding whether or not people who are visually impaired or who have other tactile, haptic, or auditory impairments should be increasingly disabled by the trend towards digital communication and information processing. We believe that such groups should not be shut out from the advantages offered by the use of this technology, just as we believe that multimodal interfaces will enrich the understanding of the computer-based input and output of information that is becoming a part of our everyday lives.

 
[VIEW PDF]

Thursday, 16 May 2013

Navigating maps with little or no sight: A novel audio-tactile approach

Jacobson, R.D. (1998) Navigating maps with little or no sight: A novel audio-tactile approach. Proceedings of Content Visualization and Intermedia Representations. August 15, University of Montreal, Montreal.
Abstract 

This paper first presents a review of the options available for conveying maps and graphics to visually impaired and blind people. A novel audio-tactile methodology is then described, and the results of its pilot study are reported. Communication of spatial media, such as maps, is problematic without sight: tactile perception is serial rather than synoptic. By building a working model of the environment that uses both tactile and auditory feedback, a map is made far more accessible. Results from the pilot study demonstrated the simplicity and enjoyment of use of this novel approach, which integrates speech, verbal landmarks, earcons, and recorded environmental sound to build a small spatial hypermedia system.
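To illustrate the kind of lookup such an audio-tactile system performs, here is a minimal sketch, with hypothetical names and data, of one plausible structure (not the implementation described in the paper): each map region carries a spoken label, a short earcon, and an optional recorded environmental sound, and a touch or cursor position is resolved to the cues to play.

```python
# Illustrative sketch of an audio-tactile map lookup (hypothetical structure,
# not the system described in the paper). Each region has a spoken label, an
# earcon, and an optional environmental recording; a touch position is
# resolved to the audio cues that should be played.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Region:
    name: str                                   # spoken label, e.g. "park"
    bounds: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)
    earcon: str                                 # short identifying tone
    ambience: Optional[str] = None              # recorded environmental sound

    def contains(self, x: float, y: float) -> bool:
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1

def cues_at(regions: List[Region], x: float, y: float) -> List[str]:
    """Return the audio cues to play for a touch at normalised (x, y)."""
    for region in regions:
        if region.contains(x, y):
            cues = [region.earcon, f"speech:{region.name}"]
            if region.ambience:
                cues.append(region.ambience)
            return cues
    return ["speech:open space"]    # fallback when no region is hit

# Example: a two-region map queried at a single point.
campus = [
    Region("library", (0.0, 0.0, 0.4, 0.5), "earcon_library.wav"),
    Region("park", (0.4, 0.0, 1.0, 1.0), "earcon_park.wav", "birdsong.wav"),
]
print(cues_at(campus, 0.7, 0.3))
# ['earcon_park.wav', 'speech:park', 'birdsong.wav']
```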

Friday, 3 May 2013

Haptic Soundscapes

Towards making maps, diagrams and graphs accessible to visually impaired people 

The aim of this research project is to develop and evaluate haptic soundscapes. This allows people with little or no vision to interact with maps, diagrams and graphs displayed via dissemination media, such as the World Wide Web, through sound, touch and force feedback. Although of principal utility for people with severe visual impairments, it is anticipated that this interface will allow informative educational resources for children and people with learning difficulties to be developed and accessed through the Internet. The research project offers a simple, yet innovative solution to accessing spatial data without the need for vision. It builds upon previous work carried out in various departments at UCSB, and fosters inter-disciplinary links and cooperation between usually unconnected research groups. The research aims to further knowledge and understanding in this emerging field and also to offer practical results that will have an impact on people's lives. It is strongly felt that the development of the project will lead to continued external funding, and it is our hope that this project will act as a springboard to further research in which UCSB will be a key component.

Further development, usability testing, and expansion
 
The Haptic Soundscapes project has developed a set of audio-tactile mapping tools to help blind people access spatial information and to aid research in multi-modal spatial cognition. These tools offer blind people access to the geographic world they cannot otherwise fully experience, creating opportunities for orientation, navigation, and education. Spatial knowledge from maps, charts, and graphs is obtained through display and interaction with sound, touch, and force-feedback devices. Individuals can use audio-tactile mapping tools to explore an unknown environment or create an audio-tactile map from images displayed on a computer screen. These audio-tactile maps can be disseminated over the Internet, or used in educational settings. Next year, several objectives are planned for the Haptic Soundscapes project. These include cognitive experiments to assess a user's ability to navigate within a scene, between adjacent scenes, and between scenes of different scales using the audio-tactile mapping tools. We will also expand the capability of the audio-tactile mapping system to include text-to-speech synthesis and real-time multi-dimensional sound representation, sketched below. Several off-campus funding proposals will be submitted. Finally, we will showcase the tools developed in the course of this project by expanding our campus demonstrator: an interactive, navigable audio-tactile map of the UCSB campus.
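As a rough illustration of what "multi-dimensional sound representation" can mean in practice, the sketch below (a hypothetical mapping, not the project's own code) maps a cursor position on an audio-tactile display to sound parameters: horizontal position drives stereo pan, vertical position drives pitch, and an underlying data value drives loudness.

```python
# Illustrative sketch (hypothetical, not the project's implementation) of
# mapping a normalised map position and data value to sound parameters.

def position_to_sound(x: float, y: float, value: float,
                      base_freq: float = 220.0) -> dict:
    """Map a normalised (0..1) position and data value to pan, pitch, gain."""
    pan = 2.0 * x - 1.0                 # -1.0 = hard left, +1.0 = hard right
    freq = base_freq * (2.0 ** y)       # one octave from bottom to top of map
    gain = max(0.0, min(1.0, value))    # clamp the data value to a usable volume
    return {"pan": pan, "frequency_hz": freq, "gain": gain}

# Example: a point two-thirds of the way across and near the top of the display.
print(position_to_sound(x=0.66, y=0.9, value=0.5))
# e.g. {'pan': ~0.32, 'frequency_hz': ~410.5, 'gain': 0.5}
```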