Showing posts with label auditory. Show all posts

Friday, 31 May 2013

Multimodal zooming in digital geographic information

As a basic research issue, how well can people integrate and reconcile spatial information from various modalities, and how useful is such integration?

As an applied issue, what is the potential for haptic and auditory navigation within geographic information systems? Can visual information be augmented by the presentation of information via other modalities, namely, haptics and audition, and if so, to what extent?

The research investigates a particular form of navigation within geographic information systems: zooming. It aims to develop non-visual methods of representing or augmenting a visual zoom through the auditory and haptic senses, creating a multimodal zooming mechanism.
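To make the idea concrete, a non-visual zoom cue might map the current zoom level onto an auditory pitch and a haptic vibration intensity. The following is a minimal, hypothetical sketch; the function names, value ranges, and linear mappings are illustrative assumptions, not the mechanism the research proposes.

```python
# Hypothetical sketch: give a visual zoom non-visual analogues by
# mapping zoom level to tone frequency (audition) and vibration
# intensity (haptics). Ranges and mappings are assumptions.

def zoom_to_audio_pitch(zoom_level, min_zoom=1, max_zoom=10,
                        low_hz=220.0, high_hz=880.0):
    """Linearly map a zoom level to a tone frequency in Hz."""
    t = (zoom_level - min_zoom) / (max_zoom - min_zoom)
    return low_hz + t * (high_hz - low_hz)

def zoom_to_haptic_intensity(zoom_level, min_zoom=1, max_zoom=10):
    """Map a zoom level to a vibration intensity in [0, 1]."""
    return (zoom_level - min_zoom) / (max_zoom - min_zoom)

# Zooming in raises the pitch and strengthens the vibration cue.
print(zoom_to_audio_pitch(1))   # 220.0
print(zoom_to_audio_pitch(10))  # 880.0
```

A logarithmic pitch mapping would match auditory perception more closely; the linear form is used here only to keep the sketch short.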

Sunday, 19 May 2013

Implementing Auditory Icons in Raster GIS

MacVeigh, R. and Jacobson, R.D. (2007) Implementing Auditory Icons in Raster GIS, Proceedings of the 13th International Conference on Auditory Display, Montréal, Québec, Canada, 530-535.


This paper describes a way to incorporate sound into a raster-based classified image. Methods for determining sound location, amplitude, and type, and for creating a layer to store this information, are described. Hurdles are discussed and suggestions for overcoming them are presented. As humans we rely on our senses to help us navigate the world. Sight, sound, touch, taste, and smell all help us perceive our environment. Although we sometimes take vision for granted, our other senses play an equally important role in our daily lives. Even with all these senses at our disposal, conventional GIS rarely do much more than convey their information visually. We demonstrate an auditory display with a sample implementation using a classified raster image of the kind commonly used in GIS analysis. This was achieved using a spatial sonification algorithm initially created in a Java environment. The ultimate aim of this work is to develop an interactive mapping technology that fully incorporates auditory display across a variety of platforms and applications. Such a tool would have the potential to be of great benefit for displaying multivariate information in complex information displays.
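The abstract's core ideas (a sound type per raster class, amplitude tied to location relative to a listener) can be sketched as follows. The paper's implementation was in Java; this Python version, including the class-to-sound table and the linear distance falloff, is an illustrative assumption rather than the paper's actual algorithm.

```python
import math

# Illustrative sketch of spatial sonification over a classified raster.
# Class codes, sound files, and the amplitude rule are hypothetical.

SOUND_FOR_CLASS = {
    1: "water.wav",
    2: "forest.wav",
    3: "urban.wav",
}

def amplitude(listener, cell, max_dist=10.0):
    """Amplitude falls off linearly with distance from the listener."""
    d = math.dist(listener, cell)
    return max(0.0, 1.0 - d / max_dist)

def sonify(raster, listener):
    """Yield a (sound, amplitude) cue for every classified cell."""
    for r, row in enumerate(raster):
        for c, cls in enumerate(row):
            if cls in SOUND_FOR_CLASS:
                yield SOUND_FOR_CLASS[cls], amplitude(listener, (r, c))

raster = [[1, 1, 2],
          [3, 2, 2],
          [3, 3, 1]]
for sound, amp in sonify(raster, listener=(1, 1)):
    print(f"{sound}: {amp:.2f}")
```

In a real auditory display the cues would be spatialized (e.g. panned by bearing) and mixed, rather than printed; the generator structure just makes the per-cell mapping explicit.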


Saturday, 18 May 2013

Multimodal Interfaces for Representing and Accessing Geospatial Information

Golledge, R.G., Rice, M., and Jacobson, R.D. (2006) Multimodal Interfaces for Representing and Accessing Geospatial Information. In: Rana, S. and Sharma, J. (eds.) Frontiers of Geographic Information Technology. Springer-Verlag: Berlin & New York, pp 181-208.


Multimodal interfaces have a great potential impact in our daily lives and in the education of students in all grades. In particular, they offer significant benefits for people who are disabled. The use of tactile, haptic, and auditory interfaces has the potential to make technology more universally accessible. To this extent it will mitigate the rapidly expanding digital divide between those who are able to use computers to access the Internet and web page information (i.e., those who are computer literate) and those who are not.
Information technology transformations are affecting how we communicate, how we store and access information, how we become healthier and receive medical care, how we learn at different stages of our development, how business is conducted, how work is undertaken to produce income, how things are built or designed, how data are stored and managed, and how research is conducted. With the increasing emphasis on visualization as the main interface medium for computer-based services, an ethical problem emerges: should people who are visually impaired, or who have other tactile, haptic, or auditory impairments, be increasingly disabled by the trend towards digital communication and information processing? We believe that such groups should not be shut out from the advantages offered by this technology, just as we believe that multimodal interfaces will enrich the understanding of the computer-based input and output of information that is becoming a part of our everyday lives.


Thursday, 16 May 2013

Representing Spatial Information Through Multimodal Interfaces: Overview and preliminary results in non-visual interfaces

Jacobson, R.D. (2002) Representing Spatial Information Through Multimodal Interfaces: Overview and preliminary results in non-visual interfaces.  6th International Conference on Information Visualization: Symposium on Spatial/Geographic Data Visualization, IEEE Proceedings, London, 10-12 July, 2002, 730-734.


The research discussed here is a component of a larger study to explore the accessibility and usability of spatial data presented through multiple sensory modalities, including haptic, auditory, and visual interfaces. Geographical Information Systems (GIS) and other computer-based tools for spatial display predominantly use vision to communicate information to the user, as sight is the spatial sense par excellence. Ongoing research is exploring the fundamental concepts and techniques necessary to navigate through multimodal interfaces, which are user-, task-, domain-, and interface-specific. This highlights the necessity for both a conceptual / theoretical schema and extensive usability studies. Preliminary results presented here, exploring feature recognition and shape tracing in non-visual environments, indicate that multimodal interfaces have a great deal of potential for facilitating access to spatial data for blind and visually impaired persons. The research is undertaken with the wider goals of increasing information accessibility and promoting “universal access”.


Cognitive mapping without sight: Four preliminary studies of spatial learning

Jacobson, R.D. (1998) Cognitive mapping without sight: Four preliminary studies of spatial learning. Journal of Environmental Psychology, 18, 289-305.


This paper illustrates the application of cognitive mapping to people with visual impairments and blindness. It gives perspectives on past research, outlines ongoing research, highlights some of the methodological and validity issues arising from this research, and discusses the movement of theory into practice. The findings of three small preliminary studies are reported as part of continuing research into the cognitive mapping abilities of blind or visually impaired people. These studies have highlighted the need to use multiple, mutually supportive tests to assess cognitive map knowledge. In light of these findings and the need to move theory into practice, a current research project is outlined. This project seeks to use the knowledge gained from the three studies to design and implement an auditory hypermap system to aid wayfinding and the spatial learning of an area. Finally, an agenda for applied research is presented.


Navigating maps with little or no sight: A novel audio-tactile approach

Jacobson, R.D. (1998) Navigating maps with little or no sight: A novel audio-tactile approach. Proceedings of Content Visualization and Intermedia Representations. August 15, University of Montreal, Montreal.

This paper first presents a review of the options available for conveying maps and graphics to visually impaired and blind people. A novel audio-tactile methodology is then described, and the results from its pilot study reported. Communication of spatial media, such as maps, is problematic without sight: tactile perception is serial rather than synoptic. By building a working model of the environment that uses both tactile and auditory feedback, a map is made far more accessible. Results from the pilot study demonstrated the simplicity and enjoyment of use of this novel approach, which integrates speech, verbal landmarks, earcons, and recorded environmental sound to build a small spatial hypermedia system.
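The audio-tactile idea described above can be sketched as a set of touch-sensitive map regions, each bound to a speech, earcon, or environmental-sound cue. The region coordinates and cue contents below are invented for illustration; the paper's actual hypermedia structure is richer than this flat hotspot list.

```python
# Minimal sketch: a tactile map overlaid on a touch surface, where
# touching a region triggers an audio cue. Regions and cues are
# hypothetical examples, not taken from the paper.

HOTSPOTS = [
    # (x_min, y_min, x_max, y_max, cue)
    (0, 0, 50, 50,    "speech: 'railway station'"),
    (50, 0, 100, 50,  "earcon: rising three-note motif"),
    (0, 50, 100, 100, "environmental: recorded traffic noise"),
]

def cue_at(x, y):
    """Return the audio cue for a touch at (x, y), or None."""
    for x0, y0, x1, y1, cue in HOTSPOTS:
        if x0 <= x < x1 and y0 <= y < y1:
            return cue
    return None

print(cue_at(25, 25))  # speech: 'railway station'
print(cue_at(75, 75))  # environmental: recorded traffic noise
```

Linking each cue onward to further nodes (e.g. a spoken description that itself contains touchable sub-regions) is what turns this lookup table into the spatial hypermedia the abstract describes.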