
Tuesday, 21 May 2013

Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology

Rice, M.T., Aburizaiza, A.O., Jacobson, R.D., Shore, B.M., and Paez, F.I. (2012). Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology. Transactions in GIS 16(2): 177-190. http://dx.doi.org/10.1111/j.1467-9671.2012.01318.x
Abstract
Disabled people, especially the blind and vision-impaired, are challenged by many transitory hazards in urban environments such as construction barricades, temporary fencing across walkways, and obstacles along curbs. These hazards present a problem for navigation, because they typically appear in an unplanned manner and are seldom included in databases used for accessibility mapping. Tactile maps are a traditional tool used by blind and vision-impaired people for navigation through urban environments, but such maps are not automatically updated with transitory hazards. As an alternative approach to static content on tactile maps, we use volunteered geographic information (VGI) and an Open Source system to provide updates of local infrastructure. These VGI updates, contributed via voice, text message, and e-mail, use geographic descriptions containing place names to describe changes to the local environment. After they have been contributed and stored in a database, we georeference VGI updates with a detailed gazetteer of local place names including buildings, administrative offices, landmarks, roadways, and dormitories. We publish maps and alerts showing transitory hazards, including location-based alerts delivered to mobile devices. Our system is built with several technologies including PHP, JavaScript, AJAX, the Google Maps API, PostgreSQL, an Open Source database, and PostGIS, its spatial extension. This article provides insight into the integration of user-contributed geospatial information into a comprehensive system for use by the blind and vision-impaired, focusing on currently developed methods for geoparsing and georeferencing using a gazetteer.
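
The geoparsing and georeferencing step described above is easy to illustrate in miniature. The sketch below is not the authors' code (their system is built with PHP and a PostGIS-backed gazetteer); it is a hedged Python stand-in in which every place name and coordinate is invented, showing the basic pattern of matching place names in a contributed hazard report against a gazetteer and attaching the matched coordinates:

    # Hypothetical gazetteer entries: place name -> (lat, lon).
    GAZETTEER = {
        "innovation hall": (38.8304, -77.3078),
        "johnson center": (38.8299, -77.3058),
        "patriot circle": (38.8286, -77.3044),
    }

    def geoparse(report_text):
        """Return (place_name, (lat, lon)) pairs mentioned in a report."""
        text = report_text.lower()
        return [(name, coords) for name, coords in GAZETTEER.items()
                if name in text]

    report = "Temporary fencing across the walkway near the Johnson Center"
    for name, (lat, lon) in geoparse(report):
        print(f"Hazard georeferenced to '{name}' at ({lat}, {lon})")

In the published system this lookup runs against a gazetteer table in PostgreSQL/PostGIS rather than an in-memory dictionary, so matched reports can feed directly into the maps and location-based alerts.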

SeaTouch: A Haptic and Auditory Maritime Environment for Non-Visual Cognitive Mapping of Blind Sailors

Simonnet, M., Jacobson, R.D., Vieilledent, S. and Tisseau, J. (2009) SeaTouch: A Haptic and Auditory Maritime Environment for Non-Visual Cognitive Mapping of Blind Sailors. In K. Stewart Hornsby et al. (Eds.): COSIT 2009, LNCS 5756, pp. 212–226. Springer-Verlag, Berlin, Heidelberg.

Abstract

Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have afforded researchers in the spatial community tools to investigate the learning of space, but the issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating spatial knowledge transfer to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks we measure systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric “haptic” view in the virtual environment to improve their performance in the real environment.
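
The egocentric/allocentric distinction the abstract turns on can be made concrete in a few lines. The following Python sketch is not taken from SeaTouch; it simply shows, with invented values, how the same landmark reads as an absolute (allocentric) position or as the distance-and-relative-bearing pair (egocentric) a sailor would perceive given their own position and heading:

    import math

    def to_egocentric(own_x, own_y, heading_deg, mark_x, mark_y):
        """Convert an allocentric landmark position (x east, y north)
        to egocentric (distance, relative bearing) for a given pose."""
        dx, dy = mark_x - own_x, mark_y - own_y
        distance = math.hypot(dx, dy)
        absolute_bearing = math.degrees(math.atan2(dx, dy))  # 0 = north
        relative_bearing = (absolute_bearing - heading_deg + 360) % 360
        return distance, relative_bearing

    # A buoy 100 m east and 100 m north of a sailor heading due east (090):
    dist, bearing = to_egocentric(0, 0, 90, 100, 100)
    print(f"{dist:.0f} m at {bearing:.0f} deg relative")  # 141 m at 315 deg

Coordinating the two frames means keeping conversions like this consistent as the navigator moves, which is the kind of skill the triangulation tasks probe.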


Wednesday, 15 May 2013

Talking tactile maps and environmental audio beacons: An orientation and mobility development tool for visually impaired people

Jacobson, R.D. (1996) Talking tactile maps and environmental audio beacons: An orientation and mobility development tool for visually impaired people, Proceedings of the ICA Commission on Maps and Graphics for Blind and Visually Impaired People, 21-25 October 1996, Ljubljana, Slovenia.

Abstract

Pedestrian navigation through the built environment is a fundamental human activity, at scales ranging from the micro (a room of a house) to the macro (a cityscape). To navigate effectively across this range of environments, visually impaired people need to develop orientation and mobility skills. Auditory beacons, accessed both in a model (as a talking tactile map) and in the environment (as beacons that transmit audio messages to a small receiver carried by the pedestrian), integrate the model representation with the environment and act as a mobility and orientation development tool. This technical approach is assessed using a multi-task analysis of the cognitive maps of people using the system while learning a new route. Although the analysis was not conclusive, those who used the system expressed great interest, suggesting that the maps and audio complemented and enhanced each other. The study suggests that access to audio beacons in both environment and model leads to increased spatial comprehension and confidence about the route, and shows the need for a mixture of quantitative and qualitative approaches when assessing cognitive mapping ability.
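
The beacon mechanism lends itself to a compact sketch. The Python below is purely illustrative (the original work used physical transmitters and a carried receiver, not software): a hypothetical receiver polls the pedestrian's position and plays a beacon's message once inside a trigger radius, with the beacon list, radius, and speak() stub all invented for the example:

    import math

    BEACONS = [
        # (x, y) in metres in a local map frame, plus the spoken message.
        {"pos": (0, 0), "msg": "Main entrance. Reception ahead on the left."},
        {"pos": (40, 15), "msg": "Stairwell. Handrail on the right."},
    ]
    TRIGGER_RADIUS = 5.0  # metres

    def speak(message):  # stand-in for audio playback hardware
        print("AUDIO:", message)

    def update(pedestrian_pos):
        """Play every beacon message within the trigger radius."""
        px, py = pedestrian_pos
        for beacon in BEACONS:
            bx, by = beacon["pos"]
            if math.hypot(px - bx, py - by) <= TRIGGER_RADIUS:
                speak(beacon["msg"])

    update((2.0, 1.0))  # near the entrance beacon

The same messages, keyed to the same locations on the talking tactile map, are what tie the model representation and the environment together.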

[View PDF]

Navigation for the visually impaired: Going beyond tactile cartography

Jacobson, R.D. (1994) Navigation for the visually impaired: Going beyond tactile cartography, Swansea Geographer, 31, 53-59.

Abstract

Wayfinding for the visually handicapped is made more complex by the loss of the visual sense. In spite of this they can hold spatial concepts and are often competent navigators. Tactile maps, those sensed by touch, have been shown to improve their spatial awareness and mobility. It is, however, the development of a personal guidance system (PGS), relying on recently developed technologies, that may herald a breakthrough in navigation for the blind and visually impaired. It would enable the visually handicapped to move more freely and independently through their environment, providing on-line interaction with representations of their environment, in audio or tactile form, and supplying the orientation, location and guidance information that enables them to plan, monitor and execute navigation decisions.
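
The sort of on-line guidance such a PGS would deliver can also be sketched. Assuming nothing more than a GPS fix and the next waypoint (the coordinates below are invented), the standard haversine and forward-azimuth formulas give the distance and compass bearing that a spoken or tactile instruction would carry:

    import math

    def guidance(lat, lon, wp_lat, wp_lon):
        """Distance (m) and bearing (deg) from a GPS fix to a waypoint,
        phrased as a guidance instruction."""
        R = 6371000.0  # mean Earth radius in metres
        phi1, phi2 = math.radians(lat), math.radians(wp_lat)
        dphi = math.radians(wp_lat - lat)
        dlam = math.radians(wp_lon - lon)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        distance = 2 * R * math.asin(math.sqrt(a))  # haversine distance
        y = math.sin(dlam) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
        bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
        return f"Next waypoint: {distance:.0f} metres, bearing {bearing:.0f} degrees."

    print(guidance(51.6214, -3.9436, 51.6200, -3.9400))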

[View PDF]