
Friday, 31 May 2013

Transcending the Digital Divide

The purpose of this research is to develop, evaluate, and disseminate a non-visual interface for accessing digital information. The aim is to investigate the perceptual and cognitive problems that blind people face when trying to interpret information provided in a multimodal manner. The project also plans to provide touch-sensitive and sound-based network interface and navigation devices that incorporate cognitive wayfinding heuristics. Haptic (force feedback) interfaces will be provided for exploring web pages that consist of map, graphic, iconic, or image products, along with sound identifiers for on-screen windowed, map, and image information. These tasks will contribute to transcending the Digital Divide that increasingly separates blind or vision-impaired people from the growing information-based workplace.

Recent research at UCSB has begun to explore how individuals identify features presented through sound and touch. Other research (e.g. O'Modhrain and Gillespie, 1998; McKinley and Scott, 1998) has used haptics to explore screen objects such as windows, pull-down menus, buttons, and sliders, but map, graphic, and other cartographic representations have not been explored. In particular, the potential of auditory maps of on-screen phenomena (e.g. as would be important in GIS applications) has barely been examined, and few examples exist of combining audio and touch principles to build an interface. While imaginative efforts to build non-visual interfaces have been proceeding, there is as yet little empirical evidence that people without sight can use them effectively (i.e. develop a true representation of the experienced phenomena).

Experiments will be undertaken to test the ability of vision-impaired and sighted people from different age groups to use these new interfaces and features, such as: (i) a haptic mouse or a touch window tied to auditory communication displays; (ii) digitized real sounds to indicate environmental features at their mapped locations; and (iii) "sound painting" of maps, images, or charts to indicate gradients of phenomena such as temperature, precipitation, pressure, population density, and altitude.

Tests will be developed to evaluate: (i) the minimum resolvable area for the haptic interpretation of scenes; (ii) the development of skills for shape tracing in the sound or force-feedback haptic domain; (iii) the possibility of using continuous or discrete sound symbols associated with touch-sensitive pads to learn hierarchically nested screen information (e.g. locations of cities within regions within states within nations); (iv) how dynamic activities such as scrolling, zooming, and searching can be conducted in the haptic or auditory domain; (v) people's comprehension and ability to explore, comprehend, and make inferences about various non-visual interpretations of complex visual displays (e.g. maps and diagrams); and (vi) the effectiveness of using a haptic mouse with a 2" square motion domain to search a 14" screen (i.e. scale effects).
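As a concrete illustration of two of these ideas, here is a minimal Python sketch of "sound painting" and of the pad-to-screen scale effect. The function names, value ranges, and linear mappings are assumptions made for this sketch, not the project's actual design.

def value_to_frequency(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Map the data value under the cursor (e.g. temperature, altitude)
    linearly onto a tone pitch between f_low and f_high, in Hz."""
    t = (value - vmin) / (vmax - vmin)
    return f_low + t * (f_high - f_low)

def pad_to_screen(x_pad, y_pad, pad_inches=2.0, screen_inches=14.0):
    """Scale a position on a 2" haptic pad onto a 14" screen: the 7x
    gain underlying the scale effect mentioned in test (vi)."""
    scale = screen_inches / pad_inches
    return x_pad * scale, y_pad * scale

# Example: the cursor sits at the centre of the pad, over a 25-degree cell
# in a temperature map whose values span 0 to 40 degrees.
x, y = pad_to_screen(1.0, 1.0)
freq = value_to_frequency(25.0, 0.0, 40.0)
print(f"screen position ({x:.1f}, {y:.1f}) inches, tone at {freq:.0f} Hz")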

Tuesday, 21 May 2013

Can Virtual Reality Provide Digital Maps To Blind Sailors? A Case Study

Jacobson, R.D., Simonnet, M., Vieilledent, S. and Tisseau, J. (2009) Can Virtual Reality Provide Digital Maps To Blind Sailors? A Case Study. Proceedings of the International Cartographic Congress, 15-21 November 2009, Santiago, Chile. 10pp.

Abstract
This paper presents "SeaTouch", a virtual haptic and auditory interface to digital maritime charts that helps blind sailors prepare for ocean voyages and, ultimately, navigate autonomously while at sea. It has been shown that blind people mainly encode space relative to their body, but mastering space consists of coordinating body-centered and environmental reference points. Tactile maps are powerful tools to help them encode spatial information; however, only digital charts can be updated during an ocean voyage, and very often the only alternative is conventional printed media. Virtual reality can present information using auditory and haptic interfaces, and previous work has shown that virtual navigation facilitates the acquisition of spatial knowledge. In the construction of spatial representations from individuals' physical contact with their environment, the use of Euclidean geometry seems to facilitate mental processing about space. Navigation, however, takes great advantage of matching ego- and allo-centered spatial frames of reference to move about and locate oneself in the surroundings. Blindness does not imply a lack of comprehension of spatial concepts, but it leads people to encounter difficulties in perceiving and updating information about the environment. Without access to the distant landmarks that are available to people with sight, blind people tend to encode spatial relations in an ego-centered spatial frame of reference. By contrast, tactile maps and appropriate exploration strategies allow them to build holistic, configural representations in an allo-centered spatial frame of reference. However, position updating during navigation remains particularly complicated without vision. Virtual reality techniques can provide a virtual environment in which blind people can manage and explore their surroundings, and haptic and auditory interfaces can give them an immersive virtual navigation experience. In order to help blind sailors coordinate ego- and allo-centered spatial frames of reference, we conceived SeaTouch, haptic and auditory software adapted so that blind sailors can set up and simulate their itineraries before sailing. In our first experimental condition, we compared the spatial representations built by six blind sailors during the exploration of a tactile map and the virtual map of SeaTouch; results show that these two conditions were equivalent. In our second experimental condition, we focused on the conditions that favour the transfer of spatial knowledge from a virtual to a real environment. Blind sailors performed a virtual navigation in 'Northing mode', where the ship moves across the map, and in 'Heading mode', where the map shifts around the sailboat. No significant difference appeared, which suggests that the most important factor for blind sailors in locating themselves in the real environment is the orientation of the map during the initial encoding. However, we noticed that the subjects who got lost in the virtual environment in the Northing condition slightly improved their performance in the real environment. The analysis of the exploratory movements on the map is congruent with a previous model of the coordination of spatial frames of reference. Moreover, beyond the direct benefits of SeaTouch for the navigation of blind sailors, this study offers new insight into non-visual spatial cognition, more specifically the cognitively complex task of coordinating and integrating ego- and allo-centered spatial frames of reference. In summary, the research aims to measure whether a blind sailor can learn a maritime environment with a virtual map as well as with a tactile map. The results tend to confirm this and suggest pursuing investigations of non-visual virtual navigation. Here we present initial results with one participant.
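The difference between the two display modes can be made concrete with a small coordinate-transform sketch. The Python below is illustrative only, not the SeaTouch source, and its coordinate conventions (x east, y north, heading clockwise from north) are assumptions: in Northing mode the chart stays north-up and the ship simply moves across it, while in Heading mode each chart mark is re-expressed in a ship-centered frame rotated so that the bow points up.

import math

def northing_mode(ship_x, ship_y):
    # Chart fixed, north-up (allo-centered): the ship is drawn at its
    # chart coordinates, which change as the boat moves.
    return ship_x, ship_y

def heading_mode(mark_x, mark_y, ship_x, ship_y, heading_deg):
    # Chart rotated about the ship (ego-centered): express a chart mark
    # in ship-relative coordinates with the bow pointing "up".
    dx, dy = mark_x - ship_x, mark_y - ship_y
    a = math.radians(heading_deg)
    abeam = dx * math.cos(a) - dy * math.sin(a)   # positive to starboard
    ahead = dx * math.sin(a) + dy * math.cos(a)   # positive off the bow
    return abeam, ahead

# A buoy due east of the boat lies dead ahead when the heading is 090.
print(heading_mode(1.0, 0.0, 0.0, 0.0, 90.0))   # approximately (0.0, 1.0)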

[View PDF]

Wednesday, 15 May 2013

Talking tactile maps and environmental audio beacons: An orientation and mobility development tool for visually impaired people

Jacobson, R.D. (1996) Talking tactile maps and environmental audio beacons: An orientation and mobility development tool for visually impaired people, Proceedings of the ICA Commission on Maps and Graphics for Blind and Visually Impaired People, 21-25 October, 1996, Ljubljana, Slovenia.

Abstract

Pedestrian navigation through the built environment is a fundamental human activity. Environmental scales range from the micro, such as a room of a house, to the macro, such as a cityscape. In order to navigate effectively across this range of environments, visually impaired people need to develop orientation and mobility skills. Auditory beacons, accessed both in a model (as a talking tactile map) and in the environment (as beacons that transmit audio messages to a small receiver carried by the pedestrian), serve to integrate the model representation with the environment, and act as an orientation and mobility development tool. This technical approach is assessed using a multi-task analysis of the cognitive maps of people using the system while learning a new route. Although the analysis was not conclusive, those who used the system expressed great interest, suggesting that the maps and the audio complemented and enhanced each other. This study suggests that access to audio beacons in both environment and model leads to increased spatial comprehension of, and confidence about, the route, and it shows the need for a mixture of quantitative and qualitative approaches when assessing cognitive mapping ability.
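A minimal sketch of the beacon logic, assuming a simple proximity trigger (the positions, radii, and messages below are invented for illustration and are not taken from the study): the receiver plays the message of any beacon whose trigger radius contains the pedestrian's current position.

import math

BEACONS = [
    # ((x, y) position in metres, trigger radius in metres, spoken message)
    ((0.0, 0.0), 5.0, "Main entrance; doors directly ahead"),
    ((40.0, 10.0), 5.0, "Junction; corridor on the left leads to the library"),
]

def audible_beacons(px, py):
    """Return the messages of every beacon in range of position (px, py)."""
    heard = []
    for (bx, by), radius, message in BEACONS:
        if math.hypot(px - bx, py - by) <= radius:
            heard.append(message)
    return heard

print(audible_beacons(38.0, 9.0))   # within range of the junction beacon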

[View PDF]

Navigation for the visually impaired: Going beyond tactile cartography

 Jacobson, R.D. (1994) Navigation for the visually impaired: Going beyond tactile cartography, Swansea Geographer, 31, 53-59.

Abstract

Wayfinding for the visually handicapped is made more complex by the loss of the visual sense. In spite of this, they can hold spatial concepts and are often competent navigators. Tactile maps, those sensed by touch, have been shown to improve their spatial awareness and mobility. It is, however, the development of a personal guidance system (PGS), relying on recently developed technologies, that may herald a breakthrough in navigation for the blind and visually impaired. It would enable the visually handicapped to move more freely and independently through their environment. It would provide on-line interaction with representations of their environment, in audio or tactile form, providing orientation, location, and guidance information, and enabling them to plan, monitor, and execute navigation decisions.
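A guidance loop of this kind rests on one repeated computation: the distance and compass bearing from the user's current position fix to the next waypoint, which can then be spoken or rendered haptically. The Python sketch below shows that step using standard great-circle formulas; it is an assumed illustration, as the paper does not specify an implementation, and the coordinates are arbitrary.

import math

def guidance(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees,
    clockwise from north) from fix (lat1, lon1) to waypoint (lat2, lon2)."""
    R = 6371000.0                          # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # Haversine distance
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

d, b = guidance(51.616, -3.980, 51.618, -3.976)   # two arbitrary Swansea fixes
print(f"Next waypoint: {d:.0f} metres away, bearing {b:.0f} degrees")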

[View PDF]