
Friday, 30 September 2016

Design Considerations for Haptic and Auditory Map Interfaces

Rice, M., Jacobson, R.D., Golledge, R.G., and Jones, D. (2005) Design Considerations for Haptic and Auditory Map Interfaces. Cartography and Geographic Information Science, 32 (4), 381-391. http://dx.doi.org/10.1559/152304005775194656


Abstract

Communicating spatial information to the blind and visually impaired using maps and graphics presents many difficulties. Past research has offered advice to cartographers on topics such as tactile areal, point, and line symbolization; on perceptual problems related to dense linear features on tactile maps; and on the relationship between categorical data, measurement theory, and tactile discrimination. With this previous work as a foundation, we describe our research efforts with haptic and auditory maps - the Haptic Soundscapes Project. Haptic Soundscapes maps allow blind and visually-impaired individuals to feel map features through force feedback devices and hear auditory cues that add both redundant and complementary information. Recent experimental work by the authors has led to several recommended practices for cartographic data simplification, object size discrimination, shape identification, and general interface navigation. The authors also present haptic and auditory mapping examples to illustrate design ideas, algorithms, and technical requirements. Future prospects for automated haptic and auditory map creation are discussed and presented in the context of the past work in generating maps for the blind and visually impaired from cartographic data.
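As a concrete illustration of pairing redundant haptic and auditory cues, the sketch below imagines the kind of lookup such an interface might use: each map feature class is bound to both a haptic texture (rendered as friction on a force feedback device) and an auditory earcon. The feature classes, friction values, and pitches are invented for illustration and are not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical cue table: each map feature class gets a haptic texture
# (surface friction for a force feedback device) plus an auditory earcon,
# so the two channels carry redundant information.
@dataclass
class Cue:
    friction: float   # 0 (smooth) .. 1 (rough), rendered by the haptic device
    earcon_hz: float  # pitch of the short tone played on touch

CUES = {
    "road":     Cue(friction=0.2, earcon_hz=440.0),
    "building": Cue(friction=0.8, earcon_hz=220.0),
    "water":    Cue(friction=0.1, earcon_hz=660.0),
}

def cues_for(feature_class: str) -> Cue:
    """Look up the redundant haptic + auditory encoding for a feature class."""
    return CUES.get(feature_class, Cue(friction=0.5, earcon_hz=330.0))

if __name__ == "__main__":
    for f in ("road", "water", "park"):
        c = cues_for(f)
        print(f"{f}: friction={c.friction}, earcon={c.earcon_hz} Hz")
```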

[VIEW PDF]

Monday, 12 August 2013

Crowdsourcing techniques for augmenting traditional accessibility maps with transitory obstacle information


Jacobson, R.D., Caldwell, D.R., McDermott, S.D., Paez, F.I., Aburizaiza, A.O., Curtin, K.M., Stefanidis, A., and Qin, H. (2013) Crowdsourcing techniques for augmenting traditional accessibility maps with transitory obstacle information. Cartography and Geographic Information Science, 40 (3), 210-219. http://dx.doi.org/10.1080/15230406.2013.799737

Abstract


One of the most scrutinized contemporary techniques for geospatial data collection and production is crowdsourcing, which inverts traditional top-down geospatial data production and distribution methods by emphasizing the participation of the end user or community. The technique has been shown to be particularly useful in the domain of accessibility mapping, where it can augment traditional mapping methods and systems by providing information about transitory obstacles in the built environment. This paper presents techniques and applications of crowdsourcing and related methods for improving the representation of transitory obstacles in accessibility mapping systems. Such obstacles are very difficult to incorporate into traditional mapping workflows, since they typically appear in an unplanned manner and disappear just as quickly. Nevertheless, they present a major impediment to navigating an unfamiliar environment. Fortunately, these obstacles can be reported, defined, and captured through a variety of crowdsourcing techniques, including gazetteer-based geoparsing and active social media harvesting, and then referenced in a crowdsourced mapping system. These techniques are presented, along with context from research in tactile cartography and geo-enabled accessibility systems.
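Gazetteer-based geoparsing, as described above, amounts to scanning a free-text obstacle report for known place names and attaching their coordinates. A minimal sketch follows; the gazetteer entries and the report text are invented for illustration.

```python
# A minimal sketch of gazetteer-based geoparsing: scan a free-text obstacle
# report for known place names and return the matching coordinates.
GAZETTEER = {
    "innovation hall": (38.8304, -77.3076),
    "patriot circle":  (38.8290, -77.3120),
}

def geoparse(report: str) -> list[tuple[str, tuple[float, float]]]:
    """Return (place name, (lat, lon)) for each gazetteer entry found in the report."""
    text = report.lower()
    return [(name, coords) for name, coords in GAZETTEER.items() if name in text]

if __name__ == "__main__":
    report = "Construction fencing blocks the sidewalk near Innovation Hall."
    for name, (lat, lon) in geoparse(report):
        print(f"Obstacle georeferenced to '{name}' at ({lat}, {lon})")
```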

[VIEW PDF]


Friday, 31 May 2013

Multimodal speech interfaces to GIS


Ken Sam's project involves leveraging an existing commercial off-the-shelf (COTS) web-GIS component and the open-specification Speech Application Language Tags (SALT) as building blocks for creating a multimodal web-GIS application. In this paper, we address how the different technology components were applied to create a multimodal interface for navigation, interaction, and feedback in the web-based GIS application.

Screen capture of the voice-enabled multimodal web-GIS application interface
Speech driven GIS interface
In most computing and information technology environments, data is presented in either text or graphic format as a means of conveying information to end users. This has been the traditional paradigm of data display and visualization in the computing world. Efforts have been made in the software industry to design better navigation interfaces for software products and to improve their overall user-friendliness. Geospatial data introduces additional dimensions to the presentation and display of data, and because of this added complexity, a number of research efforts are still ongoing to improve the interfaces, visualization, and interpretation of geospatial data. One can normally expect geospatial data to be viewed or interpreted by a normal-vision user without much difficulty, yet visualization and navigation of maps is a huge challenge for people who are visually impaired. The design and usability of GIS applications has traditionally been tailored to keyboard and mouse interaction in an office environment.

To help with the visualization of geospatial data and navigation of a GIS application, this project presents the results of a prototype application that incorporates voice as another mode of interacting with a web-GIS application. While voice is not a replacement for the mouse and keyboard interface, it can act as an enhancement or augmentation that improves the accessibility and usability of an application. The multimodal approach of combining voice with other user interfaces for navigation and data presentation benefits the interpretation and visualization of geospatial data and makes GIS easier to use for all users.
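SALT binds a recognized utterance to an application action via a speech grammar. As a language-neutral sketch of that binding (SALT itself is an XML markup, so this Python stand-in is an illustration, not the project's actual code), a recognized phrase is matched against a small command grammar and dispatched to hypothetical map-control functions:

```python
import re

# Hypothetical map-control callbacks standing in for the web-GIS component.
def zoom(direction: str) -> None:
    print(f"map: zoom {direction}")

def pan(direction: str) -> None:
    print(f"map: pan {direction}")

# A small command grammar, analogous to what a SALT <listen> element would
# bind to the recognizer: pattern -> handler.
GRAMMAR = [
    (re.compile(r"zoom (in|out)"), lambda m: zoom(m.group(1))),
    (re.compile(r"pan (north|south|east|west)"), lambda m: pan(m.group(1))),
]

def dispatch(utterance: str) -> bool:
    """Route one recognized utterance to a map action; False if no rule matches."""
    for pattern, handler in GRAMMAR:
        m = pattern.search(utterance.lower())
        if m:
            handler(m)
            return True
    return False

if __name__ == "__main__":
    for u in ("Zoom in", "pan north", "show legend"):
        if not dispatch(u):
            print(f"unrecognized: {u!r}")
```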

Publications
Jacobson, R.D., and Sam, K. (2006) Multimodal Web-GIS: Augmenting Map Navigation and Spatial Data Visualization with Voice Control, AutoCarto 2006, June 26-28, Electronic Proceedings.

Transcending the Digital Divide

The purpose of this research is to develop, evaluate, and disseminate a non-visual interface for accessing digital information. The aim is to investigate the perceptual and cognitive problems that blind people face when trying to interpret information provided in a multimodal manner. The project also plans to provide touch-sensitive and sound-based network interface and navigation devices that incorporate cognitive wayfinding heuristics. Haptic (force feedback) interfaces will be provided for exploring web pages that consist of map, graphic, iconic, or image products. Sound identifiers for on-screen windowed, map, and image information will also be provided. These tasks will contribute to transcending the Digital Divide that increasingly separates blind or vision-impaired people from the growing information-based workplace.

Recent research at UCSB has begun to explore how individuals identify features presented through sound and touch. Other research (e.g. O'Modhrain and Gillespie, 1998; McKinley and Scott, 1998) has used haptics to explore screen objects such as windows, pulldown menus, buttons, and sliders, but map, graphic, and other cartographic representations have not been explored. In particular, the potential of auditory maps of on-screen phenomena (e.g. as would be important in GIS applications) has barely been examined, and few examples exist of combining audio and touch principles to build an interface. While imaginative efforts to build non-visual interfaces have been proceeding, there is as yet little empirical evidence that people without sight can use them effectively (i.e. develop a true representation of the experienced phenomena).

Experiments will be undertaken to test the ability of vision-impaired and sighted people from different age groups to use these new interfaces and features, such as: (i) a haptic mouse or a touch window tied to auditory communication displays; (ii) digitized real sounds to indicate environmental features at their mapped locations; and (iii) "sound painting" of maps, images, or charts to indicate gradients of phenomena like temperature, precipitation, pressure, population density, and altitude. Tests will be developed to evaluate: (i) the minimum resolvable area for the haptic interpretation of scenes; (ii) the development of skills for shape tracing in the sound or force-feedback haptic domain; (iii) the possibility of using continuous or discrete sound symbols associated with touch-sensitive pads to learn hierarchically nested screen information (e.g. locations of cities within regions within states within nations); (iv) how dynamic activities such as scrolling, zooming, and searching can be conducted in the haptic or auditory domain; (v) people's comprehension and ability to explore, comprehend, and make inferences about various non-visual interpretations of complex visual displays (e.g. maps and diagrams); and (vi) the effectiveness of using a haptic mouse with a 2" square motion domain to search a 14" screen (i.e. scale effects).
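"Sound painting" can be thought of as mapping the data value under the cursor to an audible pitch, so that a spatial gradient becomes a rising or falling tone. Below is a minimal sketch of that idea; the toy temperature grid and frequency range are invented for illustration.

```python
# A sketch of "sound painting": as the haptic cursor moves over a raster
# (here a toy temperature grid), the value under the cursor is mapped to
# pitch, so a gradient is heard as a rising or falling tone.
GRID = [
    [10.0, 12.0, 15.0],
    [11.0, 14.0, 18.0],
    [13.0, 17.0, 21.0],
]
V_MIN, V_MAX = 10.0, 21.0
F_MIN, F_MAX = 200.0, 800.0  # audible pitch range in Hz

def pitch_at(row: int, col: int) -> float:
    """Linearly map the cell value at the cursor position to a frequency in Hz."""
    v = GRID[row][col]
    t = (v - V_MIN) / (V_MAX - V_MIN)
    return F_MIN + t * (F_MAX - F_MIN)

if __name__ == "__main__":
    # Sweep the cursor along the diagonal: pitch rises with temperature.
    for i in range(3):
        print(f"cell ({i},{i}) -> {pitch_at(i, i):.0f} Hz")
```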

Tuesday, 21 May 2013

Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology

Rice, M.T., Aburizaiza, A.O., Jacobson, R.D., Shore, B.M., and Paez, F.I. (2012) Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology. Transactions in GIS, 16 (2), 177-190. http://dx.doi.org/10.1111/j.1467-9671.2012.01318.x
Abstract
Disabled people, especially the blind and vision-impaired, are challenged by many transitory hazards in urban environments such as construction barricades, temporary fencing across walkways, and obstacles along curbs. These hazards present a problem for navigation, because they typically appear in an unplanned manner and are seldom included in databases used for accessibility mapping. Tactile maps are a traditional tool used by blind and vision-impaired people for navigation through urban environments, but such maps are not automatically updated with transitory hazards. As an alternative approach to static content on tactile maps, we use volunteered geographic information (VGI) and an Open Source system to provide updates of local infrastructure. These VGI updates, contributed via voice, text message, and e-mail, use geographic descriptions containing place names to describe changes to the local environment. After they have been contributed and stored in a database, we georeference VGI updates with a detailed gazetteer of local place names including buildings, administrative offices, landmarks, roadways, and dormitories. We publish maps and alerts showing transitory hazards, including location-based alerts delivered to mobile devices. Our system is built with several technologies including PHP, JavaScript, AJAX, the Google Maps API, PostgreSQL, an Open Source database, and PostGIS, PostgreSQL's spatial extension. This article provides insight into the integration of user-contributed geospatial information into a comprehensive system for use by the blind and vision-impaired, focusing on currently developed methods for geoparsing and georeferencing using a gazetteer.
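The location-based alert step reduces to a proximity query: compare the user's position against georeferenced hazard reports and alert within some radius. In the system described this would run in PostGIS (for example with ST_DWithin); the self-contained haversine version below is a stand-in with invented coordinates.

```python
from math import radians, sin, cos, asin, sqrt

# Invented hazard reports, already georeferenced via the gazetteer.
HAZARDS = [("fencing across walkway", 38.8304, -77.3076)]

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in meters between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def alerts(user_lat: float, user_lon: float, radius_m: float = 100.0):
    """Yield descriptions of hazards within radius_m of the user's position."""
    for desc, lat, lon in HAZARDS:
        d = haversine_m(user_lat, user_lon, lat, lon)
        if d <= radius_m:
            yield f"{desc} ({d:.0f} m away)"

if __name__ == "__main__":
    for msg in alerts(38.8308, -77.3080):
        print("ALERT:", msg)
```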

Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment

Simonnet, M., Vieilledent, S., and Tisseau, J. (2011) Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment. Journal of Visual Impairment and Blindness, 105 (4), 222-234.

Abstract

A map exploration and representation exercise was conducted with participants who were totally blind. Representations of maritime environments were presented either with a tactile map or with a digital haptic virtual map. We assessed the knowledge of spatial configurations using a triangulation technique. The results revealed that both types of map learning were equivalent.
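The triangulation technique referred to here can be sketched as follows: the participant gives a bearing to the same unseen target from two known stations, and the represented position of the target is estimated as the intersection of the two bearing rays. The plane coordinates and bearings below are invented for illustration.

```python
from math import sin, cos, radians

def intersect(p1, b1_deg, p2, b2_deg):
    """Intersect two bearing rays from stations p1 and p2; returns (x, y).

    Bearings are degrees clockwise from north; bearing 0 = +y, 90 = +x.
    """
    d1 = (sin(radians(b1_deg)), cos(radians(b1_deg)))
    d2 = (sin(radians(b2_deg)), cos(radians(b2_deg)))
    # Solve p1 + t*d1 = p2 + s*d2 for t.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

if __name__ == "__main__":
    # Bearing 45 deg from the origin and 315 deg from (100, 0)
    # place the estimated target at roughly (50, 50).
    print(intersect((0.0, 0.0), 45.0, (100.0, 0.0), 315.0))
```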


Tactile maps

Jacobson, R.D. (2010) Tactile maps. In: Goldstein, B. (ed) Encyclopedia of Perception, pp. 950-952. London: Sage.

Abstract

Extracting meaningful information from a tactile map, that is, a map elevated in the third dimension and designed to be read by the sense of touch, is far more problematic than reading a conventional map with the use of vision. Tactile maps are widely used in educational settings and in orientation and mobility training for vision-impaired individuals. Maps and graphics, meaning any representation of spatial features, their arrangement, and their interrelationships, are the most fundamental and primary mechanism for communicating spatial arrangements to blind people. Tactile graphics are used as diagrams in school textbooks and as portable maps when traveling. Just as Braille is often used as a substitute for the written word, tactile graphics are the equivalent for maps and diagrams. They are an essential tool for providing independence and education to people without vision.

The assessment of non visual maritime cognitive maps of a blind sailor: a case study

Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) The assessment of non visual maritime cognitive maps of a blind sailor: a case study. Journal of Maps, v2010, 289-301. http://dx.doi.org/10.4113/jom.2010.1087

Abstract

Nowadays, thanks to the accessibility of GPS, sighted people widely use electronic charts to navigate through different kinds of environments; in the maritime domain, this has considerably improved the precision of course control. Blind sailors cannot take a compass bearing, but they are able to interact with multimodal electronic charts. Indeed, we conceived SeaTouch, a haptic (tactile-kinesthetic) and auditory virtual environment that allows users to perform virtual maritime navigation without vision. In this study we assess whether heading or northing "haptic" views during virtual navigation training influence non-visual spatial knowledge. After simulating a navigation session in each condition, a blind sailor actually navigated at sea and estimated seamark bearings. We used the triangulation technique to compare the efficiency of northing and heading virtual training. The results are congruent with current knowledge about spatial frames of reference and suggest that getting lost in heading mode forces the blind sailor to coordinate his current "view" with a more global and stable representation.
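The difference between the heading and northing conditions is, at bottom, a frame-of-reference conversion: a seamark bearing relative to the boat's bow (egocentric) and a north-up compass bearing (allocentric) differ by the boat's heading, modulo 360 degrees. A minimal sketch, with invented values:

```python
def to_allocentric(relative_deg: float, heading_deg: float) -> float:
    """Egocentric bearing (relative to the bow) -> north-up compass bearing."""
    return (relative_deg + heading_deg) % 360.0

def to_egocentric(compass_deg: float, heading_deg: float) -> float:
    """North-up compass bearing -> bearing relative to the boat's bow."""
    return (compass_deg - heading_deg) % 360.0

if __name__ == "__main__":
    heading = 250.0                       # boat heading west-south-west
    print(to_allocentric(30.0, heading))  # seamark 30 deg off the bow -> 280.0
    print(to_egocentric(280.0, heading))  # and back again -> 30.0
```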

Map data publication

Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) Published Map. In Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) The assessment of non visual maritime cognitive maps of a blind sailor: a case study, Journal of Maps, v2010, 289-301. 10.4113/jom.2010.1087.

[VIEW PDF]

A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors

Simonnet, M., Jacobson, R.D., Vieilledent, S., and Tisseau, J. (2009) SeaTouch: A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors. In: K. Stewart Hornsby et al. (eds) COSIT 2009, LNCS 5756, pp. 212-226. Springer-Verlag: Berlin, Heidelberg.

Abstract

Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have provided researchers in the spatial community with tools to investigate the learning of space, but the issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating spatial knowledge transfer to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks, we measure systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through the environment without vision. Subjects appeared to take advantage of getting lost in an egocentric "haptic" view in the virtual environment to improve performance in the real environment.


Thursday, 16 May 2013

Cognitive maps, spatial abilities, and human wayfinding

Golledge, R.G., Jacobson, R.D., Kitchin, R.M., and Blades, M. (2000) Cognitive maps, spatial abilities, and human wayfinding. Geographical Review of Japan, Series B: The English Journal of the Association of Japanese Geographers, 73 (2), 93-104.

Abstract

In a series of experiments in Belfast (Northern Ireland) and Santa Barbara (California), we used 10 sighted, 10 visually impaired, and 10 blind individuals, matched for age, socio-economic status, and educational background, to examine wayfinding. The participants were first required to take the experimenter over a familiar route so that the types of behavior they exhibited could be observed. This established a performance base and provided a training exercise before participants undertook the set of tasks to be performed in the unfamiliar environment. They were then required to learn a new route in a completely unfamiliar environment. To do this, the participants were given four trials: the first was an experimenter-guided trial and the next three were learning and evaluation trials.

[VIEW PDF]

Wednesday, 15 May 2013

Navigation for the visually impaired: Going beyond tactile cartography

 Jacobson, R.D. (1994) Navigation for the visually impaired: Going beyond tactile cartography, Swansea Geographer, 31, 53-59.

Abstract

Wayfinding for the visually handicapped is made more complex by the loss of the visual sense. In spite of this they can hold spatial concepts and are often competent navigators. Tactile maps, those sensed by touch, have been shown to improve their spatial awareness and mobility. It is, however, the development of a personal guidance system (PGS), relying on recently developed technologies, that may herald a breakthrough in navigation for the blind and visually impaired. It would enable the visually handicapped to move more freely and independently through their environment. It would provide on-line interactions with representations of their environment, in audio or tactile form, providing orientation, location, and guidance information, enabling them to plan, monitor, and execute navigation decisions.
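The guidance step such a personal guidance system would need can be sketched simply: from a GPS fix and the user's heading, compute the distance and a clock-face direction to the next waypoint and phrase it as an audio instruction. The flat-earth approximation and the coordinates below are assumptions for illustration.

```python
from math import atan2, degrees, hypot, cos, radians

def guidance(lat, lon, heading_deg, wp_lat, wp_lon) -> str:
    """Return a spoken-style instruction toward the next waypoint.

    Uses a local flat-earth approximation, adequate over short distances.
    """
    dy = (wp_lat - lat) * 111_320.0                      # meters north
    dx = (wp_lon - lon) * 111_320.0 * cos(radians(lat))  # meters east
    dist = hypot(dx, dy)
    bearing = degrees(atan2(dx, dy)) % 360.0             # compass bearing to waypoint
    relative = (bearing - heading_deg) % 360.0           # relative to user's heading
    hour = round(relative / 30.0) % 12 or 12             # clock-face direction
    return f"Waypoint {dist:.0f} meters at {hour} o'clock"

if __name__ == "__main__":
    # User facing east; the waypoint lies ahead and to the left.
    print(guidance(51.62, -3.94, 90.0, 51.6205, -3.9395))
```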

[View PDF]

Friday, 3 May 2013

Haptic Soundscapes

Towards making maps, diagrams and graphs accessible to visually impaired people 

The aim of this research project is to develop and evaluate haptic soundscapes, which allow people with little or no vision to interact with maps, diagrams, and graphs displayed via dissemination media, such as the World Wide Web, through sound, touch, and force feedback. Although of principal utility for people with severe visual impairments, it is anticipated that this interface will allow informative educational resources for children and people with learning difficulties to be developed and accessed through the Internet. The research project offers a simple yet innovative solution to accessing spatial data without the need for vision. It builds upon previous work carried out in various departments at UCSB, and fosters interdisciplinary links and cooperation between usually unconnected research groups. The research hopes to further knowledge and understanding in this emerging field and also to offer practical results that will impact people's lives. It is strongly felt that the development of the project will lead to continued external funding, and it is our hope that this project will act as a springboard to further research in which UCSB will be a key component.

Further development, usability testing, and expansion
 
The Haptic Soundscapes project has developed a set of audio-tactile mapping tools to help blind people access spatial information and to aid research in multi-modal spatial cognition. These tools offer blind people access to the geographic world they cannot otherwise fully experience, creating opportunities for orientation, navigation, and education. Spatial knowledge from maps, charts, and graphs is obtained through display and interaction with sound, touch, and force-feedback devices. Individuals can use audio-tactile mapping tools to explore an unknown environment or create an audio-tactile map from images displayed on a computer screen. These audio-tactile maps can be disseminated over the Internet or used in educational settings. Next year, several objectives are planned for the Haptic Soundscapes project. These include cognitive experiments to assess a user's ability to navigate within a scene, between adjacent scenes, and between scenes of different scales using the audio-tactile mapping tools. We will also expand the capability of the audio-tactile mapping system to include text-to-speech synthesis and real-time multi-dimensional sound representation. Several off-campus funding proposals will be submitted. Finally, we will showcase the tools developed in the course of this project by expanding our campus demonstrator - an interactive, navigable audio-tactile map of the UCSB campus.
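As a small illustration of the text-to-speech objective, the sketch below speaks the label of whatever feature the cursor touches. The pyttsx3 binding and the announce helper are assumptions chosen for the example, not the project's actual components.

```python
import pyttsx3  # an off-the-shelf offline TTS binding; an assumption, not the project's engine

# One engine instance is reused across announcements.
engine = pyttsx3.init()

def announce(feature_label: str) -> None:
    """Speak the label of the map feature currently under the haptic cursor."""
    engine.say(feature_label)
    engine.runAndWait()  # blocks until the utterance finishes

if __name__ == "__main__":
    # e.g. the cursor has just entered the campus library polygon
    announce("Library")
```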