Showing posts with label visual impairment. Show all posts

Friday, 30 September 2016

Design Considerations for Haptic and Auditory Map Interfaces

Rice, M., Jacobson, R.D., Golledge, R.G., and Jones, D. (2005) Design Considerations for Haptic and Auditory Map Interfaces. Cartography and Geographic Information Science, 32 (4), 381-391


Communicating spatial information to the blind and visually impaired using maps and graphics presents many difficulties. Past research has offered advice to cartographers on topics such as tactile areal, point, and line symbolization; on perceptual problems related to dense linear features on tactile maps; and on the relationship between categorical data, measurement theory, and tactile discrimination. With this previous work as a foundation, we describe our research efforts with haptic and auditory maps - the Haptic Soundscapes Project. Haptic Soundscapes maps allow blind and visually-impaired individuals to feel map features through force feedback devices and hear auditory cues that add both redundant and complementary information. Recent experimental work by the authors has led to several recommended practices for cartographic data simplification, object size discrimination, shape identification, and general interface navigation. The authors also present haptic and auditory mapping examples to illustrate design ideas, algorithms, and technical requirements. Future prospects for automated haptic and auditory map creation are discussed and presented in the context of the past work in generating maps for the blind and visually impaired from cartographic data.


Friday, 31 May 2013

Multimodal speech interfaces to GIS

Multimodal speech interfaces to GIS

Ken Sam's project involves leveraging existing commercial off-the-shelf (COTS) web-GIS components and the open-specification Speech Application Language Tags (SALT) as building blocks for creating a multimodal web-GIS application. In this paper, we address how the different technology components were applied to create multimodal interfaces for navigation, interaction, and feedback in the web-based GIS application.

Screen capture of the voice-enabled multimodal WebGIS application interface
Speech driven GIS interface
In most computing and information technology environments, data is presented in either text or graphic format as a means of conveying information to end users. This has been the traditional paradigm of data display and visualization in the computing world. Efforts have been made in the software industry to design better navigation interfaces for software products and to improve their overall user-friendliness. With geospatial data, additional dimensions are introduced in the presentation and display of the data. Because of this added complexity, research is still ongoing into improving the interface, visualization, and interpretation of geospatial data. One can normally expect geospatial data to be viewed or interpreted by a normal-vision user without much challenge. Yet visualization and navigation of maps is a huge challenge for people who are visually impaired. The design and usability of GIS applications has traditionally been tailored to keyboard and mouse interaction in an office environment. To help with the visualization of geospatial data and navigation of a GIS application, this project presents the results of a prototype application that incorporates voice as another mode of interacting with a web-GIS application. While voice is not a replacement for the mouse and keyboard interface, it can act as an enhancement or augmentation to improve the accessibility and usability of an application. The multimodal approach of combining voice with other user interfaces for navigation and data presentation is beneficial to the interpretation and visualization of geospatial data and makes GIS easier to use for all users.

Jacobson, R.D., and Sam, K. (2006) Multimodal Web-GIS: Augmenting Map Navigation and Spatial Data Visualization with Voice Control, AutoCarto 2006, June 26-28, Electronic Proceedings.

Transcending the Digital Divide

The purpose of this research is to develop, evaluate, and disseminate a non-visual interface for accessing digital information. The aim is to investigate the perceptual and cognitive problems that blind people face when trying to interpret information provided in a multimodal manner. The project also plans to provide touch-sensitive and sound-based network interface and navigation devices that incorporate cognitive wayfinding heuristics. Haptic (force feedback) interfaces will be provided for exploring web pages that consist of map, graphic, iconic, or image products. Sound identifiers for on-screen windowed, map, and image information will also be provided. These tasks will contribute to transcending the Digital Divide that increasingly separates blind or vision-impaired people from the growing information-based workplace. Recent research at UCSB has begun to explore how individuals identify features presented through sound and touch. Other research (e.g. O'Modhrain and Gillespie, 1998; McKinley and Scott, 1998) has used haptics to explore screen objects such as windows, pulldown menus, buttons, and sliders; but map, graphic, and other cartographic representations have not been explored. In particular, the potential of auditory maps of on-screen phenomena (e.g. as would be important in GIS applications) has barely been examined, and few examples exist of combining audio and touch principles to build an interface. While imaginative efforts to build non-visual interfaces have been proceeding, there is as yet little empirical evidence that people without sight can use them effectively (i.e. develop a true representation of the experienced phenomena).
Experiments will be undertaken to test the ability of vision-impaired and sighted people from different age groups to use these new interfaces and features, such as: (i) the haptic mouse or a touch window tied to auditory communication displays; (ii) digitized real sounds to indicate environmental features at their mapped locations; and (iii) "sound painting" of maps, images, or charts to indicate gradients of phenomena like temperature, precipitation, pressure, population density, and altitude. Tests will be developed to evaluate: (i) the minimum resolvable area for the haptic interpretation of scenes; (ii) the development of skills for shape tracing in the sound or force-feedback haptic domain; (iii) the possibility of using continuous or discrete sound symbols associated with touch-sensitive pads to learn hierarchically nested screen information (e.g. locations of cities within regions within states within nations); (iv) how dynamic activities such as scrolling, zooming, and searching can be conducted in the haptic or auditory domain; (v) people's ability to explore, comprehend, and make inferences about various non-visual interpretations of complex visual displays (e.g. maps and diagrams); and (vi) the effectiveness of using a haptic mouse with a 2" square motion domain to search a 14" screen (i.e. scale effects).

Tuesday, 21 May 2013

Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology

Rice, M.T., Aburizaiza, A.O., Jacobson, R.D., Shore, B.M., and Paez, F.I. (2012). Supporting Accessibility for Blind and Vision-impaired People With a Localized Gazetteer and Open Source Geotechnology. Transactions in GIS 16 (2):177-190.
Disabled people, especially the blind and vision-impaired, are challenged by many transitory hazards in urban environments such as construction barricades, temporary fencing across walkways, and obstacles along curbs. These hazards present a problem for navigation, because they typically appear in an unplanned manner and are seldom included in databases used for accessibility mapping. Tactile maps are a traditional tool used by blind and vision-impaired people for navigation through urban environments, but such maps are not automatically updated with transitory hazards. As an alternative approach to static content on tactile maps, we use volunteered geographic information (VGI) and an Open Source system to provide updates of local infrastructure. These VGI updates, contributed via voice, text message, and e-mail, use geographic descriptions containing place names to describe changes to the local environment. After they have been contributed and stored in a database, we georeference VGI updates with a detailed gazetteer of local place names including buildings, administrative offices, landmarks, roadways, and dormitories. We publish maps and alerts showing transitory hazards, including location-based alerts delivered to mobile devices. Our system is built with several technologies including PHP, JavaScript, AJAX, the Google Maps API, PostgreSQL, an Open Source database, and PostGIS, PostgreSQL's spatial extension. This article provides insight into the integration of user-contributed geospatial information into a comprehensive system for use by the blind and vision-impaired, focusing on currently developed methods for geoparsing and georeferencing using a gazetteer.
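The gazetteer-matching step at the heart of this geoparsing workflow can be sketched in a few lines. This is a minimal illustration only, assuming a tiny in-memory gazetteer with invented place names and coordinates; the system described in the article stores its gazetteer in PostgreSQL/PostGIS and matches contributed text server-side.

```python
# Minimal sketch of gazetteer-based geoparsing. The place names and
# coordinates below are hypothetical, invented for illustration.
GAZETTEER = {
    "student union": (51.0780, -114.1335),
    "library tower": (51.0784, -114.1297),
    "science theatres": (51.0797, -114.1310),
}

def georeference(report: str):
    """Scan a free-text hazard report for known place names and
    return (place, (lat, lon)) pairs for each match."""
    text = report.lower()
    return [(name, coords) for name, coords in GAZETTEER.items() if name in text]

hits = georeference("Construction fencing blocks the path by the Library Tower")
print(hits)  # -> [('library tower', (51.0784, -114.1297))]
```

A matched report can then be published as a point on a hazard map or pushed as a location-based alert; fuzzy matching and disambiguation are where most of the real engineering effort lies.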

Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment

Simonnet, M., Vieilledent, S., and Tisseau, J. (2011) Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment. Journal of Visual Impairment and Blindness, 105 (4), 222-234.


A map exploration and representation exercise was conducted with participants who were totally blind. Representations of maritime environments were presented either with a tactile map or with a digital haptic virtual map. We assessed the knowledge of spatial configurations using a triangulation technique. The results revealed that both types of map learning were equivalent.
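The triangulation technique used in this and the related SeaTouch studies scores participants' directional estimates to landmarks. A minimal sketch of the underlying bearing arithmetic, with invented coordinates, is shown below; this illustrates the measure, not the authors' exact procedure.

```python
import math

def true_bearing(observer, target):
    """Compass bearing in degrees (0 = north, clockwise) from observer
    to target on a flat local grid (x east, y north)."""
    dx, dy = target[0] - observer[0], target[1] - observer[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def angular_error(estimated, actual):
    """Smallest signed difference between two bearings, in (-180, 180]."""
    return (estimated - actual + 180) % 360 - 180

# Hypothetical example: a seamark 100 m east and 100 m north of the observer.
boat, buoy = (0.0, 0.0), (100.0, 100.0)
error = angular_error(50.0, true_bearing(boat, buoy))  # estimate of 50 deg vs ~45 deg
```

Aggregating such signed errors across several landmarks and vantage points gives a simple index of how well a participant's spatial configuration matches the real layout.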

Tactile maps

Jacobson, R.D. (2010) Tactile maps, In: Goldstein, B. (ed) Encyclopedia of Perception, pp.950-952. Sage: London


Extracting meaningful information from a tactile map, that is, a map elevated in the third dimension and designed to be read by the sense of touch, is far more problematic than reading a conventional map with the use of vision. Tactile maps are widely used in educational settings and in orientation and mobility training for vision-impaired individuals. Maps and graphics, that is, any representation of spatial features, their arrangement, and their interrelationships, are the most fundamental and primary mechanism for communicating spatial arrangements to blind people. Tactile graphics are used as diagrams in school textbooks and as portable maps when traveling. Just as Braille is often used as a substitute for the written word, tactile graphics are the equivalent for maps and diagrams. They are an essential tool for providing independence and education to people without vision.

The assessment of non visual maritime cognitive maps of a blind sailor: a case study

Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) The assessment of non visual maritime cognitive maps of a blind sailor: a case study, Journal of Maps, v2010, 289-301. 10.4113/jom.2010.1087.


Nowadays, thanks to the accessibility of GPS, sighted people widely use electronic charts to navigate through different kinds of environments. In the maritime domain, this has considerably improved the precision of course control. Blind sailors cannot take a compass bearing; however, they are able to interact with multimodal electronic charts. Indeed, we conceived SeaTouch, a haptic (tactile-kinesthetic) and auditory virtual environment that allows users to perform virtual maritime navigation without vision. In this study we attempt to assess whether heading or northing “haptic” views during virtual navigation training influence non-visual spatial knowledge. After simulating a navigation session in each condition, a blind sailor actually navigated at sea and estimated seamark bearings. We used the triangulation technique to compare the efficiency of northing and heading virtual training. The results are congruent with current knowledge about spatial frames of reference and suggest that getting lost in heading mode forces the blind sailor to coordinate his current “view” with a more global and stable representation.

Map Data Publication

Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) Published Map. In Simonnet, M., Vieilledent, S., Jacobson, D. and Tisseau, J. (2010) The assessment of non visual maritime cognitive maps of a blind sailor: a case study, Journal of Maps, v2010, 289-301. 10.4113/jom.2010.1087.


A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors

M. Simonnet, R.D. Jacobson, S. Vieilledent and J. Tisseau. (2009) SeaTouch: A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors. In K. Stewart Hornsby et al. (Eds.): COSIT 2009, LNCS 5756, pp. 212–226, 2009. Springer-Verlag Berlin, Heidelberg.


Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have provided researchers in the spatial community with tools to investigate the learning of space. The issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating spatial knowledge transfer to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks, we measure systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric “haptic” view in the virtual environment to improve performance in the real environment.

Beyond Tactile Maps: Towards ontologies for future research

Jacobson, R.D. (2009) Beyond Tactile Maps: Towards ontologies for future research. Published Abstract Proceedings of the International Cartographic Congress, 15-21 November 2009, Santiago, Chile.
Tactile maps have traditionally been the representation medium of choice for cartographers when attempting to convey spatial information to people with limited or no vision. The production of tactile maps provides an exaggerated example of classic cartographic issues, such as classification, abstraction, symbolization, generalization, and standardization, due to production methods and the necessity that the maps be read at the scale of fingertip resolution. Map reading problems are most acutely felt when a user has to extract contextual information, due to disrupted interpretation when linking legend information to other components of the cartographic display.

The future of tactile cartography

Jacobson, R.D. (2007) The future of tactile cartography: from static raised lines to multimodal dynamic portable computer interfaces, International Cartographic Conference, Moscow 


While still not considered a large component of mainstream cartographic research, map-related research focusing on the blind and partially sighted user population continues to grow. Currently, several groups of researchers at universities in North America and internationally are pursuing research that focuses on identifying needs, creating innovative delivery methods, assessing strategies and spatial and geospatial performance, improving access, and developing potential educational resources for blind and partially sighted map users.


Friday, 17 May 2013

A Commentary on the Use of Touch for Accessing On-Screen Spatial Representations: The Process of Experiencing Haptic Maps and Graphics

Golledge, R.G., Rice, M., and Jacobson, R.D. (2005) A Commentary on the Use of Touch for Accessing On-Screen Spatial Representations: The Process of Experiencing Haptic Maps and Graphics. The Professional Geographer, 57 (3). 339-349.


The growth of the Internet and the digital revolution have meant increased reliance on electronic representations of information. Geospatial information has been readily adapted to the world of cyberspace, and most Web pages incorporate graphics, images, or maps to represent spatial and spatialized data. But flat computer screens do not facilitate a map or graph experience by those who are visually impaired. The traditional method for compensating for nonvisual access to maps and graphics has been to construct hard-copy tactile maps. In this article, we examine an electronic accommodation for nonvisual users—the haptic map. Using new and off-the-shelf hardware—force feedback and vibrotactile mice—we explore how touch can be combined with virtual representations of shapes and patterns to enable nonvisual access to onscreen map or graphic material.
Key Words: digital representation, haptic maps, visual impairment


Thursday, 16 May 2013

Wayfinding by people with visual impairments: The effect of spatial tasks on the ability to learn a novel route

Blades, M., Lippa, Y., Golledge, R.G., Jacobson, R.D., and Kitchin, R.M. (2002) Wayfinding by people with visual impairments: The effect of spatial tasks on the ability to learn a novel route. Journal of Visual Impairment and Blindness, 96, 407-419.


Thirty-eight people with visual impairments learned a 483-meter novel route through a university campus which included 28 choice points (e.g. left or right turns). After a single guided experience of the route, participants were divided into four groups and walked the route three times under different conditions. In the verbalization condition, participants gave a verbal description of the route from memory after each route experience. In the modeling condition, participants made a model of the route from memory after each route experience. In the pointing condition, participants made pointing estimates between places on the route as they walked along it. In the control condition, participants walked the route without any additional testing. Performance was measured in terms of accurate decisions at choice points. All four groups showed an improvement in performance with greater experience of the route. The modeling group showed the greatest improvement compared to the control group. The methodological implications of these results are considered, and the implications for mobility training are discussed.


Exploratory user study of haptic and auditory display for multimodal information systems

Jeong, W. and Jacobson, R.D. (2002) Exploratory user study of haptic and auditory display for multimodal information systems. In: McLaughlin, M. L., Hespanha, J.P., and Sukhatme, G.S. (eds.) Touch in virtual Environments: Haptics and the design of interactive systems. IMSC Series in Multimedia, Prentice Hall: New York, pp. 194-204.


Since the inception of virtual reality (VR) environments, interaction has been predominantly visual, especially in conveying spatial information. However, in many situations vision is not enough or is not available. For example, for the visually impaired, over-reliance on visual display denies them access to the information. Even for the general population, if there is no light or weak light, a visual display is not optimal for conveying information. Recently a number of researchers have tried to add other modalities, such as sound or haptics, to overcome the limitations of visual display.


Rapid development of cognitive maps in people with visual impairments when exploring novel geographic spaces

Jacobson, R.D., Lippa, Y., Golledge, R.G., Kitchin, R.M., and Blades, M. (2001) Rapid development of cognitive maps in people with visual impairments when exploring novel geographic spaces. IAPS Bulletin of People-Environment Studies (Special Issue on Environmental Cognition), 18, 3-6.


'Cognitive map' is a term that refers to a person's environmental knowledge. Anyone experiencing a new environment will, over time, develop a cognitive representation of that environment, including information derived from that environment (e.g. about places, routes, and spatial relationships) and information about personal experiences (e.g. memories about events at locations and attitudes towards places). There is now a great deal of research into the cognitive maps of sighted people (see Golledge, 1999; Kitchin & Freundschuh, 2000; Kitchin & Blades, in press), but there is comparatively little research into the cognitive maps of people with visual impairments.


Cognitive maps, spatial abilities, and human wayfinding

Golledge, R.G., Jacobson, R.D., Kitchin, R.M., and Blades, M. (2000) Cognitive maps, spatial abilities, and human wayfinding. Geographical Review of Japan, ser. B: The English journal of the Association of Japanese Geographers, 73 (Ser.B) (2), 93-104.


In a series of experiments in Belfast (Northern Ireland) and Santa Barbara (California) we used 10 sighted, 10 visually impaired, and 10 blind individuals matched for age, socio-economic status, and educational background to examine wayfinding. The participants were first required to take the experimenter over a familiar route to observe the types of behavior they exhibited. This established a performance base and provided a training exercise as participants undertook the set of tasks to be performed in the unfamiliar environment. Table 2 shows the aggregate results from participants' familiar environments. They were then required to learn a new route in completely unfamiliar environments. To do this the participants were given 4 trials: the first was an experimenter-guided trial and the next 3 were learning and evaluation trials.


Cognitive mapping without sight: Four preliminary studies of spatial learning

Jacobson, R.D. (1998) Cognitive mapping without sight: Four preliminary studies of spatial learning. Journal of Environmental Psychology, 18, 289-305.


This paper illustrates the application of cognitive mapping to people with visual impairments and blindness. It gives perspectives on past research, outlines ongoing research, highlights some of the methodological and validity issues arising from this research, and discusses the movement of theory into practice. The findings of three small preliminary studies are reported, as part of continuing research into the cognitive mapping abilities of blind or visually impaired people. These studies have highlighted the need to use multiple, mutually supportive tests to assess cognitive map knowledge. In light of these findings and the need to move theory into practice, a current research project is outlined. This project seeks to use the knowledge gained from the three studies to design and implement an auditory hypermap system to aid wayfinding and the spatial learning of an area. Finally an agenda for applied research is presented.


Wednesday, 15 May 2013

Navigation for the visually impaired: Going beyond tactile cartography

 Jacobson, R.D. (1994) Navigation for the visually impaired: Going beyond tactile cartography, Swansea Geographer, 31, 53-59.


Wayfinding for the visually handicapped is made more complex by the loss of the visual sense. In spite of this they can hold spatial concepts and are often competent navigators. Tactile maps, those sensed by touch, have been shown to improve their spatial awareness and mobility. It is, however, the development of a personal guidance system (PGS) relying on recently developed technologies that may herald a breakthrough for navigation for the blind and visually impaired. It would enable the visually handicapped to move more freely and independently through their environment. It would provide on-line interactions with representations of their environment, in audio or tactile form, providing orientation, location, and guidance information, enabling them to plan, monitor, and execute navigation decisions.


Tuesday, 14 May 2013

Spatial cognition through tactile mapping

 Jacobson, R.D. (1992) Spatial cognition through tactile mapping. Swansea Geographer 29, 79-88.

This paper describes an experiment to determine whether a tactile map of the University College of Swansea campus increases the spatial awareness of visually handicapped subjects.

Friday, 3 May 2013

Haptic Soundscapes

Towards making maps, diagrams and graphs accessible to visually impaired people 

The aim of this research project is to develop and evaluate haptic soundscapes, which allow people with little or no vision to interact with maps, diagrams, and graphs displayed via dissemination media, such as the World Wide Web, through sound, touch, and force feedback. Although of principal utility for people with severe visual impairments, it is anticipated that this interface will allow informative educational resources for children and people with learning difficulties to be developed and accessed through the Internet. The research project offers a simple yet innovative solution to accessing spatial data without the need for vision. It builds upon previous work carried out in various departments at UCSB and fosters inter-disciplinary links and cooperation between usually unconnected research groups. The research hopes to further knowledge and understanding in this emerging field and also to offer practical results that will impact on people's lives. It is strongly felt that the development of the project will lead to continued external funding, and it is our hope that this project will act as a springboard to further research in which UCSB will be a key component.

Further development, usability testing, and expansion
The Haptic Soundscapes project has developed a set of audio-tactile mapping tools to help blind people access spatial information and to aid research in multi-modal spatial cognition. These tools offer blind people access to the geographic world they cannot otherwise fully experience, creating opportunities for orientation, navigation, and education. Spatial knowledge from maps, charts, and graphs is obtained through display and interaction with sound, touch, and force-feedback devices. Individuals can use the audio-tactile mapping tools to explore an unknown environment or create an audio-tactile map from images displayed on a computer screen. These audio-tactile maps can be disseminated over the Internet or used in educational settings. Next year, several objectives are planned for the Haptic Soundscapes project. These include cognitive experiments to assess a user's ability to navigate within a scene, between adjacent scenes, and between scenes of different scales using the audio-tactile mapping tools. We will also expand the capability of the audio-tactile mapping system to include text-to-speech synthesis and real-time multi-dimensional sound representation. Several off-campus funding proposals will be submitted. Finally, we will showcase the tools developed in the course of this project by expanding our campus demonstrator: an interactive, navigable audio-tactile map of the UCSB campus.