
Publication


Featured research published by Bruce N. Walker.


International Symposium on Wearable Computers | 2007

SWAN: System for Wearable Audio Navigation

Jeff Wilson; Bruce N. Walker; Jeffrey Lindsay; Craig Cambias; Frank Dellaert

Wearable computers can certainly support audio-only presentation of information; a visual interface need not be present for effective user interaction. A system for wearable audio navigation (SWAN) is being developed to serve as a navigation and orientation aid for persons temporarily or permanently visually impaired. SWAN is a wearable computer consisting of audio-only output and tactile input via a handheld interface. SWAN aids a user in safe pedestrian navigation and includes the ability for the user to author new GIS data relevant to their needs of wayfinding, obstacle avoidance, and situational awareness support. Emphasis is placed on representing pertinent data with non-speech sounds through a process of sonification. SWAN relies on a geographic information system (GIS) infrastructure for supporting geocoding and spatialization of data. Furthermore, SWAN utilizes novel tracking technology.


Human Factors | 2006

Navigation Performance With a Virtual Auditory Display: Effects of Beacon Sound, Capture Radius, and Practice

Bruce N. Walker; Jeffrey Lindsay

Objective: We examined whether spatialized nonspeech beacons could guide navigation and how sound timbre, waypoint capture radius, and practice affect performance. Background: Auditory displays may assist mobility and wayfinding for those with temporary or permanent visual impairment, but they remain understudied. Previous systems have used speech-based interfaces. Method: Participants (108 undergraduates) navigated three maps, guided by one of three beacons (pink noise, sonar ping, or 1000-Hz pure tone) spatialized by a virtual reality engine. Dependent measures were efficiency of time and path length. Results: Overall navigation was very successful, with significant effects of practice and capture radius, and interactions with beacon sound. Overshooting and subsequent hunting for waypoints was exacerbated for small radius conditions. A human-scale capture radius (1.5 m) and sonar-like beacon yielded the optimal combination for safety and efficiency. Conclusion: The selection of beacon sound and capture radius depend on the specific application, including whether speed of travel or adherence to path are of primary concern. Extended use affects sound preferences and quickly leads to improvements in both speed and accuracy. Application: These findings should lead to improved wayfinding systems for the visually impaired as well as for first responders (e.g., firefighters) and soldiers.
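The capture-radius mechanic described above can be sketched in a few lines: a waypoint counts as reached once the listener enters its radius, and the beacon then hands off to the next waypoint. A minimal illustration (function and parameter names are my own, not the study's):

```python
import math

def next_waypoint_index(position, waypoints, current_index, capture_radius=1.5):
    """Advance to the next waypoint once the listener enters the capture radius.

    A very small radius forces precise homing on each waypoint (risking the
    overshoot-and-hunt behavior noted in the abstract); the human-scale 1.5 m
    radius found effective in the study lets the beacon hand off earlier.
    """
    wx, wy = waypoints[current_index]
    px, py = position
    if math.hypot(wx - px, wy - py) <= capture_radius:
        # Captured: move the beacon to the next waypoint (or stay on the last).
        return min(current_index + 1, len(waypoints) - 1)
    return current_index
```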


ACM Transactions on Applied Perception | 2005

Mappings and metaphors in auditory displays: An experimental assessment

Bruce N. Walker; Gregory Kramer

Auditory displays are becoming more and more common, but there are still no general guidelines for mapping data dimensions (e.g., temperature) onto display dimensions (e.g., pitch). This paper presents experimental research on different mappings and metaphors, in a generic process-control task environment, with reaction time and accuracy as dependent measures. It is hoped that this area of investigation will lead to the development of mapping guidelines applicable to auditory displays in a wide range of task domains.


Journal of Experimental Psychology: Applied | 2002

Magnitude estimation of conceptual data dimensions for use in sonification

Bruce N. Walker

Sonifications must match listener expectancies about representing data with sound. Three experiments showed the utility of magnitude estimation for this. In Experiment 1, 67 undergraduates judged the sizes of visual stimuli and the temperature, pressure, velocity, size, or dollars they represented. Similarly, in Experiment 2, 132 listeners judged the pitch or tempo of sounds and the data they represented. In both experiments, polarity and scaling preference depended on the conceptual data dimension. In Experiment 3, 60 listeners matched auditory graphs to data created with the results of Experiment 2, providing initial validation of scaling slopes. Magnitude estimation is proposed as a design tool in the development of data sonifications, with the level of polarity preference agreement predicting mapping effectiveness.
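The scaling results from such magnitude-estimation studies are typically summarized as a power-law exponent plus a polarity, which can then drive a sonification mapping. A hedged sketch of how a fitted slope and polarity might be applied, with illustrative default values (not the paper's fitted numbers):

```python
def data_to_frequency(value, v_min, v_max, slope=1.0, positive_polarity=True,
                      f_min=220.0, f_max=880.0):
    """Map a data value onto pitch via a power-law scaling function.

    `slope` plays the role of a fitted magnitude-estimation exponent;
    `positive_polarity` flips the mapping for conceptual dimensions
    where listeners prefer more-data -> lower-pitch.
    """
    t = (value - v_min) / (v_max - v_min)   # normalize to [0, 1]
    t = t ** slope                          # apply the scaling exponent
    if not positive_polarity:
        t = 1.0 - t                         # negative-polarity mapping
    return f_min + t * (f_max - f_min)
```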


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2004

Map-based priors for localization

Sang Min Oh; Sarah Tariq; Bruce N. Walker; Frank Dellaert

Localization from sensor measurements is a fundamental task for navigation. Particle filters are among the most promising candidates to provide a robust and real-time solution to the localization problem. They instantiate the localization problem as a Bayesian filtering problem and approximate the posterior density over location by a weighted sample set. In this paper, we introduce map-based priors for localization, using the semantic information available in maps to bias the motion model toward areas of higher probability. We show that such priors, under a particular assumption, can easily be incorporated in the particle filter by means of a pseudo-likelihood. The resulting filter is more reliable and more accurate. We show experimental results on a GPS-based outdoor people tracker that illustrate the approach and highlight its potential.
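The core idea, multiplying a map-derived prior into each particle's weight as a pseudo-likelihood, can be sketched as follows; all functions here are illustrative stand-ins rather than the authors' implementation:

```python
import random

def resample(particles, weights):
    """Multinomial resampling: draw particles in proportion to their weights."""
    return random.choices(particles, weights=weights, k=len(particles))

def particle_filter_step(particles, motion, sensor_likelihood, map_prior):
    """One particle-filter update with a map-based pseudo-likelihood.

    `map_prior(x)` returns the prior probability of location x taken from
    map semantics (e.g., high on sidewalks, near zero inside buildings);
    multiplying it into the weight biases the posterior toward plausible
    areas, which is the effect the paper describes.
    """
    moved = [motion(p) for p in particles]
    weights = [sensor_likelihood(p) * map_prior(p) for p in moved]
    return resample(moved, weights)
```

For example, if the map prior assigns zero probability to locations inside a building, particles that drift indoors are eliminated at the resampling step.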


Reviews of Human Factors and Ergonomics | 2011

Auditory displays for in-vehicle technologies

Michael A. Nees; Bruce N. Walker

Modern vehicle cockpits have begun to incorporate a number of information-rich technologies, including systems to enhance and improve driving and navigation performance and also driving-irrelevant information systems. The visually intensive nature of the driving task requires these systems to adopt primarily nonvisual means of information display, and the auditory modality represents an obvious alternative to vision for interacting with in-vehicle technologies (IVTs). Although the literature on auditory displays has grown tremendously in recent decades, to date, few guidelines or recommendations exist to aid in the design of effective auditory displays for IVTs. This chapter provides an overview of the current state of research and practice with auditory displays for IVTs. The role of basic auditory capabilities and limitations as they relate to in-vehicle auditory display design is discussed. Extant systems and prototypes are reviewed, and when possible, design recommendations are made. Finally, research needs and an iterative design process to meet those needs are discussed. Keywords: driver distraction


ACM Transactions on Accessible Computing | 2010

Universal Design of Auditory Graphs: A Comparison of Sonification Mappings for Visually Impaired and Sighted Listeners

Bruce N. Walker; Lisa M. Mauney

Determining patterns in data is an important and often difficult task for scientists and students. Unfortunately, graphing and analysis software typically is largely inaccessible to users with vision impairment. Using sound to represent data (i.e., sonification or auditory graphs) can make data analysis more accessible; however, there are few guidelines for designing such displays for maximum effectiveness. One crucial yet understudied design issue is exactly how changes in data (e.g., temperature) are mapped onto changes in sound (e.g., pitch), and how this may depend on the specific user. In this study, magnitude estimation was used to determine preferred data-to-display mappings, polarities, and psychophysical scaling functions relating data values to underlying acoustic parameters (frequency, tempo, or modulation index) for blind and visually impaired listeners. The resulting polarities and scaling functions are compared to previous results with sighted participants. There was general agreement about polarities obtained with the two listener populations, with some notable exceptions. There was also evidence for strong similarities regarding the magnitudes of the slopes of the scaling functions, again with some notable differences. For maximum effectiveness, sonification software designers will need to consider carefully their intended users’ vision abilities. Practical implications and limitations are discussed.


Human Factors | 2013

Spearcons (Speech-Based Earcons) Improve Navigation Performance in Advanced Auditory Menus

Bruce N. Walker; Jeffrey Lindsay; Amanda Nance; Yoko Nakano; Dianne K. Palladino; Tilman Dingler; Myounghoon Jeon

Objective: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, in comparison to other auditory cues, with the aim of improving auditory menu navigation. Background: With the shrinking displays of mobile devices and increasing technology use by visually impaired users, it becomes important to improve the usability of non-graphical interfaces such as auditory menus. Using nonspeech sounds called auditory icons (i.e., representative real sounds of objects or events) or earcons (i.e., brief musical melody patterns) has been proposed to enhance menu navigation. To compensate for the weaknesses of traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognized as speech. Method: The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy among cues. In Experiments 3 and 4, they evaluated the learning rate of cues and of speech itself. In Experiment 5, they assessed spearcon enhancements compared to plain TTS (text-to-speech: spoken renderings of written menu items) in a two-dimensional auditory menu. Results: Spearcons outperformed traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to that of normal speech and led to better performance than speech-only auditory cues in two-dimensional menu navigation. Conclusion: These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. Application: Spearcons have broadened the taxonomy of nonspeech auditory cues. Users can benefit from the application of spearcons in real devices.
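The spearcon idea, speeding a spoken phrase past the point of recognizable speech, can be caricatured with naive sample decimation. Note that real spearcon generation uses pitch-preserving time-scale modification, which this sketch does not attempt; it only illustrates the compression step:

```python
def naive_time_compress(samples, factor=2.5):
    """Crudely shorten an audio signal by keeping every `factor`-th sample.

    Proper spearcon generation would use a pitch-preserving algorithm
    (e.g., SOLA or a phase vocoder); plain decimation like this also
    raises the pitch, so treat it as a sketch of the concept only.
    """
    out = []
    i = 0.0
    while i < len(samples):
        out.append(samples[int(i)])
        i += factor
    return out
```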


International Conference on Automotive User Interfaces and Interactive Vehicular Applications | 2009

Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies

Myounghoon Jeon; Benjamin K. Davison; Michael A. Nees; Jeff Wilson; Bruce N. Walker

Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation tasks or verbal tasks such as reading digit strings). The present study used a dual-task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles---rendered as an auditory menu---while they concurrently played a simple, perceptual-motor, ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Performance on both the primary task (success rate in the game) and the secondary task (menu search time) was better with the auditory menus than with no sound. Subjective workload scores (NASA TLX) and user preferences favored the enhanced auditory cue types. Results are discussed in terms of multiple resources theory and practical IVT design applications.
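The spindex concept can be sketched as a mapping from menu items to initial-letter cues; this is an illustrative reconstruction, not the study's implementation:

```python
def build_spindex(menu_items):
    """Map each menu item to its spindex cue: the item's initial letter.

    In a real system each letter would be rendered as a very brief audio
    clip played before (or instead of) the full TTS of the title, so a
    user scrolling quickly through an alphabetized list hears only
    "A, A, B, B, ..." and can judge their position in the list.
    """
    return [item[0].upper() for item in menu_items]
```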


IEEE Virtual Reality Conference | 2005

Exploring individual differences in ray-based selection: strategies and traits

Chadwick A. Wingrave; Ryan Tintner; Bruce N. Walker; Doug A. Bowman; Larry F. Hodges

User-centered design is often performed without regard to individual user differences in aptitude and experience. This study takes an anthropological, observational approach: users were observed performing a selection task with common virtual environment ray-based techniques, and their interaction was analyzed through psychological aptitude tests, questionnaires, and observation. The results show that this approach yields useful information about users even in a simple task. The study indicates correlations between performance, aptitude test scores, and the behaviors users adopted to overcome difficulties in the task.

Collaboration


Dive into Bruce N. Walker's collaborations.

Top Co-Authors

Thomas M. Gable (Georgia Institute of Technology)

Myounghoon Jeon (Michigan Technological University)

Jeffrey Lindsay (Georgia Institute of Technology)

Brianna J. Tomlinson (Georgia Institute of Technology)

Carrie Bruce (Georgia Institute of Technology)

Jonathan H. Schuett (Georgia Institute of Technology)

Benjamin K. Davison (Georgia Institute of Technology)

Brittany E. Noah (Georgia Institute of Technology)