Publication


Featured research published by Jeffrey Lindsay.


International Symposium on Wearable Computers | 2007

SWAN: System for Wearable Audio Navigation

Jeff Wilson; Bruce N. Walker; Jeffrey Lindsay; Craig Cambias; Frank Dellaert

Wearable computers can certainly support audio-only presentation of information; a visual interface need not be present for effective user interaction. A system for wearable audio navigation (SWAN) is being developed to serve as a navigation and orientation aid for persons temporarily or permanently visually impaired. SWAN is a wearable computer consisting of audio-only output and tactile input via a handheld interface. SWAN aids a user in safe pedestrian navigation and includes the ability for the user to author new GIS data relevant to their needs of wayfinding, obstacle avoidance, and situational awareness support. Emphasis is placed on representing pertinent data with non-speech sounds through a process of sonification. SWAN relies on a geographic information system (GIS) infrastructure for supporting geocoding and spatialization of data. Furthermore, SWAN utilizes novel tracking technology.
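SWAN's emphasis on representing pertinent data with non-speech sounds can be illustrated with a toy sonification: mapping the distance to a nearby obstacle onto a beep repetition rate, so that nearer obstacles beep faster. The parameter names and the linear mapping below are illustrative assumptions, not SWAN's actual design.

```python
def obstacle_pulse_rate(distance_m, max_rate_hz=8.0, min_rate_hz=0.5, range_m=10.0):
    """Map distance to an obstacle onto a beep repetition rate (Hz).

    A toy sonification in the spirit of SWAN's non-speech cues:
    distance 0 m yields the fastest pulse rate, distances at or
    beyond range_m yield the slowest. All constants are
    hypothetical, chosen only for illustration.
    """
    d = min(max(distance_m, 0.0), range_m)  # clamp to [0, range_m]
    return max_rate_hz - (max_rate_hz - min_rate_hz) * (d / range_m)
```

A real system would drive an audio synthesis engine with this rate; the point here is only the distance-to-sound mapping step that sonification requires.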


Human Factors | 2006

Navigation Performance With a Virtual Auditory Display: Effects of Beacon Sound, Capture Radius, and Practice

Bruce N. Walker; Jeffrey Lindsay

Objective: We examined whether spatialized nonspeech beacons could guide navigation and how sound timbre, waypoint capture radius, and practice affect performance. Background: Auditory displays may assist mobility and wayfinding for those with temporary or permanent visual impairment, but they remain understudied. Previous systems have used speech-based interfaces. Method: Participants (108 undergraduates) navigated three maps, guided by one of three beacons (pink noise, sonar ping, or 1000-Hz pure tone) spatialized by a virtual reality engine. Dependent measures were time and path-length efficiency. Results: Overall navigation was very successful, with significant effects of practice and capture radius, and interactions with beacon sound. Overshooting and subsequent hunting for waypoints was exacerbated in small-radius conditions. A human-scale capture radius (1.5 m) and a sonar-like beacon yielded the optimal combination for safety and efficiency. Conclusion: The selection of beacon sound and capture radius depends on the specific application, including whether speed of travel or adherence to path is of primary concern. Extended use affects sound preferences and quickly leads to improvements in both speed and accuracy. Application: These findings should lead to improved wayfinding systems for the visually impaired as well as for first responders (e.g., firefighters) and soldiers.
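The waypoint-capture logic the study varies can be sketched in a few lines: the beacon is rendered at the current waypoint, and once the listener comes within the capture radius the beacon advances to the next one. This is a minimal sketch assuming 2-D positions in metres; the function name and route representation are assumptions, not the study's implementation.

```python
import math

def distance(a, b):
    """Euclidean distance between two 2-D points (metres)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def advance_waypoint(position, waypoints, index, capture_radius=1.5):
    """Advance past every waypoint within the capture radius.

    The spatialized beacon is always rendered at waypoints[index];
    when the listener is within capture_radius metres of it, the
    index moves on. The 1.5 m default reflects the human-scale
    radius the study found optimal; too small a radius led to
    overshooting and hunting for the waypoint.
    """
    while index < len(waypoints) and distance(position, waypoints[index]) <= capture_radius:
        index += 1
    return index
```

Each frame, the navigation loop would call this with the tracked position and then spatialize the beacon sound at the returned waypoint.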


Human Factors | 2013

Spearcons (Speech-Based Earcons) Improve Navigation Performance in Advanced Auditory Menus

Bruce N. Walker; Jeffrey Lindsay; Amanda Nance; Yoko Nakano; Dianne K. Palladino; Tilman Dingler; Myounghoon Jeon

Objective: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, in comparison to other auditory cues with the aim of improving auditory menu navigation. Background: With the shrinking displays of mobile devices and increasing technology use by visually impaired users, it becomes important to improve the usability of non-GUI interfaces such as auditory menus. Using nonspeech sounds called auditory icons (i.e., representative real sounds of objects or events) or earcons (i.e., brief musical melody patterns) has been proposed to enhance menu navigation. To compensate for the weaknesses of traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognized as speech. Method: The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy among cues. In Experiments 3 and 4, they evaluated the learning rate of cues and of speech itself. In Experiment 5, they assessed spearcon enhancements compared to plain TTS (text-to-speech rendering of written menu items) in a two-dimensional auditory menu. Results: Spearcons outperformed traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to normal speech and led to better performance than speech-only auditory cues in two-dimensional menu navigation. Conclusion: These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. Application: Spearcons have broadened the taxonomy of nonspeech auditory cues. Users can benefit from the application of spearcons in real devices.
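The core spearcon idea, compressing a spoken phrase until it may no longer be recognizable as speech while remaining uniquely tied to that phrase, can be sketched as a naive decimation over an audio sample buffer. This simple sketch also raises the pitch as a side effect; pitch-preserving time-scale modification (e.g., a phase vocoder) would be the usual production choice, and the function name and speed-up factor here are illustrative assumptions.

```python
def make_spearcon(samples, speedup=2.5):
    """Naively speed up a speech clip by subsampling it.

    `samples` is a sequence of audio samples (e.g., a TTS rendering
    of a menu item). The output keeps roughly 1/speedup of the
    samples, shortening the clip by that factor. A real spearcon
    pipeline would instead use pitch-preserving time compression.
    """
    n_out = max(1, int(len(samples) / speedup))
    return [samples[int(i * speedup)] for i in range(n_out)]
```

Applied to every item in a menu, this yields one short, distinct cue per item, which is what gives spearcons their speed advantage over reading each item aloud.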


Conference on Computers and Accessibility | 2004

The Audio Abacus: Representing Numerical Values With Nonspeech Sound for the Visually Impaired

Bruce N. Walker; Jeffrey Lindsay; Justin Godfrey

Point estimation is a relatively unexplored facet of sonification. We present a new computer application, the Audio Abacus, designed to transform numbers into tones following the analogy of an abacus. As this is an entirely novel approach to sonifying exact data values, we have begun a systematic line of investigation into the application settings that work most effectively. Results are presented for an initial study. Users were able to perform relatively well with very little practice or training, boding well for this type of display. Further investigations are planned. This could prove to be very useful for visually impaired individuals given the common nature of numerical data in everyday settings.
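One plausible digit-to-tone mapping in the spirit of the abacus analogy: each digit occupies a column (its place value), and the digit's value sets the tone's pitch within that column. The base frequency, semitone spacing, and output format below are hypothetical illustrations, not the Audio Abacus's actual parameterization.

```python
def number_to_tones(value, base_freq=220.0, semitone=2 ** (1 / 12)):
    """Map each digit of an integer to a tone, abacus-style.

    One tone per digit, from the most significant digit down.
    Digit d sounds d semitones above base_freq; the returned
    (place, frequency_Hz) pairs let a player sequence or pan the
    tones by place value, as columns are laid out on an abacus.
    """
    digits = [int(c) for c in str(abs(value))]
    places = range(len(digits) - 1, -1, -1)
    return [(p, round(base_freq * semitone ** d, 2)) for p, d in zip(places, digits)]
```

For example, 42 becomes two tones: the tens column four semitones above the base frequency, the ones column two semitones above it.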


Assistive Technology | 2005

Using Virtual Environments to Prototype Auditory Navigation Displays

Bruce N. Walker; Jeffrey Lindsay

There is a critical need for navigation and orientation aids for the visually impaired. Developing such displays is difficult and time consuming due to the lack of design tools and guidelines, the inefficiency of trial-and-error design, and experimental participant safety concerns. We discuss using a virtual environment (VE) to help in the design, evaluation, and iterative refinement of an auditory navigation system. We address questions about the (real) interface that the VE version allows us to study. Examples include sound design, system behavior, and user interface design. Improved designs should result from a more systematic and scientific method of assistive technology development. We also point out some of the ongoing caveats that researchers in this field need to consider, especially relating to external validity and over-reliance on VE for design solutions.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2005

Development and Evaluation of a System for Wearable Audio Navigation

Bruce N. Walker; Jeffrey Lindsay

If it is not possible to use vision when navigating through one's surroundings, moving safely and effectively becomes much harder. In such cases, non-speech audio cues can serve as navigation beacons, as well as denote features in the environment relevant to the user. This paper outlines and summarizes the development and evaluation of a System for Wearable Audio Navigation (SWAN), including an overview of completed, ongoing, and future research relating to the sounds used, the human-system interaction, output hardware, divided attention, and task effects.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2006

The Effect of a Speech Discrimination Task on Navigation in a Virtual Environment

Bruce N. Walker; Jeffrey Lindsay

If the input from the visual system is unavailable (e.g., damage to the optic nerves or smoke in a burning building), navigating and avoiding obstacles becomes very challenging. It is therefore desirable to develop a navigation aid for use where visual input has become unavailable. There is a small body of research concerning such navigation aids and their efficacy. However, many issues that may have serious human factors repercussions for such a system are unexplored. This study examined the effect of an attentionally demanding distractor task on wayfinding performance with an audio-only navigation aid, in this case the System for Wearable Audio Navigation (SWAN). A difficult secondary speech task reduced navigation efficiency, as users switched attentional resources to the speech task. Practical applications are discussed.


International Conference on Auditory Display | 2006

Spearcons: Speech-Based Earcons Improve Navigation Performance in Auditory Menus

Bruce N. Walker; Amanda Nance; Jeffrey Lindsay


International Conference on Auditory Display | 2005

Navigation Performance in a Virtual Environment With Bonephones

Bruce N. Walker; Jeffrey Lindsay


Archive | 2008

LEARNABILTIY OF SOUND CUES FOR ENVIRONMENTAL FEATURES: AUDITORY ICONS, EARCONS, SPEARCONS, AND SPEECH

Tilman Dingler; Jeffrey Lindsay; Bruce N. Walker; Ludwig-Maximilians-Universität München; Forschungseinheit Medieninformatik

Collaboration


Jeffrey Lindsay's top co-authors:

Bruce N. Walker (Georgia Institute of Technology)
Amanda Nance (Georgia Institute of Technology)
Dianne K. Palladino (Georgia Institute of Technology)
Justin Godfrey (Georgia Institute of Technology)
Yoko Nakano (Georgia Institute of Technology)
Craig Cambias (Georgia Institute of Technology)
Frank Dellaert (Georgia Institute of Technology)
Jeff Wilson (Georgia Institute of Technology)
Myounghoon Jeon (Michigan Technological University)