Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Richard L. McKinley is active.

Publications


Featured research published by Richard L. McKinley.


Human Factors | 1999

Aurally aided visual search in three-dimensional space

Robert S. Bolia; William R. D'Angelo; Richard L. McKinley

We conducted an experiment to evaluate the effectiveness of spatial audio displays on target acquisition performance. Participants performed a visual search task with and without the aid of a spatial audio display. Potential target locations ranged between plus and minus 180° in azimuth and from -70° to +90° in elevation. Independent variables included the number of visual distractors present (1, 5, 10, 25, 50) and the spatial audio condition (no spatial audio, free-field spatial audio, virtual spatial audio). Results indicated that both free-field and virtual audio cues engendered a significant decrease in search times. Potential applications of this research include the design of spatial audio displays for aircraft cockpits and ground combat vehicles.


Human Factors | 1996

Aurally Aided Visual Search Under Virtual and Free-Field Listening Conditions

David R. Perrott; John Cisneros; Richard L. McKinley; William R. D'Angelo

We examined the minimum latency required to locate and identify a visual target (visual search) in a two-alternative forced-choice paradigm in which the visual target could appear from any azimuth (0° to 360°) and from a broad range of elevations (from 90° above to 70° below the horizon) relative to a person's initial line of gaze. Seven people were tested in six conditions: unaided search, three aurally aided search conditions, and two visually aided search conditions. Aurally aided search with both actual and virtual sound localization cues proved to be superior to unaided and visually guided search. Applications of synthesized three-dimensional and two-dimensional sound cues in workstations are discussed.


Human Factors | 1998

Effects of Localized Auditory Information on Visual Target Detection Performance Using a Helmet-Mounted Display

W. Todd Nelson; Lawrence J. Hettinger; James A. Cunningham; Bart J. Brickman; Michael W. Haas; Richard L. McKinley

An experiment was conducted to evaluate the effects of localized auditory information on visual target detection performance. Visual targets were presented on either a wide field-of-view dome display or a helmet-mounted display and were accompanied by either localized, nonlocalized, or no auditory information. The addition of localized auditory information resulted in significant increases in target detection performance and significant reductions in workload ratings as compared with conditions in which auditory information was either nonlocalized or absent. Qualitative and quantitative analyses of participants' head motions revealed that the addition of localized auditory information resulted in extremely efficient and consistent search strategies. Implications for the development and design of multisensory virtual environments are discussed. Actual or potential applications of this research include the use of spatial auditory displays to augment visual information presented in helmet-mounted displays, thereby leading to increases in performance efficiency, reductions in physical and mental workload, and enhanced spatial awareness of objects in the environment.


Human Factors | 2005

The Impact of Hearing Protection on Sound Localization and Orienting Behavior

Brian D. Simpson; Robert S. Bolia; Richard L. McKinley; Douglas S. Brungart

The effect of hearing protection devices (HPDs) on sound localization was examined in the context of an auditory-cued visual search task. Participants were required to locate and identify a visual target in a field of 5, 20, or 50 visual distractors randomly distributed on the interior surface of a sphere. Four HPD conditions were examined: earplugs, earmuffs, both earplugs and earmuffs simultaneously (double hearing protection), and no hearing protection. In addition, there was a control condition in which no auditory cue was provided. A repeated measures analysis of variance revealed significant main effects of HPD for both search time and head motion data (p < .05), indicating that the degree to which localization is disrupted by HPDs varies with the type of device worn. When both earplugs and earmuffs are worn simultaneously, search times and head motion are more similar to those found when no auditory cue is provided than when either earplugs or earmuffs alone are worn, suggesting that sound localization cues are so severely disrupted by double hearing protection that the listener can recover little or no information regarding the direction of sound source origin. Potential applications of this research include high-noise military, aerospace, and industrial settings in which HPDs are necessary but wearing double protection may compromise safety and/or performance.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1999

Spatial Audio Displays for Speech Communications: A Comparison of Free Field and Virtual Acoustic Environments

W. Todd Nelson; Robert S. Bolia; Mark A. Ericson; Richard L. McKinley

The ability of listeners to detect, identify, and monitor multiple simultaneous speech signals was measured in free field and virtual acoustic environments. Factorial combinations of four variables, including audio condition, spatial condition, the number of speech signals, and the sex of the talker were employed using a within-subjects design. Participants were required to detect the presentation of a critical speech signal among a background of non-signal speech events. Results indicated that spatial separation increased the percentage of correctly identified critical speech signals as the number of competing messages increased. These outcomes are discussed in the context of designing binaural speech displays to enhance speech communication in aviation environments.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1995

Aurally Aided Detection and Identification of Visual Targets

David R. Perrott; John Cisneros; Richard L. McKinley; William R. D'Angelo

The experiments described in this report provide baseline performance measures of aurally directed detection and search for visual targets in an observer's immediate space. While the simple target detection task was restricted to the frontal hemi-field (extending 180 degrees in azimuth and 150 degrees in elevation), visual search performance (discrimination of which of two light arrays was present on a given trial) was evaluated for both the frontal and rear hemi-fields. In both tasks, the capacity to process information from the visual channel was improved substantially (a 10-50 percent reduction in latency) when spatial information from the auditory modality was provided concurrently. While performance gains were greatest for events in the rear hemi-field and in the peripheral regions of the frontal hemi-field, significant effects were also evident for events within the subjects' central visual field. The relevance of these results to the development of virtual 3-D sound systems is discussed.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1995

An Initial Study of the Effects of 3-Dimensional Auditory Cueing on Visual Target Detection

Richard L. McKinley; William R. D'Angelo; Michael W. Haas; David R. Perrott; W. Todd Nelson; Lawrence J. Hettinger; Bart J. Brickman

Developments in virtual environment technology are enabling the rapid generation of systems that provide synthetic visual and auditory displays. The successful use of this technology in education, training, entertainment, and various other applications relies to a great extent on the effective combination of visual and auditory information. Little is known about the basic interactions between the auditory system and the visual system in real environments or virtual environments. Therefore, the purpose of the current study was to begin to assess the effectiveness of various combinations of visual-auditory information in supporting the performance of a common task (detecting targets) in a virtual environment.


Journal of the Acoustical Society of America | 2016

Military jet noise source imaging using multisource statistically optimized near-field acoustical holography

Alan T. Wall; Kent L. Gee; Tracianne B. Neilsen; Richard L. McKinley; Michael M. James

The identification of acoustic sources is critical to targeted noise reduction efforts for jets on high-performance tactical aircraft. This paper describes the imaging of acoustic sources from a tactical jet using near-field acoustical holography techniques. The measurement consists of a series of scans over the hologram with a dense microphone array. Partial field decomposition methods are performed to generate coherent holograms. Numerical extrapolation of data beyond the measurement aperture mitigates artifacts near the aperture edges. A multisource equivalent wave model is used that includes the effects of the ground reflection on the measurement. Multisource statistically optimized near-field acoustical holography (M-SONAH) is used to reconstruct apparent source distributions between 20 and 1250 Hz at four engine powers. It is shown that M-SONAH produces accurate field reconstructions for both inward and outward propagation in the region spanned by the physical hologram measurement. Reconstructions across the set of engine powers and frequencies suggest that directivity depends mainly on estimated source location; sources farther downstream radiate at a higher angle relative to the inlet axis. At some frequencies and engine powers, reconstructed fields exhibit multiple radiation lobes originating from overlapped source regions, which is a phenomenon relatively recently reported for full-scale jets.
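The core idea behind equivalent-wave-model holography can be illustrated with a minimal sketch: measure the field at "hologram" microphones, fit the strengths of a grid of candidate monopole sources by regularized least squares, then propagate those sources to any other point. This is not the paper's M-SONAH implementation; the geometry, frequency, and regularization below are invented for illustration.

```python
import numpy as np

def greens(src, mic, k):
    """Free-space monopole Green's function exp(ikr) / (4*pi*r) for each mic-source pair."""
    r = np.linalg.norm(mic[:, None, :] - src[None, :, :], axis=-1)
    return np.exp(1j * k * r) / (4 * np.pi * r)

rng = np.random.default_rng(0)
k = 2 * np.pi * 500 / 343.0                      # wavenumber at 500 Hz, c = 343 m/s

# Candidate equivalent sources along a line (a stand-in for the jet axis).
src = np.stack([np.linspace(0.0, 2.0, 21), np.zeros(21), np.zeros(21)], axis=1)

# "Hologram": 60 microphones scattered on a plane 0.5 m to the side.
mics = np.stack([rng.uniform(-0.5, 2.5, 60),
                 np.full(60, 0.5),
                 rng.uniform(-0.5, 0.5, 60)], axis=1)

# Synthesize measurements from one true monopole that sits on the source grid.
true_pos = src[10:11]                            # (1.0, 0, 0)
p_meas = greens(true_pos, mics, k)[:, 0]

# Tikhonov-regularized least squares for the equivalent-source strengths.
G = greens(src, mics, k)
lam = 1e-8
q = np.linalg.solve(G.conj().T @ G + lam * np.eye(len(src)), G.conj().T @ p_meas)

# Reconstruct the pressure at a validation point between the sources and the hologram.
val = np.array([[1.0, 0.25, 0.0]])
p_rec = (greens(src, val, k) @ q)[0]
p_true = greens(true_pos, val, k)[0, 0]
rel_err = abs(p_rec - p_true) / abs(p_true)      # small when the model fits the field
```

Because the true source lies on the candidate grid, the regularized solve recovers it and the reconstructed pressure agrees closely with the true field; the actual M-SONAH method adds partial field decomposition, aperture extrapolation, and a ground-reflection term on top of this basic inverse step.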


International Journal of Occupational Safety and Ergonomics | 2000

The Effects of Hearing Protectors on Auditory Localization: Evidence From Audio-Visual Target Acquisition

Robert S. Bolia; Richard L. McKinley

Response times (RT) in an audio-visual target acquisition task were collected from 3 participants while wearing either circumaural earmuffs, foam earplugs, or no hearing protection. Analyses revealed that participants took significantly longer to locate and identify an audio-visual target in both hearing protector conditions than they did in the unoccluded condition, suggesting a disturbance of the cues used by listeners to localize sounds in space. RTs were significantly faster in both hearing protector conditions than in a non-audio control condition, indicating that auditory localization was not completely disrupted. Results are discussed in terms of safety issues involved with wearing hearing protectors in an occupational environment.


AIAA/CEAS Aeroacoustics Conference | 2013

On the Evolution of Crackle in Jet Noise from High-Performance Engines

Kent L. Gee; Tracianne B. Neilsen; Michael B. Muhlestein; Alan T. Wall; J. Micah Downing; Michael M. James; Richard L. McKinley

Crackle, the impulsive quality sometimes present in supersonic jet noise, has traditionally been defined in terms of the pressure waveform skewness. However, recent work has shown that the pressure waveform time derivative is a better quantifier of the acoustic shocks believed to be responsible for its perception. This paper discusses two definitions of crackle, waveform asymmetry versus shock content, and crackle as a source or propagation-related phenomenon. Data from two static military jet aircraft tests are used to demonstrate that the skewed waveforms radiated from the jet undergo significant nonlinear steepening and shock formation, as evidenced by the skewness of the time derivative. Thus, although skewness is a source phenomenon, crackle’s perceived quality is heavily influenced by propagation through the near field and into the far field to the extent that crackle is caused by the presence of shock-like features in the waveform.
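The two quantifiers contrasted in the abstract, waveform skewness versus time-derivative skewness, are easy to compute. The sketch below uses a synthetic sawtooth as a crude stand-in for a shock-containing waveform (a sharp pressure rise followed by a slow decay); the sample rate and waveform are invented for illustration.

```python
def skewness(x):
    """Sample skewness: E[(x - mean)^3] / std^3."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n
    m3 = sum((v - mu) ** 3 for v in x) / n
    return m3 / m2 ** 1.5

def derivative(x, fs):
    """First-difference estimate of the time derivative at sample rate fs."""
    return [(b - a) * fs for a, b in zip(x, x[1:])]

fs = 10_000                    # samples per second (arbitrary for this sketch)
period = 100                   # samples per cycle
# Sawtooth: slow linear decay, then an abrupt jump back up (a shock-like feature).
p = [1.0 - 2.0 * (i % period) / period for i in range(10 * period)]

wave_skew = skewness(p)                    # near zero: the waveform itself is symmetric
deriv_skew = skewness(derivative(p, fs))   # strongly positive: rare, large positive spikes
```

The point the paper makes falls out directly: the waveform values are symmetrically distributed (skewness near zero), but the derivative is dominated by small negative values punctuated by rare large positive spikes at the shocks, so the derivative skewness is strongly positive.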

Collaboration


Dive into Richard L. McKinley's collaborations.

Top Co-Authors

Hilary L Gallagher, Air Force Research Laboratory
Alan T. Wall, Air Force Research Laboratory
Mark A. Ericson, Air Force Research Laboratory
Kent L. Gee, Brigham Young University
Robert S. Bolia, Wright-Patterson Air Force Base
Brian D. Simpson, Air Force Research Laboratory
Melissa A Theis, Oak Ridge Institute for Science and Education
W. Todd Nelson, Air Force Research Laboratory