David R. Perrott
California State University
Publication
Featured research published by David R. Perrott.
Nature | 1999
Kourosh Saberi; David R. Perrott
Speech is the most complex auditory signal and requires the most processing. The human brain devotes large cortical areas to deciphering the information it contains and to parsing speech sounds produced simultaneously by several speakers. The brain can also invoke corrective measures to restore distortions in speech; for example, if a brief speech sound is replaced by an interfering sound that masks it, such as a cough, the listener perceives the missing speech as if the brain had interpolated through the absent segment. We have studied the intelligibility of speech and find that it is resistant to time reversal of local segments of a spoken sentence, which has been described as “the most drastic form of time-scale distortion”.
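The "time reversal of local segments" manipulation can be sketched as follows: the signal is cut into fixed-length segments and each segment is played backwards, while the segment order is preserved. The segment length and the toy signal here are illustrative choices, not parameters from the study.

```python
def locally_reverse(samples, segment_len):
    """Reverse each consecutive fixed-length segment of `samples`,
    keeping the segments themselves in their original order."""
    out = []
    for start in range(0, len(samples), segment_len):
        segment = samples[start:start + segment_len]
        out.extend(reversed(segment))
    return out

# Toy "signal": two segments of four samples each.
signal = [1, 2, 3, 4, 5, 6, 7, 8]
print(locally_reverse(signal, 4))  # [4, 3, 2, 1, 8, 7, 6, 5]
```

With short segments the global structure of the sentence survives even though every local waveform is reversed, which is what makes the intelligibility result above surprising.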
Attention Perception & Psychophysics | 1990
David R. Perrott; Kourosh Saberi; Kathleen Brown; Thomas Z. Strybel
In Experiments 1 and 2, the time to locate and identify a visual target (visual search performance in a two-alternative forced-choice paradigm) was measured as a function of the location of the target relative to the subject’s initial line of gaze. In Experiment 1, tests were conducted within a 260° region on the horizontal plane at a fixed elevation (eye level). In Experiment 2, the position of the target was varied in both the horizontal (260°) and the vertical (±46° from the initial line of gaze) planes. In both experiments, and for all locations tested, the time required to conduct a visual search was reduced substantially (175–1,200 msec) when a 10-Hz click train was presented from the same location as that occupied by the visual target. Significant differences in latencies were still evident when the visual target was located within 10° of the initial line of gaze (central visual field). In Experiment 3, we examined head and eye movements that occur as subjects attempt to locate a sound source. Concurrent movements of the head and eyes are commonly encountered during auditorily directed search behavior. In over half of the trials, eyelid closures were apparent as the subjects attempted to orient themselves toward the sound source. The results from these experiments support the hypothesis that the auditory spatial channel has a significant role in regulating visual gaze.
Journal of the Acoustical Society of America | 1990
David R. Perrott; Kourosh Saberi
Minimum audible angle (MAA) thresholds were obtained for four subjects in a two-alternative, forced-choice, three-up/one-down adaptive paradigm as a function of the orientation of the array of sources. With sources distributed on the horizontal plane, the mean MAA threshold was 0.97 degrees. With the sources distributed on the vertical plane (array rotated 90 degrees), the mean MAA threshold was 3.65 degrees. Performance in both conditions was well in line with previous experiments of this type. Tests were also conducted with sources distributed on oblique planes. As the array was rotated from 10 to 60 degrees from the horizontal plane, relatively little change in the MAA threshold was observed; mean MAA thresholds ranged from 0.78 to 1.06 degrees. Only when the array was nearly vertical (80 degrees) was there any appreciable loss in spatial resolution; the MAA threshold increased to 1.8 degrees. The relevance of these results to research on auditory localization under natural listening conditions, especially in the presence of head movements, is also discussed.
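The adaptive rule named above can be sketched as a simulated staircase. Read here as: the angle shrinks after three consecutive correct responses and grows after any error, so the track converges near the 79.4%-correct point of the psychometric function. The psychometric function, step size, and starting angle below are assumptions for illustration, not values from the paper.

```python
import math
import random

def three_down_one_up(p_correct, start, step, n_reversals=8, floor=0.1):
    """Adaptive track: step the angle down after 3 consecutive correct
    simulated responses, up after any error; estimate threshold as the
    mean angle at the first `n_reversals` direction reversals."""
    angle, run, last_dir, reversals = start, 0, 0, []
    while len(reversals) < n_reversals:
        if random.random() < p_correct(angle):    # simulated trial
            run += 1
            if run < 3:
                continue                          # need 3 in a row to step down
            run, direction = 0, -1
        else:
            run, direction = 0, +1
        if last_dir and direction != last_dir:    # track direction flipped
            reversals.append(angle)
        last_dir = direction
        angle = max(floor, angle + direction * step)
    return sum(reversals) / len(reversals)

# Hypothetical 2AFC psychometric function (chance = 0.5), not from the paper.
p = lambda a: 0.5 + 0.5 / (1.0 + math.exp(-4.0 * (a - 1.0)))
random.seed(0)
print(round(three_down_one_up(p, start=4.0, step=0.25), 2))
```

Averaging reversal angles is one common threshold estimator for such tracks; the study's exact scoring procedure is not specified in the abstract.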
Journal of the Acoustical Society of America | 1991
Kourosh Saberi; Lynda Dostal; Toktam Sadralodabai; Vicki Bull; David R. Perrott
Free-field release from masking was studied as a function of the spatial separation of a signal and masker in a two-interval, forced-choice (2IFC) adaptive paradigm. The signal was a 250-ms train of clicks (100/s) generated by filtering 50-microsecond pulses with a TDH-49 speaker (0.9 to 9.0 kHz). The masker was continuous broadband (0.7 to 11 kHz) white noise presented at a level of 44 dBA measured at the position of the subject's head. In experiment I, masked and absolute thresholds were measured for 36 signal source locations (10-degree increments) along the horizontal plane as a function of seven masking source locations (30-degree increments). In experiment II, both absolute and masked thresholds were measured for seven signal locations along three vertical planes located at azimuthal rotations of 0 degrees (median vertical plane), 45 degrees, and 90 degrees. In experiment III, monaural absolute and masked thresholds were measured for various signal-masker configurations. Masking-level differences (MLDs) were computed relative to the condition where the signal and masker were in front of the subject, after using absolute thresholds to account for differences in the signal's sound-pressure level (SPL) due to direction. Maximum MLDs were 15 dB along the horizontal plane, 8 dB along the vertical plane, and 9 dB under monaural conditions.
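The MLD bookkeeping described above, as I read it, first corrects each masked threshold by the absolute threshold at that signal location (removing direction-dependent SPL differences), then references the frontal signal/masker condition. A minimal sketch, with all numeric values made up for illustration:

```python
def mld_db(masked, absolute, masked_front, absolute_front):
    """Masking-level difference in dB for one signal/masker configuration,
    relative to the frontal reference condition. Each masked threshold is
    first corrected by the absolute threshold at the same location."""
    effective = masked - absolute                  # masking at this location
    effective_front = masked_front - absolute_front  # masking, frontal reference
    return effective_front - effective             # positive = release from masking

# Illustrative numbers only (dB), not data from the study:
print(mld_db(masked=30.0, absolute=5.0, masked_front=42.0, absolute_front=2.0))
# 15.0
```

A positive value means the spatially separated configuration needed less signal level than the frontal one, i.e., a release from masking.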
Human Factors | 1991
David R. Perrott; Toktam Sadralodabai; Kourosh Saberi; Thomas Z. Strybel
Visual search performance was examined in a two-alternative, forced-choice paradigm. The task involved locating and identifying which of two visual targets was present on a trial. The location of the targets varied relative to the subject's initial fixation point from 0 to 14.8 deg. The visual targets were either presented concurrently with a sound located at the same position as the visual target or were presented in silence. Both the number of distractor visual figures (0-63) present in the field during the search (Experiments 1 and 2) and the distinctness of the visual target relative to the distractors (Experiment 2) were considered. Under all conditions, visual search latencies were reduced when spatially correlated sounds were present. Aurally guided search was particularly enhanced when the visual target was located in the peripheral regions of the central visual field and when a larger number of distractor images (63) were present. Similar results were obtained under conditions in which the target was visually enhanced. These results indicate that spatially correlated sounds may have considerable utility in high-information environments (e.g., piloting an aircraft).
Human Factors | 1996
David R. Perrott; John Cisneros; Richard L. McKinley; William R. D'Angelo
We examined the minimum latency required to locate and identify a visual target (visual search) in a two-alternative forced-choice paradigm in which the visual target could appear from any azimuth (0° to 360°) and from a broad range of elevations (from 90° above to 70° below the horizon) relative to a person's initial line of gaze. Seven people were tested in six conditions: unaided search, three aurally aided search conditions, and two visually aided search conditions. Aurally aided search with both actual and virtual sound localization cues proved to be superior to unaided and visually guided search. Application of synthesized three-dimensional and two-dimensional sound cues in workstations is discussed.
Journal of the Acoustical Society of America | 1987
David R. Perrott; Hamlet Ambarsoom; Julianna Tucker
Two experiments examined the capacity of listeners to turn and face an active sound source. Tests were conducted with sources located in the subject's forward field (an arc extending from 60 degrees to the subject's right to 60 degrees to the left). Localization performance was determined under both monaural and binaural listening conditions, using both brief pulses and sustained pulse trains as target signals. Not unexpectedly, the ability to orient the face to a hidden sound source was very poor under monaural conditions if the listener received only a brief (100-ms) tonal pulse. When continuous pulse trains were employed, localization, even under monaural conditions, became quite accurate. Across conditions, this complex motor response produced results in agreement with those that have been obtained when subjects were only required to report their spatial impressions. In particular, performance with binaural pulse trains was observed to vary as a function of the frequency of the target signals employed. Descriptions of the head movement response, along with a discussion of some of the implications of ear-head coordination, are presented.
Journal of the Acoustical Society of America | 1984
David R. Perrott
Minimum audible angle (MAA) was measured for simultaneous acoustic events. Localization of concurrent events was found to be a direct function of the spectral differences between the events, the angle between the sources, and the location of the sources within the field defined by the subject. In the latter case, the MAA was smallest with sources placed symmetrically about the listener's median plane and maximal at the extreme lateral portions. Post hoc tests indicated that the spectral limits for concurrent localization depend both upon the angular separation of the sources and the position within the field as defined by the locus of the subject. The functions obtained approach the values reported by Mills [J. Acoust. Soc. Am. 30, 237-246 (1958)] as the temporal overlap between the concurrent events decreased. The present results suggest that a single localization function may exist, with optimal performance observed with fully successive stimuli and poorest performance in the condition involving simultaneous events. The implications of these results are discussed.
Journal of the Acoustical Society of America | 1990
Kourosh Saberi; David R. Perrott
Auditory resolution of moving sound sources was determined in a simulated motion paradigm for sources traveling along horizontal, vertical, or oblique orientations in the subject's frontal plane. With motion restricted to the horizontal orientation, minimum audible movement angles (MAMAs) ranged from about 1.7 degrees at the lowest velocity (1.8 degrees/s) to roughly 10 degrees at the highest velocity (320 degrees/s). With the sound moving along an oblique orientation (rotated 45 degrees relative to the horizontal), MAMAs generally matched those of the horizontal condition. When motion was restricted to the vertical, MAMAs were substantially larger at all velocities (often exceeding 8 degrees). Subsequent tests indicated that MAMAs are a U-shaped function of velocity, with optimum resolution obtained at about 2 degrees/s for the horizontal (and oblique) orientations and 7-11 degrees/s for the vertical orientation. Additional tests conducted at a fixed velocity of 1.8 degrees/s along oblique orientations of 80 degrees and 87 degrees indicated that even a small deviation from the vertical had a significant impact on MAMAs. A displacement of 10 degrees from the vertical orientation (a slope of 80 degrees) was sufficient to reduce thresholds (obtained at a velocity of 1.8 degrees/s) from about 11 degrees to approximately 2 degrees (a fivefold increase in acuity). These results are in good agreement with our previous study of minimum audible angles along oblique planes [Perrott and Saberi, J. Acoust. Soc. Am. 87, 1728-1731 (1990)].
Journal of the Acoustical Society of America | 1969
David R. Perrott; Michael A. Nelson
This study investigates the probability of detecting binaural beats as a function of the frequency of the standard signal used (f1) and of the dichotic frequency difference (|f1−f2|). The present findings indicate that the upper frequency limit for the perception of binaural beats depends upon the dichotic frequency differences employed, the psychophysical procedure used, and the criteria for a beat threshold. The probability of beat detections was maximal at 500 Hz, and decreased as frequency increased up to 1500 Hz. No reliable pattern of beat detections was observed above 1500 Hz. Binaural beats were obtained with both “fused” and “nonfused” dichotically presented signals. Implications for a general model were discussed.
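A binaural beat arises when each ear receives a slightly different pure tone; the perceived fluctuation rate equals the dichotic difference |f1 − f2|. A minimal sketch of such a dichotic pair, with sample rate, duration, and frequencies chosen for illustration rather than taken from the study:

```python
import math

def dichotic_pair(f1, f2, duration=1.0, fs=8000):
    """Return (left, right) sample lists: a pure tone at f1 Hz for one
    ear and at f2 Hz for the other. Presented dichotically, the pair
    produces a binaural beat at |f1 - f2| Hz."""
    n = int(duration * fs)
    left = [math.sin(2 * math.pi * f1 * i / fs) for i in range(n)]
    right = [math.sin(2 * math.pi * f2 * i / fs) for i in range(n)]
    return left, right

# Example near the 500-Hz region where beat detection was maximal:
f1, f2 = 500, 503
left, right = dichotic_pair(f1, f2, duration=0.01)
print(abs(f1 - f2))  # 3  (beat rate in Hz)
```

Note that the beat is a central percept: neither ear's waveform fluctuates by itself, so the effect reflects binaural interaction rather than acoustic mixing.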