Andrew J. Kolarik
University of Cambridge
Publication
Featured research published by Andrew J. Kolarik.
Hearing Research | 2014
Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan; Brian C. J. Moore
There is currently considerable interest in the consequences of loss in one sensory modality on the remaining senses. Much of this work has focused on the development of enhanced auditory abilities among blind individuals, who are often able to use sound to navigate through space. It has now been established that many blind individuals produce sound emissions and use the returning echoes to provide them with information about objects in their surroundings, in a similar manner to bats navigating in the dark. In this review, we summarize current knowledge regarding human echolocation. Some blind individuals develop remarkable echolocation abilities, and are able to assess the position, size, distance, shape, and material of objects using reflected sound waves. After training, normally sighted people are also able to use echolocation to perceive objects, and can develop abilities comparable to, but typically somewhat poorer than, those of blind people. The underlying cues and mechanisms, operable range, spatial acuity, and neurological underpinnings of echolocation are described. Echolocation can result in functional real-life benefits. It is possible that these benefits can be optimized via suitable training, especially among those with recently acquired blindness, but this requires further study. Areas for further research are identified.
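The physical basis of echolocation described above can be made concrete with a minimal sketch (not from the review itself): an emitted click returns from an object after a round-trip delay proportional to twice the object's distance, so delay is the primary distance cue.

```python
# Illustrative sketch: round-trip echo delay as a distance cue.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed value)

def echo_delay_s(distance_m: float) -> float:
    """Round-trip delay of an echo from an object at the given distance."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def distance_from_delay(delay_s: float) -> float:
    """Invert the relation: object distance implied by a measured echo delay."""
    return SPEED_OF_SOUND * delay_s / 2.0

# An object 2 m away returns an echo after about 11.7 ms.
print(round(echo_delay_s(2.0) * 1000, 1))  # → 11.7
```

The two functions are exact inverses, which is why delay discrimination translates directly into distance discrimination in echolocation studies.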
Experimental Brain Research | 2013
Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan
Totally blind listeners often demonstrate better-than-normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance than sighted age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use the two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.
Attention Perception & Psychophysics | 2016
Andrew J. Kolarik; Brian C. J. Moore; Pavel Zahorik; Silvia Cirstea; Shahina Pardhan
Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.
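The sound-level cue named above as a primary distance cue follows the inverse-square law for a point source in the free field: each doubling of distance reduces the received level by about 6 dB. A minimal sketch of this relation (illustrative, not taken from the review):

```python
import math

def level_change_db(d_from_m: float, d_to_m: float) -> float:
    """Level change (dB) when a point source moves from d_from_m to d_to_m
    in the free field: 20 * log10(d_from / d_to). Negative = quieter."""
    return 20.0 * math.log10(d_from_m / d_to_m)

# Doubling the distance costs about 6 dB.
print(round(level_change_db(1.0, 2.0), 2))  # → -6.02
```

In reverberant rooms the drop with distance is shallower than this free-field value, which is one reason the reverberation-based cues discussed in these abstracts carry additional distance information.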
Experimental Brain Research | 2010
Andrew J. Kolarik; Thomas Hengist Margrain; Thomas Charles Augustus Freeman
Previous work on ocular-following emphasises the accuracy of tracking eye movements. However, a more complete understanding of oculomotor control should account for variable error as well. We identify two forms of precision: ‘shake’, occurring over shorter timescales; ‘drift’, occurring over longer timescales. We show how these can be computed across a series of eye movements (e.g. a sequence of slow-phases or collection of pursuit trials) and then measure accuracy and precision for younger and older observers executing different types of eye movement. Overall, we found older observers were less accurate over a range of stimulus speeds and less precise at faster eye speeds. Accuracy declined more steeply for reflexive eye movements and shake was independent of speed. In all other instances, the two measures of precision expanded non-linearly with mean eye speed. We also found that shake during fixation was similar to shake for reflexive eye movement. The results suggest that deliberate and reflexive eye movement do not share a common non-linearity or a common noise source. The relationship of our data to previous studies is discussed, as are the consequences of imprecise eye movement for models of oculomotor control and perception during eye movement.
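The accuracy/precision decomposition described above can be sketched numerically. This is an illustrative reading of the abstract, not the authors' exact procedure: treating eye speed as a trials-by-samples array, "drift" is taken as variability of per-trial mean speeds (longer timescale) and "shake" as within-trial variability about each trial's mean (shorter timescale).

```python
import numpy as np

def accuracy_shake_drift(eye_speeds: np.ndarray):
    """eye_speeds: 2-D array (trials x samples) of eye-speed estimates.

    Returns (accuracy, shake, drift). Definitions here are assumptions:
    - accuracy: grand mean of per-trial mean speeds
    - drift:    SD of per-trial means (variability across trials)
    - shake:    SD of samples about their own trial mean (within-trial)
    """
    trial_means = eye_speeds.mean(axis=1)
    accuracy = trial_means.mean()
    drift = trial_means.std(ddof=1)
    shake = (eye_speeds - trial_means[:, None]).std(ddof=1)
    return accuracy, shake, drift
```

Separating the two timescales this way is what lets shake and drift scale differently with mean eye speed, as reported in the abstract.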
Hearing Research | 2009
Andrew J. Kolarik; John Francis Culling
Binaural temporal resolution was measured using the discrimination of brief interaural time delays (ITDs). In experiment 1, three listeners performed a 2I-2AFC, ITD-discrimination procedure. ITD changes of 8 to 1024 μs were applied to brief probe noises. These probes, with durations of 16 to 362 ms, were placed symmetrically in time within a 500-ms burst of otherwise interaurally uncorrelated noise. Psychometric functions were measured to obtain thresholds and temporal windows fitted to those thresholds. The best-fitting window was a symmetric roex shape (equivalent rectangular duration = 197 ms), an order of magnitude longer than monaural temporal windows, and differed substantially from windows reported by Bernstein et al. [Bernstein, L.R., Trahiotis, C., Akeroyd, M.A., Hartung, K., 2001. Sensitivity to brief changes of interaural time and interaural intensity. J. Acoust. Soc. Am. 109, 1604-1615]. Experiment 2 replicated their main experiment, comparing their ITD-detection task with a similar discrimination procedure. Thresholds in the detection conditions were significantly better than those in the discrimination condition, particularly for short probe durations, indicating the use of an additional cue at these durations for the detection task and thus undermining the assumptions made in their window fit.
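The abstract characterizes the fitted window by its equivalent rectangular duration (ERD): the duration of a rectangular window with the same area and peak height. A minimal sketch of computing an ERD numerically, assuming the common symmetric roex form W(t) = (1 + p|t|) exp(-p|t|); the paper's exact parameterization may differ.

```python
import numpy as np

def roex_window(t: np.ndarray, p: float) -> np.ndarray:
    """Symmetric roex temporal window, W(t) = (1 + p|t|) exp(-p|t|)."""
    at = np.abs(t)
    return (1.0 + p * at) * np.exp(-p * at)

def equivalent_rectangular_duration(p: float, t_max: float = 2.0,
                                    n: int = 200001) -> float:
    """ERD = (area under window) / (peak value). Peak W(0) = 1 here,
    so the ERD is just the area; analytically it equals 4/p."""
    t = np.linspace(-t_max, t_max, n)
    w = roex_window(t, p)
    return float(np.sum(w) * (t[1] - t[0]))

# Choose p so the ERD matches the 197 ms reported in the abstract.
p = 4.0 / 0.197
print(round(equivalent_rectangular_duration(p), 3))  # ≈ 0.197 s
```

Framing windows by ERD is what allows the direct comparison in the abstract between the ~197-ms binaural window and the much shorter monaural windows.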
Journal of the Acoustical Society of America | 2013
Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan
The study investigated how listeners used level and direct-to-reverberant ratio (D/R) cues to discriminate distances to virtual sound sources. Sentence pairs were presented at virtual distances in simulated rooms that were either reverberant or anechoic. Performance on the basis of level was generally better than performance based on D/R. Increasing room reverberation time improved performance based on the D/R cue such that the two cues provided equally effective information at further virtual source distances in highly reverberant environments. Orientation of the listener within the virtual room did not affect performance.
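The direct-to-reverberant ratio (D/R) cue used in this study can be illustrated with a minimal sketch: given a room impulse response, energy within a short window after the strongest peak is counted as direct sound and the remainder as reverberant. The 2.5-ms split point is a common heuristic, not a value taken from the paper.

```python
import numpy as np

def direct_to_reverberant_db(ir: np.ndarray, fs: float,
                             direct_ms: float = 2.5) -> float:
    """Direct-to-reverberant ratio (dB) from a room impulse response.

    Energy up to `direct_ms` after the strongest peak counts as direct;
    everything later counts as reverberant. The split point is an
    assumed heuristic value.
    """
    peak = int(np.argmax(np.abs(ir)))
    split = peak + int(direct_ms * 1e-3 * fs)
    direct_energy = np.sum(ir[:split] ** 2)
    reverb_energy = np.sum(ir[split:] ** 2)
    return 10.0 * np.log10(direct_energy / reverb_energy)
```

Because direct energy falls with source distance while reverberant energy stays roughly constant in a given room, D/R decreases with distance, which is what makes it usable as the distance cue compared against level in this study.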
Journal of the Acoustical Society of America | 2013
Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan; Brian C. J. Moore
Auditory distance perception is a crucial component of blind listeners’ spatial awareness. Many studies have reported supra-normal spatial auditory abilities among blind individuals, such as enhanced azimuthal localization [Voss et al. (2004)] and distance discrimination [Kolarik et al. (in press)]. However, it is not known whether blind listeners are better able to use acoustic information to enhance judgments of distance to single sound sources, or whether lack of visual spatial cues prevents calibration of auditory distance information, leading to worse performance than for sighted listeners. Blind and sighted listeners were presented with single, stationary virtual sound sources between 1.22 and 13.79 m away in a virtual anechoic environment simulated using an image-source model. Stimuli were spoken sentences. Sighted listeners systematically underestimated distance to remote virtual sources, while blind listeners overestimated the distance to nearby virtual sources and underestimated it for remote virtual sources. The findings suggest that blind listeners are less accurate at judging absolute distance, and experience a compression of the auditory world, relative to sighted listeners. The results support a perceptual deficiency hypothesis for absolute distance judgments, suggesting that compensatory processes for audition do not develop among blind listeners when estimating the distance to single, stationary sound sources.
Journal of the Acoustical Society of America | 2011
Andrew J. Kolarik; Silvia Cirstea; Shahina Pardhan
The study investigated how level and reverberation cues contribute to distance discrimination, and how accuracy is affected by reverberation cue strength. Sentence pairs were presented at distances between 1 and 8 m in a virtual room simulated using an image-source model and two reverberation settings (lower and higher). Listeners performed discrimination judgments in three conditions: level cue only (Level-Only), reverberation only (Equalized), and both cues available (Normal). Percentage correct judgment of which sentence was closer was measured. Optimal distance discrimination was obtained in the Normal condition. Perception of the difference in distance between sentences had a lower threshold (i.e., performance was significantly better, p<0.05) for closer than further targets in the Normal and Level-Only conditions. In contrast, in the Equalized condition, these thresholds were lower for further than closer targets. Thresholds were lower at higher reverberation in the Equalized condition, and for furt...
Experimental Brain Research | 2017
Andrew J. Kolarik; Shahina Pardhan; Silvia Cirstea; Brian C. J. Moore
Compared to sighted listeners, blind listeners often display enhanced auditory spatial abilities such as localization in azimuth. However, less is known about whether blind humans can accurately judge distance in extrapersonal space using auditory cues alone. Using virtualization techniques, we show that auditory spatial representations of the world beyond the peripersonal space of blind listeners are compressed compared to those for normally sighted controls. Blind participants overestimated the distance to nearby sources and underestimated the distance to remote sound sources, in both reverberant and anechoic environments, and for speech, music, and noise signals. Functions relating judged and actual virtual distance were well fitted by compressive power functions, indicating that the absence of visual information regarding the distance of sound sources may prevent accurate calibration of the distance information provided by auditory signals.
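The compressive power functions mentioned above (judged distance = k * actual^a, with exponent a < 1 indicating compression) are typically fitted by linear regression in log-log coordinates. A minimal sketch with hypothetical data, not the study's measurements:

```python
import numpy as np

def fit_power_function(actual: np.ndarray, judged: np.ndarray):
    """Fit judged = k * actual**a by least squares in log-log space.
    Returns (k, a); a < 1 indicates compression of perceived distance."""
    a, log_k = np.polyfit(np.log(actual), np.log(judged), 1)
    return float(np.exp(log_k)), float(a)

# Hypothetical compressive responses: judged = 1.5 * actual**0.6
actual = np.array([1.0, 2.0, 4.0, 8.0, 13.0])
judged = 1.5 * actual ** 0.6
k, a = fit_power_function(actual, judged)
print(round(k, 3), round(a, 3))  # → 1.5 0.6
```

With an exponent below 1, nearby sources are overestimated and remote sources underestimated, exactly the pattern of compression reported for the blind participants.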
PLOS ONE | 2016
Andrew J. Kolarik; Amy Scarfe; Brian C. J. Moore; Shahina Pardhan
Accurate motor control is required when walking around obstacles in order to avoid collisions. When vision is unavailable, sensory substitution can be used to improve locomotion through the environment. Tactile sensory substitution devices (SSDs) are electronic travel aids, some of which indicate the distance of an obstacle using the rate of vibration of a transducer on the skin. We investigated how accurately such an SSD guided navigation in an obstacle circumvention task. Using an SSD, 12 blindfolded participants navigated around a single flat 0.6 × 2 m obstacle. A 3-dimensional Vicon motion capture system was used to quantify various kinematic indices of human movement. Navigation performance under full vision was used as a baseline for comparison. The obstacle position was varied from trial to trial relative to the participant, being placed at two distances, 25 cm to the left, to the right, or directly ahead. Under SSD guidance, participants navigated without collision in 93% of trials. No collisions occurred under visual guidance. Buffer space (clearance between the obstacle and shoulder) was larger by a factor of 2.1 with SSD guidance than with visual guidance, movement times were longer by a factor of 9.4, and numbers of velocity corrections were larger by a factor of 5 (all p<0.05). Participants passed the obstacle on the side affording the most space in the majority of trials for both SSD and visual guidance conditions. The results are consistent with the idea that SSD information can be used to generate a protective envelope during locomotion in order to avoid collisions when navigating around obstacles, passing on the side that affords the most space.
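The distance-to-vibration-rate coding described for the SSD can be sketched as a simple monotonic mapping: closer obstacles produce faster vibration. All parameter values below (rate limits, maximum range) are illustrative assumptions, not the specifications of the device used in the study.

```python
def vibration_rate_hz(distance_m: float, max_rate: float = 25.0,
                      min_rate: float = 1.0, max_range_m: float = 3.0) -> float:
    """Map obstacle distance to a tactile vibration rate (Hz).

    Linear coding: max_rate at contact, min_rate at or beyond max_range_m.
    Rate limits and range are hypothetical example values.
    """
    d = min(max(distance_m, 0.0), max_range_m)
    frac = 1.0 - d / max_range_m  # 1 at contact, 0 at maximum range
    return min_rate + frac * (max_rate - min_rate)
```

A mapping like this gives the user a continuously updated proximity signal, which is what allows the "protective envelope" behavior (larger buffer space, more velocity corrections) observed under SSD guidance.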