Lynnette Leone
North Dakota State University
Publication
Featured research published by Lynnette Leone.
I-perception | 2013
Lynnette Leone; Mark E. McCourt
A series of experiments measured the range of audiovisual stimulus onset asynchrony (SOAAV) yielding facilitative multisensory integration. We evaluated (1) the range of SOAAV over which facilitation occurred when unisensory stimuli were weak; (2) whether the range of SOAAV producing facilitation supported the hypothesis that physiological simultaneity of unisensory activity governs multisensory facilitation; and (3) whether AV multisensory facilitation depended on relative stimulus intensity. We compared response-time distributions to unisensory auditory (A) and visual (V) stimuli with those to AV stimuli over a wide range (±300 ms, in 20 ms increments) of SOAAV, across four conditions of varying stimulus intensity. In condition 1, the intensity of unisensory stimuli was adjusted such that d′ ≈ 2. In condition 2, V stimulus intensity was increased (d′ > 4), while A stimulus intensity was as in condition 1. In condition 3, A stimulus intensity was increased (d′ > 4) while V stimulus intensity was as in condition 1. In condition 4, both A and V stimulus intensities were increased to clearly suprathreshold levels (d′ > 4). Across all conditions of stimulus intensity, significant multisensory facilitation occurred exclusively for simultaneously presented A and V stimuli. In addition, facilitation increased as stimulus intensity increased, in disagreement with inverse effectiveness. These results indicate that the requirements for facilitative multisensory integration include both physical and physiological simultaneity.
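The d′ criterion in this abstract comes from signal detection theory: d′ is the difference of the z-transformed hit and false-alarm rates. A minimal illustration (not the authors' analysis code; the hit and false-alarm rates below are hypothetical):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# A weak stimulus detected on 84% of signal trials with a 16% false-alarm
# rate sits near the d' ~ 2 criterion used for condition 1.
print(round(d_prime(0.84, 0.16), 2))
```

Raising detectability toward d′ > 4 (e.g. 98% hits, 2% false alarms) corresponds to the clearly suprathreshold stimuli of conditions 2–4.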
European Journal of Neuroscience | 2015
Lynnette Leone; Mark E. McCourt
The ‘temporal rule’ of multisensory integration (MI) proposes that unisensory stimuli, and the neuronal responses they evoke, must fall within a window of integration. Ecological validity demands that MI should occur only for physically simultaneous events (which may give rise to non‐simultaneous neural activations), and spurious neural response simultaneities unrelated to environmental multisensory occurrences must somehow be rejected. Two experiments investigated the requirements of simultaneity for facilitative MI. Experiment 1 employed a reaction time (RT)/race model paradigm to measure audiovisual (AV) MI as a function of AV stimulus‐onset asynchrony (SOA) under fully dark adapted conditions for visual stimuli that were either rod‐ or cone‐isolating. Auditory stimulus intensity was constant. Despite a 155‐ms delay in mean RT to the scotopic vs. photopic stimulus, facilitative AV MI in both conditions occurred exclusively at an AV SOA of 0 ms. Thus, facilitative MI demands both physical and physiological simultaneity. Experiment 2 investigated the accuracy of simultaneity and temporal order judgements under the same stimulus conditions. Judgements of AV stimulus simultaneity or temporal order were significantly influenced by stimulus intensity, indicating different simultaneity requirements for these tasks. The possibility was considered that there are mechanisms by which the nervous system may take account of variations in response latency arising from changes in stimulus intensity in order to selectively integrate only those physiological simultaneities that arise from physical simultaneities. It was proposed that separate subsystems for AV MI exist that pertain to action and perception.
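The RT/race-model paradigm mentioned here tests whether redundant-target (AV) responses are faster than a race between independent unisensory channels would allow (Miller's inequality). A minimal sketch, assuming simple RT samples per condition (the data and function name are illustrative, not the authors'):

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Empirical AV CDF minus the race-model bound min(F_A + F_V, 1);
    positive values indicate facilitation beyond statistical summation."""
    cdf = lambda rts, t: np.mean(np.asarray(rts)[:, None] <= t, axis=0)
    bound = np.minimum(cdf(rt_a, t_grid) + cdf(rt_v, t_grid), 1.0)
    return cdf(rt_av, t_grid) - bound
```

Violations, when present, typically appear in the fast tail of the RT distribution, so `t_grid` is usually scanned over the early quantiles.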
Laterality | 2010
Lynnette Leone; Mark E. McCourt
Acute alcohol challenge has been associated with a selective impairment of right hemisphere function. A hallmark of visuospatial neglect syndrome is that patients with right hemisphere lesions misbisect horizontal lines far rightward of veridical centre. Neurologically intact participants misbisect lines with a systematic leftward bias (pseudoneglect). Neuroimaging studies in neurologically intact participants reveal predominant right hemisphere activation during performance of line bisection tasks. The current study assessed whether acute alcohol challenge alters global visuospatial attention. Participants (N=18; 10 male; strongly right-handed; mean age 23 years) engaged in a forced-choice tachistoscopic line bisection task in both ethanol challenge (mean BAC=.077) and no ethanol control conditions. Mean leftward bisection error in the control condition was −0.238 degrees visual angle (1.05% line length), and leftward bisection error significantly increased (p=.001) under ethanol challenge (−0.333 degrees visual angle, 1.47% line length). Mean bisection precision in the control condition was 0.358 degrees visual angle (1.58% line length); bisection precision significantly deteriorated (p=.008) under ethanol challenge (0.489 degrees, 2.17% line length). Decreased bisection precision indicates that ethanol disrupts the fidelity of visuospatial performance. The exaggerated leftward bisection error implies that ethanol may exert a differential effect on left versus right hemispheric function with respect to the control of global visuospatial attention.
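Bisection error above is reported both in degrees of visual angle and as a percentage of line length, so the two units can be cross-checked; the implied line length (roughly 22.5° of visual angle) is my inference, not stated in the abstract:

```python
# (error in degrees, same error as % of line length) pairs from the abstract
pairs = [(0.238, 1.05), (0.333, 1.47), (0.358, 1.58), (0.489, 2.17)]
implied_lengths = [deg / (pct / 100.0) for deg, pct in pairs]
print([round(v, 1) for v in implied_lengths])  # implied line length, degrees
```

All four pairs imply a consistent line length near 22.5–22.7°, confirming the two sets of units agree.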
Vision Research | 2015
Mark E. McCourt; Lynnette Leone; Barbara Blakeslee
A variety of visual capacities show significant age-related alterations. We assessed suprathreshold contrast and brightness perception across the lifespan in a large sample of healthy participants (N=155 and 142 in Experiments 1 and 2, respectively) ranging in age from 16 to 80 years. Experiment 1 used a quadrature-phase motion cancelation technique (Blakeslee & McCourt, 2008) to measure canceling contrast (in central vision) for induced gratings at two temporal frequencies (1 Hz and 4 Hz) at two test field heights (0.5° or 2°×38.7°; 0.052 c/d). There was a significant age-related reduction in canceling contrast at 4 Hz, but no age-related change in induction magnitude at 1 Hz. We interpret the age-related decline in grating induction magnitude at 4 Hz to reflect a diminished capacity for inhibitory processing at higher temporal frequencies. In Experiment 2 participants adjusted the contrast of a matching grating (0.5° or 2°×38.7°; 0.052 c/d) to equal that of both real (30% contrast, 0.052 c/d) and induced (McCourt, 1982) standard gratings (100% inducing grating contrast; 0.052 c/d). Matching gratings appeared in the upper visual field (UVF) and test gratings appeared in the lower visual field (LVF), and vice versa, at eccentricities of ±7.5°. Average induction magnitude was invariant with age for both test field heights. There was a significant age-related reduction in perceived contrast of stimuli in the LVF versus UVF for both real and induced gratings.
Neuroreport | 2016
Mark E. McCourt; Lynnette Leone
We asked whether the perceived direction of visual motion and contrast thresholds for motion discrimination are influenced by the concurrent motion of an auditory sound source. Visual motion stimuli were counterphasing Gabor patches, whose net motion energy was manipulated by adjusting the contrast of the leftward-moving and rightward-moving components. The presentation of these visual stimuli was paired with the simultaneous presentation of auditory stimuli, whose apparent motion in 3D auditory space (rightward, leftward, static, no sound) was manipulated using interaural time and intensity differences, and Doppler cues. In experiment 1, observers judged whether the Gabor visual stimulus appeared to move rightward or leftward. In experiment 2, contrast discrimination thresholds for detecting the interval containing unequal (rightward or leftward) visual motion energy were obtained under the same auditory conditions. Experiment 1 showed that the perceived direction of ambiguous visual motion is powerfully influenced by concurrent auditory motion, such that auditory motion ‘captured’ ambiguous visual motion. Experiment 2 showed that this interaction occurs at a sensory stage of processing as visual contrast discrimination thresholds (a criterion-free measure of sensitivity) were significantly elevated when paired with congruent auditory motion. These results suggest that auditory and visual motion signals are integrated and combined into a supramodal (audiovisual) representation of motion.
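A counterphasing grating is the sum of two opposed drifting components, so net motion energy can be biased by making their contrasts unequal, as in the stimuli above. A minimal 1-D sketch (the Gaussian envelope that makes it a Gabor is omitted for brevity; parameter names are mine):

```python
import numpy as np

def counterphase_frame(x, t, sf, tf, c_right, c_left):
    """Sum of rightward- and leftward-drifting sinusoids; unequal
    contrasts (c_right vs c_left) bias the net motion energy."""
    right = c_right * np.sin(2 * np.pi * (sf * x - tf * t))
    left = c_left * np.sin(2 * np.pi * (sf * x + tf * t))
    return right + left
```

With `c_right == c_left` the sum reduces to a standing wave (pure counterphase flicker, zero net motion energy); any contrast imbalance yields a directional component of the kind whose perceived direction the auditory motion could capture.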
Journal of Vision | 2015
Lynnette Leone; Mark E. McCourt
Proceedings of SPIE | 2012
Lynnette Leone; Mark E. McCourt
Journal of Vision | 2016
Benjamin Stettler; Lynnette Leone; Mark E. McCourt
Journal of Vision | 2012
Lynnette Leone; Barbara Blakeslee; Mark E. McCourt
F1000Research | 2011
Lynnette Leone; Barbara Blakeslee; Mark E. McCourt