Publications


Featured research published by Jess Rowland.


The Journal of Neuroscience | 2010

Early Suppressive Mechanisms and the Negative Blood Oxygenation Level-Dependent Response in Human Visual Cortex

Alex R. Wade; Jess Rowland

Functional magnetic resonance imaging (fMRI) studies of early sensory cortex often measure stimulus-driven increases in the blood oxygenation level-dependent (BOLD) signal. However, these positive responses are frequently accompanied by reductions in the BOLD signal in adjacent regions of cortex. Although this negative BOLD response (NBR) is thought to result from neuronal suppression, the precise relationship between local activity, suppression, and perception remains unknown. By measuring BOLD signals in human primary visual cortex while varying the baseline contrast levels in the region affected by the NBR, we tested three physiologically plausible computational models of neuronal modulation that could explain this phenomenon: a subtractive model, a response gain model, and a contrast gain model. We also measured the ability of isoluminant contrast to generate an NBR. We show that the NBR can be modeled as a pathway-specific contrast gain modulation that is strongest outside the fovea. We found a similar spatial bias in a psychophysical study using identical stimuli, although these data indicated a response gain rather than a contrast gain mechanism. We reconcile these findings by proposing (1) that the NBR is associated with a long-range suppressive mechanism that hyperpolarizes a subset of magnocellularly driven neurons at the input to V1, (2) that this suppression is broadly tuned to match the spatial features of the mask region, and (3) that increasing the baseline contrast in the suppressed region drives all neurons in the input layer, reducing the relative contribution of the suppressing subpopulation in the fMRI signal.


The Journal of Neuroscience | 2006

An Oculomotor Decision Process Revealed by Functional Magnetic Resonance Imaging

Stephen Heinen; Jess Rowland; Byeong-Taek Lee; Alex R. Wade

It is not known how the brain decides to act on moving objects. We demonstrated previously that neurons in the macaque supplementary eye field (SEF) reflect the rule of ocular baseball, a go/nogo task in which eye movements signal the rule-guided interpretation of the trajectory of a target. In ocular baseball, subjects must decide whether to pursue a moving spot target with an eye movement after discriminating whether the target will cross a distal, visible line segment. Here we identify cortical regions active during the ocular baseball task using event-related human functional magnetic resonance imaging (fMRI) and concurrent eye-movement monitoring. Task-related activity was observed in the SEF, the frontal eye field (FEF), the superior parietal lobule (SPL), and the right ventrolateral prefrontal cortex (VLPFC). The SPL and right VLPFC showed heightened activity only during ocular baseball, despite identical stimuli and oculomotor demands in the control task, implicating these areas in the decision process. Furthermore, the right VLPFC but not the SPL showed the greatest activation during the nogo decision trials. This suggests both a functional dissociation between these areas and a role for the right VLPFC in rule-guided inhibition of behavior. In the SEF and FEF, activity was similar for ocular baseball and a control eye-movement task. We propose that, although the SEF reflects the ocular baseball rule, both areas in humans are functionally closer to motor processing than the SPL and the right VLPFC. By recording population activity with fMRI during the ocular baseball task, we have revealed the cortical substrate of an oculomotor decision process.


PLOS Biology | 2017

Concurrent temporal channels for auditory processing: Oscillatory neural entrainment reveals segregation of function at different scales

Xiangbin Teng; Xing Tian; Jess Rowland; David Poeppel

Natural sounds convey perceptually relevant information over multiple timescales, and the necessary extraction of multi-timescale information requires the auditory system to work over distinct ranges. The simplest hypothesis suggests that temporal modulations are encoded in an equivalent manner within a reasonable intermediate range. We show that the human auditory system selectively and preferentially tracks acoustic dynamics concurrently at 2 timescales corresponding to the neurophysiological theta band (4–7 Hz) and gamma band ranges (31–45 Hz) but, contrary to expectation, not at the timescale corresponding to alpha (8–12 Hz), which has also been found to be related to auditory perception. Listeners heard synthetic acoustic stimuli with temporally modulated structures at 3 timescales (approximately 190-, approximately 100-, and approximately 30-ms modulation periods) and identified the stimuli while undergoing magnetoencephalography recording. There was strong intertrial phase coherence in the theta band for stimuli of all modulation rates and in the gamma band for stimuli with corresponding modulation rates. The alpha band did not respond in a similar manner. Classification analyses also revealed that oscillatory phase reliably tracked temporal dynamics but not equivalently across rates. Finally, mutual information analyses quantifying the relation between phase and cochlear-scaled correlations also showed preferential processing in 2 distinct regimes, with the alpha range again yielding different patterns. The results support the hypothesis that the human auditory system employs (at least) a 2-timescale processing mode, in which lower and higher perceptual sampling scales are segregated by an intermediate temporal regime in the alpha band that likely reflects different underlying computations.


The Journal of Neuroscience | 2011

Attentional modulation of fMRI responses in human V1 is consistent with distinct spatial maps for chromatically defined orientation and contrast

Joo-Hyun Song; Jess Rowland; Robert M. McPeek; Alex R. Wade

Attending to different stimulus features such as contrast or orientation can change the pattern of neural responses in human V1 measured with fMRI. We show that these pattern changes are much more distinct for colored stimuli than for achromatic stimuli. This is evidence for a classic model of V1 functional architecture in which chromatic contrast and orientation are coded in spatially distinct neural domains, while achromatic contrast and orientation are not.


Psychonomic Bulletin & Review | 2018

There is music in repetition: Looped segments of speech and nonspeech induce the perception of music in a time-dependent manner

Jess Rowland; Anna Kasdan; David Poeppel

While many techniques are known to music creators, the technique of repetition is one of the most commonly deployed. The mechanism by which repetition is effective as a music-making tool, however, is unknown. Building on the speech-to-song illusion (Deutsch, Henthorn, & Lapidis in Journal of the Acoustical Society of America, 129(4), 2245–2252, 2011), we explore a phenomenon in which the perception of musical attributes is elicited from repeated, or “looped,” auditory material usually perceived as nonmusical, such as speech and environmental sounds. We assessed whether this effect holds true for speech stimuli of different lengths; nonspeech sounds (water dripping); and speech signals decomposed into their rhythmic and spectral components. Participants listened to looped stimuli (from 700 to 4,000 ms) and provided continuous as well as discrete perceptual ratings. We show that the regularizing effect of repetition generalizes to nonspeech auditory material and is strongest for shorter clip lengths in the speech and environmental cases. We also find that deconstructed pitch and rhythmic speech components independently elicit a regularizing effect, though the effect across segment duration differs from that for intact speech and environmental sounds. Taken together, these experiments suggest repetition may invoke active internal mechanisms that bias perception toward musical structure.


Journal of Experimental Psychology: General | 2018

Rapid timing of musical aesthetic judgments.

Amy M. Belfi; Anna Kasdan; Jess Rowland; Edward A. Vessel; G. Gabrielle Starr; David Poeppel

In recent years, psychological models of perception have undergone reevaluation due to a broadening of focus toward understanding not only how observers perceive stimuli but also how they subjectively evaluate stimuli. Here, we investigated the time course of such aesthetic evaluations using a gating paradigm. In a series of experiments, participants heard excerpts of classical, jazz, and electronica music. Excerpts were of different durations (250 ms, 500 ms, 750 ms, 1,000 ms, 2,000 ms, 10,000 ms) or note values (eighth note, quarter note, half note, dotted-half note, whole note, and entire 10,000 ms excerpt). After each excerpt, participants rated how much they liked the excerpt on a 9-point Likert scale. In Experiment 1, listeners made accurate aesthetic judgments within 750 ms for classical and jazz pieces, while electronic pieces were judged within 500 ms. When translated into note values (Experiment 2), electronica and jazz clips were judged more quickly than classical. In Experiment 3, we manipulated the familiarity of the musical excerpts. Unfamiliar clips were judged more quickly (500 ms) than familiar clips (750 ms), but there was overall higher accuracy for familiar pieces. Finally, we investigated listeners’ aesthetic judgments continuously over the time course of more naturalistic (60 s) excerpts: Within 3 s, listeners’ judgments differed between most- and least-liked pieces. We suggest that such rapid aesthetic judgments represent initial gut-level decisions that are made quickly, but that even these initial judgments are influenced by characteristics such as genre and familiarity.


Current Biology | 2017

Brain-to-Brain Synchrony Tracks Real-World Dynamic Group Interactions in the Classroom

Suzanne Dikker; Lu Wan; Ido Davidesco; Lisa Kaggen; Matthias Oostrik; James McClintock; Jess Rowland; G. Michalareas; Jay J. Van Bavel; Mingzhou Ding; David Poeppel


Journal of Cognitive Neuroscience | 2018

Brain-to-Brain Synchrony and Learning Outcomes Vary by Student–Teacher Dynamics: Evidence from a Real-world Classroom Electroencephalography Study

Dana Bevilacqua; Ido Davidesco; Lu Wan; Matthias Oostrik; Kim Chaloner; Jess Rowland; Mingzhou Ding; David Poeppel; Suzanne Dikker


Attention Perception & Psychophysics | 2015

Decoding time for the identification of musical key

Morwaread Farbood; Jess Rowland; Gary F. Marcus; Oded Ghitza; David Poeppel


Archive | 2012

How fast can music and speech be perceived? Key identification in time-compressed music with periodic insertions of silence

Morwaread Farbood; Oded Ghitza; Jess Rowland; Gary F. Marcus; David Poeppel

Collaboration


Jess Rowland's top co-authors.

Lu Wan

University of Florida
