
Publication


Featured research published by Aaron R. Nidiffer.


Brain Topography | 2014

Identifying and Quantifying Multisensory Integration: A Tutorial Review

Ryan A. Stevenson; Dipanwita Ghose; Juliane Krueger Fister; Diana K. Sarko; Nicholas Altieri; Aaron R. Nidiffer; LeAnne R. Kurela; Justin K. Siemann; Thomas W. James; Mark T. Wallace

We process information from the world through multiple senses, and the brain must decide what information belongs together and what information should be segregated. One challenge in studying such multisensory integration is how to quantify the multisensory interactions, a challenge that is amplified by the host of methods that are now used to measure neural, behavioral, and perceptual responses. Many of the measures that have been developed to quantify multisensory integration (and which have been derived from single-unit analyses) have been applied to these different measures without much consideration for the nature of the process being studied. Here, we provide a review focused on the means with which experimenters quantify multisensory processes and integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we discuss the different metrics commonly used to quantify multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. Also discussed are possible alternatives to the most commonly used metrics.
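
As a rough illustration of the kind of single-unit-derived metric the review surveys, the short Python sketch below (an illustrative example with invented numbers, not code from the paper) computes the classical multisensory enhancement index, which expresses the multisensory response as a percent gain over the strongest unisensory response:

# Illustrative sketch only: multisensory enhancement index computed from
# mean responses; the example values are invented.
def multisensory_enhancement(resp_av, resp_a, resp_v):
    """Percent gain of the multisensory response over the best unisensory response."""
    best_unisensory = max(resp_a, resp_v)
    return 100.0 * (resp_av - best_unisensory) / best_unisensory

# Example with arbitrary mean spike counts per condition
print(multisensory_enhancement(resp_av=18.0, resp_a=10.0, resp_v=7.0))  # -> 80.0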


Experimental Brain Research | 2012

Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance

Ryan A. Stevenson; Juliane Krueger Fister; Zachary P. Barnett; Aaron R. Nidiffer; Mark T. Wallace

In natural environments, human sensory systems work in a coordinated and integrated manner to perceive and respond to external events. Previous research has shown that the spatial and temporal relationships of sensory signals are paramount in determining how information is integrated across sensory modalities, but in ecologically plausible settings, these factors are not independent. In the current study, we provide a novel exploration of the impact on behavioral performance for systematic manipulations of the spatial location and temporal synchrony of a visual-auditory stimulus pair. Simple auditory and visual stimuli were presented across a range of spatial locations and stimulus onset asynchronies (SOAs), and participants performed both a spatial localization and simultaneity judgment task. Response times in localizing paired visual-auditory stimuli were slower in the periphery and at larger SOAs, but most importantly, an interaction was found between the two factors, in which the effect of SOA was greater in peripheral as opposed to central locations. Simultaneity judgments also revealed a novel interaction between space and time: individuals were more likely to judge stimuli as synchronous when occurring in the periphery at large SOAs. The results of this study provide novel insights into (a) how the speed of spatial localization of an audiovisual stimulus is affected by location and temporal coincidence and the interaction between these two factors and (b) how the location of a multisensory stimulus impacts judgments concerning the temporal relationship of the paired stimuli. These findings provide strong evidence for a complex interdependency between spatial location and temporal structure in determining the ultimate behavioral and perceptual outcome associated with a paired multisensory (i.e., visual-auditory) stimulus.


Neuropsychologia | 2016

Interactions between space and effectiveness in human multisensory performance.

Aaron R. Nidiffer; Ryan A. Stevenson; Juliane Krueger Fister; Zachary P. Barnett; Mark T. Wallace

Several stimulus factors are important in multisensory integration, including the spatial and temporal relationships of the paired stimuli as well as their effectiveness. Changes in these factors have been shown to dramatically change the nature and magnitude of multisensory interactions. Typically, these factors are considered in isolation, although there is a growing appreciation for the fact that they are likely to be strongly interrelated. Here, we examined interactions between two of these factors - spatial location and effectiveness - in dictating performance in the localization of an audiovisual target. A psychophysical experiment was conducted in which participants reported the perceived location of visual flashes and auditory noise bursts presented alone and in combination. Stimuli were presented at four spatial locations relative to fixation (0°, 30°, 60°, 90°) and at two intensity levels (high, low). Multisensory combinations were always spatially coincident and of the matching intensity (high-high or low-low). In responding to visual stimuli alone, localization accuracy decreased and response times (RTs) increased as stimuli were presented at more eccentric locations. In responding to auditory stimuli, performance was poorest at the 30° and 60° locations. For both visual and auditory stimuli, accuracy was greater and RTs were faster for more intense stimuli. For responses to visual-auditory stimulus combinations, performance enhancements were found at locations in which the unisensory performance was lowest, results concordant with the concept of inverse effectiveness. RTs for these multisensory presentations frequently violated race-model predictions, implying integration of these inputs, and a significant location-by-intensity interaction was observed. Performance gains under multisensory conditions were larger as stimuli were positioned at more peripheral locations, and this increase was most pronounced for the low-intensity conditions. These results provide strong support that the effects of stimulus location and effectiveness on multisensory integration are interdependent, with both contributing to the overall effectiveness of the stimuli in driving the resultant multisensory response.
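
The race-model comparison mentioned in this abstract can be sketched in Python as follows (an illustration using simulated data, not the authors' analysis code). Miller's race-model inequality bounds the cumulative distribution of multisensory response times by the sum of the unisensory distributions, P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t); where the observed audiovisual distribution exceeds this bound, the inequality is violated and a race between independent unisensory processes cannot account for the speed-up.

import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, quantiles=np.linspace(0.05, 0.95, 19)):
    """Check Miller's race-model inequality at a set of RT quantiles.

    rt_av, rt_a, rt_v: 1-D arrays of response times (ms) from the audiovisual,
    auditory-only, and visual-only conditions. Returns a boolean array that is
    True at quantiles where the audiovisual CDF exceeds the race-model bound.
    """
    t = np.quantile(rt_av, quantiles)                 # evaluation points
    cdf = lambda rts: np.mean(rts[:, None] <= t, axis=0)
    bound = np.minimum(cdf(rt_a) + cdf(rt_v), 1.0)    # race-model upper bound
    return cdf(rt_av) > bound

# Purely illustrative simulated RTs
rng = np.random.default_rng(0)
rt_a = rng.normal(420, 60, 200)
rt_v = rng.normal(440, 60, 200)
rt_av = rng.normal(360, 50, 200)
print(race_model_violation(rt_av, rt_a, rt_v))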


I-perception | 2011

Interaction between Space and Effectiveness in Multisensory Integration: Behavioral and Perceptual Measures

Aaron R. Nidiffer; Ryan A. Stevenson; Juliane Krueger-Fister; Zachary P. Barnett; Mark T. Wallace

Previous research has described several core principles of multisensory integration. These include the spatial principle, which relates the integrative product to the physical location of the stimuli, and the principle of inverse effectiveness, in which minimally effective stimuli elicit the greatest multisensory gains when combined. In the vast majority of prior studies, these principles have been studied in isolation, with little attention to their interrelationships and possible interactions. Recent neurophysiological studies in our laboratory have begun to examine these interactions within individual neurons in animal models, work that we extend here into the realm of human performance and perception. To test this, we conducted a psychophysical experiment in which 51 participants were tasked with judging the location of a target stimulus. Target stimuli were visual flashes and auditory noise bursts presented either alone or in combination at four locations and at two intensities. Multisensory combinations were always spatially coincident. A significant effect was found for response times and a marginal effect was found for accuracy, such that the degree of multisensory gain changed as a function of the interaction between space and effectiveness. These results provide further evidence for a strong interrelationship between the multisensory principles in dictating human performance.
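
For behavioral data of this kind, multisensory gain is often expressed relative to the best unisensory condition. The Python sketch below (an assumed formulation with invented values, not taken from the paper) shows the response-time version, under which inverse effectiveness predicts larger gains when the unisensory conditions are weaker, i.e. slower:

# Illustrative sketch: behavioral multisensory gain for response times,
# relative to the faster unisensory condition (values invented).
def rt_gain(mean_rt_av, mean_rt_a, mean_rt_v):
    """Percent speed-up of the multisensory RT over the best unisensory RT."""
    best_unisensory = min(mean_rt_a, mean_rt_v)
    return 100.0 * (best_unisensory - mean_rt_av) / best_unisensory

print(rt_gain(mean_rt_av=500.0, mean_rt_a=560.0, mean_rt_v=590.0))  # low intensity: ~10.7
print(rt_gain(mean_rt_av=380.0, mean_rt_a=400.0, mean_rt_v=420.0))  # high intensity: 5.0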


Neuropsychologia | 2016

Stimulus intensity modulates multisensory temporal processing.

Juliane Krueger Fister; Ryan A. Stevenson; Aaron R. Nidiffer; Zachary P. Barnett; Mark T. Wallace


Archive | 2011

Spatial and Temporal Features of Multisensory Processes: Bridging between Animal and Human Studies

Diana K. Sarko; Aaron R. Nidiffer; Albert R. Powers; Dipanwita Ghose; Andrea Hillock-Dunn; Matthew C. Fister; Juliane Krueger; Mark T. Wallace


Archive | 2012

Spatial and Temporal Features of Multisensory Processes

Diana K. Sarko; Aaron R. Nidiffer; Albert R. Powers; Dipanwita Ghose; Andrea Hillock-Dunn; Matthew C. Fister; Juliane Krueger; Mark T. Wallace


Archive | 2015

Superior Colliculus Neurons Use Distinct Operational Modes in the Integration of Multisensory Stimuli

Thomas J. Perrault; J. William Vaughan; Barry E. Stein; Dipanwita Ghose; Alexander Maier; Aaron R. Nidiffer; Mark T. Wallace; Liping Yu; Jinghong Xu; Benjamin A. Rowland; Jerome Carriot; Mohsen Jamali; Jessica X. Brooks; Kathleen E. Cullen


Archive | 2015

Different Neural Frequency Bands Integrate Faces and Voices Differently in the Superior Temporal Sulcus

Asif A. Ghazanfar; Chandramouli Chandrasekaran; Luis Lemus; Dipanwita Ghose; Alexander Maier; Aaron R. Nidiffer; Mark T. Wallace; C Perrodin; Christoph Kayser; Nk Logothetis; Christopher I. Petkov; Jaewon Hwang; Lizabeth M. Romanski


Archive | 2012

Development of multisensory integration in subcortical and cortical brain networks

Mark T. Wallace; Dipanwita Ghose; Aaron R. Nidiffer; Matthew C. Fister; Juliane Krueger Fister

Collaboration


Dive into Aaron R. Nidiffer's collaborations.

Top Co-Authors

Juliane Krueger Fister (Vanderbilt University Medical Center)