Publication


Featured research published by Scott A. Love.


PLOS ONE | 2013

A Psychophysical Investigation of Differences between Synchrony and Temporal Order Judgments.

Scott A. Love; Karin Petrini; Adam Cheng; Frank E. Pollick

Background: Synchrony judgments involve deciding whether cues to an event are in synch or out of synch, while temporal order judgments involve deciding which of the cues came first. When the cues come from different sensory modalities, these judgments can be used to investigate multisensory integration in the temporal domain. However, evidence indicates that these two tasks should not be used interchangeably, as it is unlikely that they measure the same perceptual mechanism. The current experiment further explores this issue across a variety of different audiovisual stimulus types.

Methodology/Principal Findings: Participants were presented with 5 audiovisual stimulus types, each at 11 parametrically manipulated levels of cue asynchrony. During separate blocks, participants had to make synchrony judgments or temporal order judgments. For some stimulus types many participants were unable to successfully make temporal order judgments, but they were able to make synchrony judgments. The mean points of subjective simultaneity for synchrony judgments were all video-leading, while those for temporal order judgments were all audio-leading. In the within-participants analyses no correlation was found across the two tasks for either the point of subjective simultaneity or the temporal integration window.

Conclusions: Stimulus type influenced how the two tasks differed; nevertheless, consistent differences were found between the two tasks regardless of stimulus type. Therefore, in line with previous work, we conclude that synchrony and temporal order judgments are supported by different perceptual mechanisms and should not be interpreted as being representative of the same perceptual process.


Seeing and Perceiving | 2011

Cerebral correlates and statistical criteria of cross-modal face and voice integration.

Scott A. Love; Frank E. Pollick; Marianne Latinus

Perception of faces and voices plays a prominent role in human social interaction, making multisensory integration of cross-modal speech a topic of great interest in cognitive neuroscience. How to define potential sites of multisensory integration using functional magnetic resonance imaging (fMRI) is currently under debate, with three statistical criteria frequently used (e.g., super-additive, max and mean criteria). In the present fMRI study, 20 participants were scanned in a block design under three stimulus conditions: dynamic unimodal face, unimodal voice and bimodal face-voice. Using this single dataset, we examine all these statistical criteria in an attempt to define loci of face-voice integration. While the super-additive and mean criteria essentially revealed regions in which one of the unimodal responses was a deactivation, the max criterion appeared stringent and only highlighted the left hippocampus as a potential site of face-voice integration. Psychophysiological interaction analysis showed that connectivity between occipital and temporal cortices increased during bimodal compared to unimodal conditions. We concluded that, when investigating multisensory integration with fMRI, all these criteria should be used in conjunction with manipulation of stimulus signal-to-noise ratio and/or cross-modal congruency.


Cognitive, Affective, & Behavioral Neuroscience | 2014

The role of kinematics in cortical regions for continuous human motion perception

Phil McAleer; Frank E. Pollick; Scott A. Love; Frances Crabbe; Jeffrey M. Zacks

It has been proposed that we make sense of the movements of others by observing fluctuations in the kinematic properties of their actions. At the neural level, activity in the human motion complex (hMT+) and posterior superior temporal sulcus (pSTS) has been implicated in this relationship. However, previous neuroimaging studies have largely utilized brief, diminished stimuli, and the role of relevant kinematic parameters for the processing of human action remains unclear. We addressed this issue by showing extended-duration natural displays of an actor engaged in two common activities, to 12 participants in an fMRI study under passive viewing conditions. Our region-of-interest analysis focused on three neural areas (hMT+, pSTS, and fusiform face area) and was accompanied by a whole-brain analysis. The kinematic properties of the actor, particularly the speed of body part motion and the distance between body parts, were related to activity in hMT+ and pSTS. Whole-brain exploratory analyses revealed additional areas in posterior cortex, frontal cortex, and the cerebellum whose activity was related to these features. These results indicate that the kinematic properties of people's movements are continually monitored during everyday activity as a step to determining actions and intent.


NeuroImage | 2016

The average baboon brain: MRI templates and tissue probability maps from 89 individuals

Scott A. Love; Damien Marie; Muriel Roth; Romain Lacoste; Bruno Nazarian; Alice Bertello; Olivier Coulon; Jean-Luc Anton; Adrien Meguerditchian

The baboon (Papio) brain is a remarkable model for investigating the brain. The current work aimed at creating a population-average baboon (Papio anubis) brain template and its left/right hemisphere symmetric version from a large sample of T1-weighted magnetic resonance images collected from 89 individuals. Averaging the prior probability maps output during the segmentation of each individual also produced the first baboon brain tissue probability maps for gray matter, white matter and cerebrospinal fluid. The templates and the tissue probability maps were created using state-of-the-art, freely available software tools and are being made freely and publicly available: http://www.nitrc.org/projects/haiko89/ or http://lpc.univ-amu.fr/spip.php?article589. It is hoped that these images will aid neuroimaging research of the baboon by, for example, providing a modern, high quality normalization target and accompanying standardized coordinate system as well as probabilistic priors that can be used during tissue segmentation.


COST'11 Proceedings of the 2011 International Conference on Cognitive Behavioural Systems | 2011

Effects of experience, training and expertise on multisensory perception: investigating the link between brain and behavior

Scott A. Love; Frank E. Pollick; Karin Petrini

The ability to successfully integrate information from different senses is of paramount importance for perceiving the world and has been shown to change with experience. We first review how experience, in particular musical experience, brings about changes in our ability to fuse together sensory information about the world. We next discuss evidence from drumming studies that demonstrate how the perception of audiovisual synchrony depends on experience. These studies show that drummers are more robust than novices to perturbations of the audiovisual signals and appear to use different neural mechanisms in fusing sight and sound. Finally, we examine how experience influences audiovisual speech perception. We present an experiment investigating how perceiving an unfamiliar language influences judgments of temporal synchrony of the audiovisual speech signal. These results highlight the influence of both the listener's experience with hearing an unfamiliar language as well as the speaker's experience with producing non-native words.


NeuroImage | 2011

Corrigendum to "Action expertise reduces brain activity for audiovisual matching actions: An fMRI study with expert drummers" [NeuroImage 56/3 (2011) 1480-1492]

Karin Petrini; Frank E. Pollick; Sofia Dahl; Phil McAleer; Lawrie S. McKay; Davide Rocchesso; Carl Haakon Waadeland; Scott A. Love; Federico Avanzini; Aina Puce

a. Department of Psychology, University of Glasgow, Glasgow, Scotland, UK
b. Department of Media Technology, Aalborg University Copenhagen, Copenhagen, Denmark
c. Netherlands Institute for Neuroscience (NIN), Amsterdam, The Netherlands
d. Department of Art and Industrial Design, IUAV University of Venice, Venice, Italy
e. Department of Music, Norwegian University of Science and Technology, Trondheim, Norway
f. Department of Information Engineering, University of Padua, Padua, Italy
g. Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA


Cerebral Cortex | 2018

Left Brain Asymmetry of the Planum Temporale in a Nonhominid Primate: Redefining the Origin of Brain Specialization for Language

Damien Marie; Muriel Roth; Romain Lacoste; Bruno Nazarian; Alice Bertello; Jean-Luc Anton; William D. Hopkins; Konstantina Margiotoudi; Scott A. Love; Adrien Meguerditchian

The planum temporale (PT) is a critical region of the language functional network in the human brain, showing a striking size asymmetry toward the left hemisphere. Historically considered as a structural landmark of the left-brain specialization for language, a similar anatomical bias has been described in great apes but never in monkeys, indicating that this brain landmark might be unique to Hominidae evolution. In the present in vivo magnetic resonance imaging study, we show clearly for the first time in a nonhominid primate species, an Old World monkey, a left size predominance of the PT among 96 olive baboons (Papio anubis), using manual delineation of this region in each individual hemisphere. This asymmetric distribution was quasi-identical to that found originally in humans. Such a finding questions the relationship between PT asymmetry and the emergence of language, indicating that the origin of this cerebral specialization could be much older than previously thought, dating back, not to the Hominidae, but rather to the Catarrhini evolution at the common ancestor of humans, great apes and Old World monkeys, 30-40 million years ago.


Frontiers in Human Neuroscience | 2018

Overlapping but Divergent Neural Correlates Underpinning Audiovisual Synchrony and Temporal Order Judgments

Scott A. Love; Karin Petrini; Cyril Pernet; Marianne Latinus; Frank E. Pollick

Multisensory processing is a core perceptual capability, and the need to understand its neural bases provides a fundamental problem in the study of brain function. Both synchrony and temporal order judgments are commonly used to investigate synchrony perception between different sensory cues and multisensory perception in general. However, extensive behavioral evidence indicates that these tasks do not measure identical perceptual processes. Here we used functional magnetic resonance imaging to investigate how behavioral differences between the tasks are instantiated as neural differences. As these neural differences could manifest at either the sustained (task/state-related) and/or transient (event-related) levels of processing, a mixed block/event-related design was used to investigate the neural response of both time-scales. Clear differences in both sustained and transient BOLD responses were observed between the two tasks, consistent with behavioral differences indeed arising from overlapping but divergent neural mechanisms. Temporal order judgments, but not synchrony judgments, required transient activation in several left hemisphere regions, which may reflect increased task demands caused by an extra stage of processing. Our results highlight that multisensory integration mechanisms can be task dependent, which, in particular, has implications for the study of atypical temporal processing in clinical populations.


Archive | 2015

Neural Bases for Social Attention in Healthy Humans

Aina Puce; Marianne Latinus; Alejandra Rossi; Elizabeth daSilva; Francisco J. Parada; Scott A. Love; Arian Ashourvan; Swapnaa Jayaraman

In this chapter we focus on the neural processes that occur in the mature healthy human brain in response to evaluating another's social attention. We first examine the brain's sensitivity to gaze direction of others, social attention (as typically indicated by gaze contact), and joint attention. Brain regions such as the superior temporal sulcus (STS), the amygdala, and the fusiform gyrus have been previously demonstrated to be sensitive to gaze changes, most frequently with functional magnetic resonance imaging (fMRI). Neurophysiological investigations, using electroencephalography (EEG) and magnetoencephalography (MEG), have identified event-related potentials (ERPs) such as the N170 that are sensitive to changes in gaze direction and head direction. We advance a putative model that explains findings relating to the neurophysiology of social attention, based mainly on our studies. This model proposes two brain modes of social information processing: a nonsocial "Default" mode and a social mode that we have named "Socially Aware". In Default mode, there is an internal focus on executing actions to achieve our goals, as evident in studies in which passive viewing or tasks involving nonsocial judgments have been used. In contrast, Socially Aware mode is active when making explicit social judgments. Switching between these two modes is rapid and can occur via either top-down or bottom-up routes. From a different perspective, most of the literature, including our own studies, has focused on social attention phenomena as experienced from the first-person perspective, i.e., gaze changes or social attention directed at, or away from, the observer. However, in daily life we are actively involved in observing social interactions between others, where their social attention focus may not include us, or their gaze may not meet ours. Hence, changes in eye gaze and social attention are experienced from the third-person perspective.
This area of research is still fairly small, but nevertheless important in the study of social and joint attention, and we discuss this very small literature briefly at the end of the chapter. We conclude the chapter with some outstanding questions, which are aimed at the main knowledge gaps in the literature.


NeuroImage | 2011

Action expertise reduces brain activity for audiovisual matching actions: An fMRI study with expert drummers

Karin Petrini; Frank E. Pollick; Sofia Dahl; Phil McAleer; Lawrie S. McKay; Davide Rocchesso; Carl Haakon Waadeland; Scott A. Love; Federico Avanzini; Aina Puce

Collaboration


Dive into Scott A. Love's collaborations.

Top Co-Authors

Karin Petrini

University College London

Alice Bertello

École Normale Supérieure


Bruno Nazarian

Aix-Marseille University


Damien Marie

Aix-Marseille University


Jean-Luc Anton

Aix-Marseille University


Muriel Roth

Aix-Marseille University


Romain Lacoste

Centre national de la recherche scientifique


Olivier Coulon

Aix-Marseille University
