Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Asif A. Ghazanfar is active.

Publication


Featured research published by Asif A. Ghazanfar.


Trends in Cognitive Sciences | 2006

Is neocortex essentially multisensory?

Asif A. Ghazanfar; Charles E. Schroeder

Although sensory perception and neurobiology are traditionally investigated one modality at a time, real-world behaviour and perception are driven by the integration of information from multiple sensory sources. Mounting evidence suggests that the neural underpinnings of multisensory integration extend into early sensory processing. This article examines the notion that neocortical operations are essentially multisensory. We first review what is known about multisensory processing in higher-order association cortices and then discuss recent anatomical and physiological findings in presumptive unimodal sensory areas. The pervasiveness of multisensory influences on all levels of cortical processing compels us to reconsider the habit of describing neural processing in unisensory terms. Indeed, the multisensory nature of most, possibly all, of the neocortex forces us to abandon the notion that the senses ever operate independently during real-world cognition.


The Journal of Neuroscience | 2005

Multisensory Integration of Dynamic Faces and Voices in Rhesus Monkey Auditory Cortex

Asif A. Ghazanfar; Joost X. Maier; Kari L. Hoffman; Nikos K. Logothetis

In the social world, multiple sensory channels are used concurrently to facilitate communication. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals (Adolphs, 2003; Ghazanfar and Santos, 2004). Primates recognize the correspondence between species-specific facial and vocal expressions (Massaro, 1998; Ghazanfar and Logothetis, 2003; Izumi and Kojima, 2004), and these visual and auditory channels can be integrated into unified percepts to enhance detection and discrimination. Where and how such communication signals are integrated at the neural level are poorly understood. In particular, it is unclear what role “unimodal” sensory areas, such as the auditory cortex, may play. We recorded local field potential activity, the signal that best correlates with human imaging and event-related potential signals, in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys while they viewed vocalizing conspecifics. We demonstrate unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions. The majority of these multisensory responses were specific to face/voice integration, and the lateral belt region showed a greater frequency of multisensory integration than the core region. These multisensory processes in the auditory cortex likely occur via reciprocal interactions with the superior temporal sulcus.
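
The enhancement and suppression described above are typically quantified with a simple index. Below is a minimal Python sketch of one common convention, the percent change of the face-plus-voice (AV) response relative to the strongest unisensory response; the formula and the numerical values are illustrative assumptions, not necessarily the paper's exact criterion.

```python
def enhancement_index(resp_av, resp_a, resp_v):
    """Percent change of the audiovisual (AV) response relative to the
    strongest unisensory response. Positive values indicate multisensory
    enhancement, negative values suppression. A common convention in the
    multisensory literature, not necessarily this paper's exact criterion."""
    best_unisensory = max(resp_a, resp_v)
    return 100.0 * (resp_av - best_unisensory) / abs(best_unisensory)

# Hypothetical trial-averaged LFP response magnitudes (arbitrary units):
print(enhancement_index(resp_av=12.0, resp_a=8.0, resp_v=3.0))  # +50.0 -> enhanced
print(enhancement_index(resp_av=5.0, resp_a=8.0, resp_v=3.0))   # -37.5 -> suppressed
```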


Neuron | 1997

Reconstructing the Engram: Simultaneous, Multisite, Many Single Neuron Recordings

Miguel A. L. Nicolelis; Asif A. Ghazanfar; Barbara M. Faggin; Scott V. Votaw; Laura Oliveira

Little is known about the physiological principles that govern large-scale neuronal interactions in the mammalian brain. Here, we describe an electrophysiological paradigm capable of simultaneously recording the extracellular activity of large populations of single neurons, distributed across multiple cortical and subcortical structures in behaving and anesthetized animals. Up to 100 neurons were simultaneously recorded after 48 microwires were implanted in the brain stem, thalamus, and somatosensory cortex of rats. Overall, 86% of the implanted microwires yielded single neurons, and an average of 2.3 neurons were discriminated per microwire. Our population recordings remained stable for weeks, demonstrating that this method can be employed to investigate the dynamic and distributed neuronal ensemble interactions that underlie processes such as sensory perception, motor control, and sensorimotor learning in freely behaving animals.
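
The reported yield figures are internally consistent, as a quick back-of-the-envelope check in Python shows (all numbers taken directly from the abstract):

```python
# Back-of-the-envelope check of the recording yield reported above.
implanted_wires = 48
fraction_yielding = 0.86   # 86% of implanted microwires yielded single neurons
neurons_per_wire = 2.3     # average neurons discriminated per yielding microwire

expected_neurons = implanted_wires * fraction_yielding * neurons_per_wire
print(f"expected ensemble size: {expected_neurons:.0f} neurons")  # ~95, i.e. "up to 100"
```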


Trends in Cognitive Sciences | 2012

Brain-to-brain coupling: a mechanism for creating and sharing a social world.

Uri Hasson; Asif A. Ghazanfar; Bruno Galantucci; Simon Garrod; Christian Keysers

Cognition materializes in an interpersonal space. The emergence of complex behaviors requires the coordination of actions among individuals according to a shared set of rules. Despite the central role of other individuals in shaping one's mind, most cognitive studies focus on processes that occur within a single individual. We call for a shift from a single-brain to a multi-brain frame of reference. We argue that in many cases the neural processes in one brain are coupled to the neural processes in another brain via the transmission of a signal through the environment. Brain-to-brain coupling constrains and shapes the actions of each individual in a social network, leading to complex joint behaviors that could not have emerged in isolation.


PLOS Computational Biology | 2009

The Natural Statistics of Audiovisual Speech

Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A. Ghazanfar

Humans, like other animals, are exposed to a continuous stream of signals, which are dynamic, multimodal, extended, and time-varying in nature. This complex input space must be transduced and sampled by our sensory systems and transmitted to the brain where it can guide the selection of appropriate actions. To simplify this process, it has been suggested that the brain exploits statistical regularities in the stimulus space. Tests of this idea have largely been confined to unimodal signals and natural scenes. One important class of multisensory signals for which a quantitative input space characterization is unavailable is human speech. We do not understand what signals our brain has to actively piece together from an audiovisual speech stream to arrive at a percept versus what is already embedded in the signal structure of the stream itself. In essence, we do not have a clear understanding of the natural statistics of audiovisual speech. In the present study, we identified the following major statistical features of audiovisual speech. First, we observed robust correlations and close temporal correspondence between the area of the mouth opening and the acoustic envelope. Second, we found the strongest correlation between the area of the mouth opening and vocal tract resonances. Third, we observed that both area of the mouth opening and the voice envelope are temporally modulated in the 2–7 Hz frequency range. Finally, we show that the timing of mouth movements relative to the onset of the voice is consistently between 100 and 300 ms. We interpret these data in the context of recent neural theories of speech which suggest that speech communication is a reciprocally coupled, multisensory event, whereby the outputs of the signaler are matched to the neural processes of the receiver.
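
The measurements described above (envelope extraction, 2–7 Hz modulation, and the mouth-to-voice lag) are straightforward to sketch. The Python outline below assumes the mouth-area and audio-envelope signals have already been resampled to a common rate and length; the function names and filter settings are our own choices, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import butter, correlate, filtfilt, hilbert

def acoustic_envelope(audio, fs, cutoff_hz=10.0):
    """Amplitude envelope via the Hilbert transform, then low-pass filtered."""
    env = np.abs(hilbert(audio))
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, env)

def modulation_2_7hz(x, fs):
    """Isolate the 2-7 Hz modulations reported for mouth area and voice."""
    b, a = butter(4, [2.0 / (fs / 2), 7.0 / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def peak_lag_ms(mouth_area, envelope, fs):
    """Lag (ms) at which the mouth-area/envelope cross-correlation peaks;
    the paper reports mouth movements leading the voice by 100-300 ms."""
    m = mouth_area - mouth_area.mean()
    e = envelope - envelope.mean()
    xcorr = correlate(m, e, mode="full")
    lags = np.arange(-len(m) + 1, len(m))
    return 1000.0 * lags[np.argmax(xcorr)] / fs
```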


Nature Neuroscience | 1998

Simultaneous encoding of tactile information by three primate cortical areas

Miguel A. L. Nicolelis; Asif A. Ghazanfar; Christopher R. Stambaugh; Laura Oliveira; Mark Laubach; John K. Chapin; Randall J. Nelson; Jon H. Kaas

We used simultaneous multi-site neural ensemble recordings to investigate the representation of tactile information in three areas of the primate somatosensory cortex (areas 3b, SII and 2). Small neural ensembles (30–40 neurons) of broadly tuned somatosensory neurons were able to correctly identify the location of a single tactile stimulus on a single trial, almost simultaneously. Furthermore, each of these cortical areas could use different combinations of encoding strategies, such as mean firing rate (areas 3b and 2) or temporal patterns of ensemble firing (area SII), to represent the location of a tactile stimulus. Based on these results, we propose that ensembles of broadly tuned neurons, located in three distinct areas of the primate somatosensory cortex, obtain information about the location of a tactile stimulus almost concurrently.
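
Single-trial location decoding from ensemble firing rates of the kind reported here can be illustrated with a nearest-centroid classifier. The Python sketch below runs on synthetic Poisson spike counts; the neuron counts, tuning model, and decoder are illustrative stand-ins, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's data: each trial is a vector of firing
# rates across a small ensemble (~35 neurons, cf. the 30-40 reported), and
# each trial is labeled with the stimulated skin location.
n_locations, trials_per_loc, n_neurons = 4, 50, 35
preferred = np.arange(n_neurons) % n_locations  # each neuron weakly prefers one site
rates = np.vstack([
    rng.poisson(lam=5 + 3 * (preferred == loc), size=(trials_per_loc, n_neurons))
    for loc in range(n_locations)
]).astype(float)
labels = np.repeat(np.arange(n_locations), trials_per_loc)

def decode(train_x, train_y, test_x):
    """Nearest-centroid rule: assign each held-out trial to the location
    whose mean ensemble response it most resembles."""
    centroids = np.stack([train_x[train_y == c].mean(axis=0)
                          for c in np.unique(train_y)])
    dists = np.linalg.norm(test_x[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Even trials train, odd trials test (a crude but honest split).
idx = np.arange(len(labels))
pred = decode(rates[idx % 2 == 0], labels[idx % 2 == 0], rates[idx % 2 == 1])
print("single-trial accuracy:", (pred == labels[idx % 2 == 1]).mean())
```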


Trends in Cognitive Sciences | 2009

The emergence of multisensory systems through perceptual narrowing

David J. Lewkowicz; Asif A. Ghazanfar

According to conventional wisdom, multisensory development is a progressive process that results in the growth and proliferation of perceptual skills. We review new findings indicating that a regressive process - perceptual narrowing - also contributes in critical ways to perceptual development. These new data reveal that young infants are able to integrate non-native faces and vocalizations, that this broad multisensory perceptual tuning is present at birth, and that this tuning narrows by the end of the first year of life, leaving infants with the ability to integrate only socio-ecologically-relevant multisensory signals. This narrowing process forces us to reconsider the traditional progressive theories of multisensory development and opens up several new evolutionary questions as well.


The Journal of Neuroscience | 2008

Interactions between the superior temporal sulcus and auditory cortex mediate dynamic face/voice integration in rhesus monkeys

Asif A. Ghazanfar; Chandramouli Chandrasekaran; Nikos K. Logothetis

The existence of multiple nodes in the cortical network that integrate faces and voices suggests that they may be interacting and influencing each other during communication. To test the hypothesis that multisensory responses in auditory cortex are influenced by visual inputs from the superior temporal sulcus (STS), an association area, we recorded local field potentials and single neurons from both structures concurrently in monkeys. The functional interactions between the auditory cortex and the STS, as measured by spectral analyses, increased in strength during presentations of dynamic faces and voices relative to either communication signal alone. These interactions were not solely modulations of response strength, because the phase relationships were significantly less variable in the multisensory condition as well. An analogous analysis of functional interactions within the auditory cortex revealed no such stimulus-dependent interactions, nor did a control condition in which the dynamic face was replaced with a dynamic disk mimicking mouth movements. Single neuron data revealed that these intercortical interactions were reflected in the spiking output of auditory cortex and that such spiking output was coordinated with oscillations in the STS. The vast majority of single neurons that were responsive to voices showed integrative responses when faces, but not control stimuli, were presented in conjunction. Our data suggest that the integration of faces and voices is mediated at least in part by neuronal cooperation between auditory cortex and the STS, and that interactions between these structures are a fast and efficient way of dealing with multisensory communication signals.
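
One standard way to quantify the reduced phase variability described above is the phase-locking value (PLV) between bandpass-filtered LFPs from the two areas. The Python sketch below is a generic implementation of that measure, not a reproduction of the paper's spectral analysis, and the frequency band is a placeholder.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(lfp_a, lfp_b, fs, band_hz=(8.0, 14.0)):
    """Phase-locking value between two LFP channels within a frequency band.
    A PLV near 1 means the phase difference between the two signals is
    consistent over time (low phase variability); near 0 means it varies
    randomly. The band edges here are placeholders, not the paper's."""
    b, a = butter(4, [band_hz[0] / (fs / 2), band_hz[1] / (fs / 2)], btype="band")
    phase_a = np.angle(hilbert(filtfilt(b, a, lfp_a)))
    phase_b = np.angle(hilbert(filtfilt(b, a, lfp_b)))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))
```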


Current Biology | 2007

Vocal-tract resonances as indexical cues in rhesus monkeys.

Asif A. Ghazanfar; Hjalmar K. Turesson; Joost X. Maier; Ralph van Dinther; Roy D. Patterson; Nikos K. Logothetis

Vocal-tract resonances (or formants) are acoustic signatures in the voice and are related to the shape and length of the vocal tract. Formants play an important role in human communication, helping us not only to distinguish several different speech sounds [1], but also to extract important information related to the physical characteristics of the speaker, so-called indexical cues. How did formants come to play such an important role in human vocal communication? One hypothesis suggests that the ancestral role of formant perception—a role that might be present in extant nonhuman primates—was to provide indexical cues [2–5]. Although formants are present in the acoustic structure of vowel-like calls of monkeys [3–8] and implicated in the discrimination of call types [8–10], it is not known whether they use this feature to extract indexical cues. Here, we investigate whether rhesus monkeys can use the formant structure in their “coo” calls to assess the age-related body size of conspecifics. Using a preferential-looking paradigm [11, 12] and synthetic coo calls in which formant structure simulated an adult/large- or juvenile/small-sounding individual, we demonstrate that untrained monkeys attend to formant cues and link large-sounding coos to large faces and small-sounding coos to small faces—in essence, they can, like humans [13], use formants as indicators of age-related body size.
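
The acoustic link between formants and body size can be made concrete with the standard uniform-tube model: a tract of length L, closed at the glottis and open at the lips, resonates at odd quarter wavelengths, F_n = (2n - 1) * c / (4L), so a longer tract produces uniformly lower formants. A toy Python calculation follows; the tract lengths are hypothetical, and the paper's synthetic coos were generated with call-synthesis methods rather than this idealized tube.

```python
SPEED_OF_SOUND_CM_S = 35000.0  # approx. speed of sound in warm, humid air (cm/s)

def tube_formants(tract_length_cm, n_formants=3):
    """Resonances of a uniform tube closed at one end, open at the other:
    F_n = (2n - 1) * c / (4 * L)."""
    return [(2 * k - 1) * SPEED_OF_SOUND_CM_S / (4.0 * tract_length_cm)
            for k in range(1, n_formants + 1)]

# Hypothetical vocal-tract lengths for a small and a large monkey:
print(tube_formants(5.5))  # shorter tract -> higher formants ("small-sounding")
print(tube_formants(8.0))  # longer tract  -> lower formants  ("large-sounding")
```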


Nature Reviews Neuroscience | 2004

Primate brains in the wild: the sensory bases for social interactions.

Asif A. Ghazanfar; Laurie R. Santos

Each organism in the animal kingdom has evolved to detect and process a specific set of stimuli in its environment. Studies of an animal's socioecology can help us to identify these stimuli, as well as the natural behavioural responses that they evoke and control. Primates are no exception, but many of our specializations are in the social domain. How did the human brain come to be so exquisitely tuned to social interactions? Only a comparative approach will provide the answer. Behavioural studies are shedding light on the sensory bases for non-human primate social interactions, and data from these studies are paving the way for investigations into the neural bases of sociality.

Collaboration


Dive into Asif A. Ghazanfar's collaborations.
