Kari L. Hoffman
York University
Publications
Featured research published by Kari L. Hoffman.
The Journal of Neuroscience | 2005
Asif A. Ghazanfar; Joost X. Maier; Kari L. Hoffman; Nikos K. Logothetis
In the social world, multiple sensory channels are used concurrently to facilitate communication. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals (Adolphs, 2003; Ghazanfar and Santos, 2004). Primates recognize the correspondence between species-specific facial and vocal expressions (Massaro, 1998; Ghazanfar and Logothetis, 2003; Izumi and Kojima, 2004), and these visual and auditory channels can be integrated into unified percepts to enhance detection and discrimination. Where and how such communication signals are integrated at the neural level are poorly understood. In particular, it is unclear what role “unimodal” sensory areas, such as the auditory cortex, may play. We recorded local field potential activity, the signal that best correlates with human imaging and event-related potential signals, in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys while they viewed vocalizing conspecifics. We demonstrate unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions. The majority of these multisensory responses were specific to face/voice integration, and the lateral belt region shows a greater frequency of multisensory integration than the core region. These multisensory processes in the auditory cortex likely occur via reciprocal interactions with the superior temporal sulcus.
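The integration measure described here reduces to comparing the bimodal (face + voice) LFP response against the strongest unimodal response, with positive values indicating enhancement and negative values suppression. A minimal sketch of such an index; the variable names and toy numbers are illustrative, not taken from the paper:

```python
import numpy as np

def multisensory_index(face_voice, face, voice):
    """Percent change of the bimodal LFP response relative to the strongest
    unimodal response (positive = enhancement, negative = suppression).
    Inputs are trial-averaged response magnitudes."""
    best_unimodal = max(face, voice)
    return 100.0 * (face_voice - best_unimodal) / abs(best_unimodal)

# toy trial-averaged response magnitudes (arbitrary units)
rng = np.random.default_rng(0)
fv = abs(rng.normal(1.3, 0.1))   # face + voice
f = abs(rng.normal(0.4, 0.1))    # face alone
v = abs(rng.normal(1.0, 0.1))    # voice alone
print(f"multisensory index: {multisensory_index(fv, f, v):+.1f}%")
```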
Current Biology | 2007
Kari L. Hoffman; Katalin M. Gothard; Michael Schmid; Nikos K. Logothetis
The social behavior of both human and nonhuman primates relies on specializations for the recognition of individuals, their facial expressions, and their direction of gaze. A broad network of cortical and subcortical structures has been implicated in face processing, yet it is unclear whether co-occurring dimensions of face stimuli, such as expression and direction of gaze, are processed jointly or independently by anatomically and functionally segregated neural structures. Awake macaques were presented with a set of monkey faces displaying aggressive, neutral, and appeasing expressions with head and eyes either averted or directed. BOLD responses to these faces as compared to Fourier-phase-scrambled images revealed widespread activation of the superior temporal sulcus and inferotemporal cortex and included activity in the amygdala. The different dimensions of the face stimuli elicited distinct activation patterns among the amygdaloid nuclei. The basolateral amygdala, including the lateral, basal, and accessory basal nuclei, produced a stronger response for threatening than appeasing expressions. The central nucleus and bed nucleus of the stria terminalis responded more to averted than directed-gaze faces. Independent behavioral measures confirmed that faces with averted gaze were more arousing, suggesting the activity in the central nucleus may be related to attention and arousal.
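The control stimuli are Fourier-phase-scrambled images, which preserve each image's amplitude spectrum (and hence its overall luminance and spatial-frequency content) while removing recognizable facial structure. A minimal sketch of phase scrambling for a grayscale array, assuming nothing about the paper's actual stimulus-generation code:

```python
import numpy as np

def phase_scramble(image, seed=0):
    """Randomize the Fourier phase of a 2-D grayscale image while keeping
    its amplitude spectrum intact."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.fft2(image)
    amplitude = np.abs(spectrum)
    # phases of a real-valued noise image keep the required Hermitian symmetry
    random_phase = np.angle(np.fft.fft2(rng.standard_normal(image.shape)))
    return np.real(np.fft.ifft2(amplitude * np.exp(1j * random_phase)))

# toy "image": a random grayscale array standing in for a face stimulus
img = np.random.default_rng(1).random((256, 256))
scr = phase_scramble(img)
# the amplitude spectrum is unchanged, only the phase (structure) is destroyed
print(np.allclose(np.abs(np.fft.fft2(img)), np.abs(np.fft.fft2(scr))))
```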
The Journal of Neuroscience | 2007
Kari L. Hoffman; Francesco P. Battaglia; Kenneth D. M. Harris; Jason N. MacLean; Lisa Marshall; Mayank R. Mehta
A sleeping brain is by no means dormant: most cortical neurons, primarily detached from the influence of stimuli in the environment, are nevertheless active, just as they are during behavior. Although neural activity is preserved during sleep, the structure of the activity changes significantly …
Philosophical Transactions of the Royal Society B | 2009
Kari L. Hoffman; Nikos K. Logothetis
Learning about the world through our senses constrains our ability to recognise our surroundings. Experience shapes perception. What is the neural basis for object recognition and how are learning-induced changes in recognition manifested in neural populations? We consider first the location of neurons that appear to be critical for object recognition, before describing what is known about their function. Two complementary processes of object recognition are considered: discrimination among diagnostic object features and generalization across non-diagnostic features. Neural plasticity appears to underlie the development of discrimination and generalization for a given set of features, though tracking these changes directly over the course of learning has remained an elusive task.
Trends in Neurosciences | 2002
Kari L. Hoffman; Bruce L. McNaughton
Sleep can facilitate memory formation, but its role in cortical plasticity is poorly understood. A recent study found that sleep, following monocular deprivation (MD), facilitated cortical changes in ocular dominance. The magnitude of plasticity was similar to that observed after continued MD, and larger than that seen after sleep deprivation in darkness, suggesting that sleep per se enables mechanisms of cortical plasticity. Experience-dependent plasticity during sleep could be part of a more global process of memory consolidation.
Frontiers in Systems Neuroscience | 2013
Kari L. Hoffman; Michelle C. Dragan; Timothy K. Leonard; Cristiano Micheli; Rodrigo Montefusco-Siegmund; Taufik A. Valiante
Visual exploration in primates depends on saccadic eye movements (SEMs) that cause alternations of neural suppression and enhancement. This modulation extends beyond retinotopic areas, and is thought to facilitate perception; yet saccades may also influence brain regions critical for forming memories of these exploratory episodes. The hippocampus, for example, shows oscillatory activity that is generally associated with encoding of information. Whether or how hippocampal oscillations are influenced by eye movements is unknown. We recorded the neural activity in the human and macaque hippocampus during visual scene search. Across species, SEMs were associated with a time-limited alignment of a low-frequency (3–8 Hz) rhythm. The phase alignment depended on the task and not only on eye movements per se, and the frequency band was not a direct consequence of saccade rate. Hippocampal theta-frequency oscillations are produced by other mammals during repetitive exploratory behaviors, including whisking, sniffing, echolocation, and locomotion. The present results may reflect a similar yet distinct primate homologue supporting active perception during exploration.
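The central measurement is whether the phase of the 3–8 Hz hippocampal rhythm clusters around saccade onsets. A minimal sketch of saccade-locked phase clustering on a band-pass-filtered LFP; the sampling rate, filter order, toy signal, and saccade times are all illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                  # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)                 # 60 s of toy LFP
lfp = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)

# band-pass 3-8 Hz and extract the instantaneous phase
b, a = butter(3, [3, 8], btype="bandpass", fs=fs)
phase = np.angle(hilbert(filtfilt(b, a, lfp)))

# toy saccade-onset times (s); in the study these come from eye tracking
saccade_times = np.arange(1.0, 59.0, 0.35)
idx = (saccade_times * fs).astype(int)

# inter-trial phase clustering (resultant vector length) at saccade onset
itpc = np.abs(np.mean(np.exp(1j * phase[idx])))
print(f"phase clustering at saccade onset: {itpc:.2f} (0 = uniform, 1 = perfect)")
```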
Behavioral Neuroscience | 2012
Timothy K. Leonard; Galit Blumenthal; Katalin M. Gothard; Kari L. Hoffman
The pattern of visual fixations on an image depends not only on the image content but also on the viewer's disposition and on the function (or pathology) of underlying neural circuitry. For example, human viewers display changes in viewing patterns toward face images that differ in gaze direction or in the viewer's familiarity with the face. Macaques share many face processing abilities with humans, and their neural circuitry is used to understand perception across species, yet their viewing responses to the gaze and familiarity of faces are poorly understood. In this study, rhesus macaques passively viewed faces of familiar and unfamiliar conspecifics whose head-and-eye gaze was directed either toward or away from the viewing monkey. The eyes of faces were viewed more than any other feature; furthermore, familiar eyes were viewed more than unfamiliar eyes. In contrast, ears, though not as salient as eyes, were viewed about twice as often for unfamiliar faces as for familiar faces. Directed-gaze eyes were fixated earlier, and for a greater proportion of saccades, than were the eyes of averted-gaze faces, suggesting that mutual gaze attracts a more immediate and sustained scanning of the eyes. Ears and external features were more salient for averted, as compared with directed, gaze. In general, effects were more robust (within and across subjects) for the gaze contrast than for the familiarity contrast, perhaps as a consequence of the greater image-based differences for the gaze than for the familiarity stimuli used in this study.
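The viewing measures largely reduce to the fraction of fixations landing in feature regions of interest (eyes, ears, and so on), compared across familiarity and gaze conditions. A minimal sketch of such an ROI tally; the ROI boxes and fixation coordinates below are invented for illustration:

```python
import numpy as np

# rectangular ROIs in image pixel coordinates: (x_min, x_max, y_min, y_max)
rois = {"eyes": (80, 180, 60, 110), "ears": (10, 60, 70, 150)}

def roi_fixation_proportions(fixations, rois):
    """Fraction of fixations falling inside each ROI.
    `fixations` is an (N, 2) array of (x, y) positions."""
    props = {}
    for name, (x0, x1, y0, y1) in rois.items():
        inside = ((fixations[:, 0] >= x0) & (fixations[:, 0] <= x1) &
                  (fixations[:, 1] >= y0) & (fixations[:, 1] <= y1))
        props[name] = inside.mean()
    return props

# toy fixation scan path for one trial
fix = np.random.default_rng(2).uniform([0, 0], [256, 256], size=(40, 2))
print(roi_fixation_proportions(fix, rois))
```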
The Journal of Neuroscience | 2015
Timothy K. Leonard; Jonathan M. Mikkila; Emad N. Eskandar; Jason L. Gerrard; Daniel Kaping; Shaun R. Patel; Thilo Womelsdorf; Kari L. Hoffman
Hippocampal sharp-wave ripples (SWRs) are highly synchronous oscillatory field potentials that are thought to facilitate memory consolidation. SWRs typically occur during quiescent states, when neural activity reflecting recent experience is replayed. In rodents, SWRs also occur during brief locomotor pauses in maze exploration, where they appear to support learning during experience. In this study, we detected SWRs that occurred during quiescent states, but also during goal-directed visual exploration in nonhuman primates (Macaca mulatta). The exploratory SWRs showed peak frequency bands similar to those of quiescent SWRs, and both types were inhibited at the onset of their respective behavioral epochs. In apparent contrast to rodent SWRs, these exploratory SWRs occurred during active periods of exploration, e.g., while animals searched for a target object in a scene. SWRs were associated with smaller saccades and longer fixations. Also, when they coincided with target-object fixations during search, detection was more likely than when these events were decoupled. Although we observed high gamma-band field potentials of similar frequency to SWRs, only the SWRs accompanied greater spiking synchrony in neural populations. These results reveal that SWRs are not limited to off-line states as conventionally defined; rather, they occur during active and informative performance windows. The exploratory SWR in primates is an infrequent occurrence associated with active, attentive performance, which may indicate a new, extended role of SWRs during exploration in primates.

SIGNIFICANCE STATEMENT Sharp-wave ripples (SWRs) are high-frequency oscillations that generate highly synchronized activity in neural populations. Their prevalence in sleep and quiet wakefulness, and the memory deficits that result from their interruption, suggest that SWRs contribute to memory consolidation during rest. Here, we report that SWRs from the monkey hippocampus occur not only during behavioral inactivity but also during successful visual exploration. SWRs were associated with attentive, focal search and appeared to enhance perception of locations viewed around the time of their occurrence. SWRs occurring in rest are noteworthy for their relation to heightened neural population activity, temporally precise and widespread synchronization, and memory consolidation; therefore, the SWRs reported here may have a similar effect on neural populations, even as experiences unfold.
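SWR detection is conventionally done by band-pass filtering the LFP in the ripple band and thresholding the smoothed envelope. A minimal sketch of this generic approach; the band limits, threshold, and duration criterion are common defaults, not the paper's exact parameters:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_swr(lfp, fs, band=(100, 250), thresh_sd=3.0, min_dur=0.02):
    """Return (start, stop) sample indices of candidate sharp-wave ripples:
    epochs where the ripple-band envelope exceeds `thresh_sd` standard
    deviations above its mean for at least `min_dur` seconds."""
    b, a = butter(3, band, btype="bandpass", fs=fs)
    envelope = np.abs(hilbert(filtfilt(b, a, lfp)))
    above = envelope > envelope.mean() + thresh_sd * envelope.std()
    edges = np.flatnonzero(np.diff(above.astype(int)))
    # pair up rising/falling edges, padding the ends if needed
    if above[0]:
        edges = np.r_[0, edges]
    if above[-1]:
        edges = np.r_[edges, above.size - 1]
    starts, stops = edges[::2], edges[1::2]
    keep = (stops - starts) / fs >= min_dur
    return list(zip(starts[keep], stops[keep]))

# toy LFP: noise with a brief 180 Hz burst injected at 0.5 s
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
lfp = np.random.default_rng(3).standard_normal(t.size)
burst = (t > 0.5) & (t < 0.55)
lfp[burst] += 5 * np.sin(2 * np.pi * 180 * t[burst])
print(detect_swr(lfp, fs))
```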
Proceedings of the National Academy of Sciences of the United States of America | 2012
Hjalmar K. Turesson; Nikos K. Logothetis; Kari L. Hoffman
Object perception and categorization can occur so rapidly that behavioral responses precede or co-occur with the firing rate changes in the object-selective neocortex. Phase coding could, in principle, support rapid representation of object categories, whereby the first spikes evoked by a stimulus would appear at different phases of an oscillation, depending on the object category. To determine whether object-selective regions of the neocortex demonstrate phase coding, we presented images of faces and objects to two monkeys while recording local field potentials (LFP) and single-unit activity from object-selective regions in the upper bank of the superior temporal sulcus. Single units showed preferred phases of firing that depended on stimulus category, emerging with the initiation of spiking responses after stimulus onset. Differences in phase of firing were seen below 20 Hz and in the gamma and high-gamma frequency ranges. For all but the <20-Hz cluster, phase differences remained category-specific even when controlling for stimulus-locked activity, revealing that phase-specific firing is not a simple consequence of category-specific differences in the evoked responses of the LFP. In addition, we tested for firing rate-to-phase conversion. Category-specific differences in firing rates accounted for 30–40% of the explained variance in phase at lower frequencies (<20 Hz) during the initial response, but for less than 20% of the explained variance in the 30- to 60-Hz frequency range, suggesting that gamma phase-of-firing effects reflect more than evoked LFP and firing rate responses. The present results are consistent with theoretical models of rapid object processing and extend previous observations of phase coding to include object-selective neocortex.
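The phase-of-firing analysis assigns each spike the instantaneous phase of the band-limited LFP and asks whether the preferred phase differs between stimulus categories. A minimal sketch with an illustrative filter band and synthetic spike trains; the circular statistics shown are generic, not the paper's exact pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def spike_phases(lfp, spike_idx, fs, band):
    """Instantaneous phase of the band-limited LFP at each spike sample."""
    b, a = butter(3, band, btype="bandpass", fs=fs)
    phase = np.angle(hilbert(filtfilt(b, a, lfp)))
    return phase[spike_idx]

def preferred_phase(phases):
    """Circular mean and resultant vector length of spike phases."""
    r = np.mean(np.exp(1j * phases))
    return np.angle(r), np.abs(r)

# toy data: one LFP trace and two spike trains (e.g., face vs. object trials)
fs = 1000.0
rng = np.random.default_rng(4)
lfp = rng.standard_normal(int(10 * fs))
spikes_face = rng.integers(100, lfp.size - 100, size=200)
spikes_obj = rng.integers(100, lfp.size - 100, size=200)

for label, idx in [("faces", spikes_face), ("objects", spikes_obj)]:
    mu, r = preferred_phase(spike_phases(lfp, idx, fs, band=(30, 60)))
    print(f"{label}: preferred phase {np.degrees(mu):+.0f} deg, concentration {r:.2f}")
```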
Journal of Neurophysiology | 2014
Patricia F. Sayegh; Kara M. Hawkins; Bogdan Neagu; J. Douglas Crawford; Kari L. Hoffman; Lauren E. Sergio
Eye-hand coordination is crucial for our ability to interact with the world around us. However, many of the visually guided reaches that we perform require a spatial decoupling between gaze direction and hand orientation. These complex decoupled reaching movements stand in contrast to more standard reaching movements in which the eyes and the hand are coupled. The superior parietal lobule (SPL) receives converging eye and hand signals; however, how activity within this region is modulated during decoupled eye and hand reaches has yet to be understood. To address this, we recorded local field potentials within SPL from two rhesus macaques during coupled vs. decoupled eye and hand movements. Overall, we observed a distinct separation in synchrony within the lower 10- to 20-Hz beta range from that in the higher 30- to 40-Hz gamma range. Specifically, within the early planning phase, beta synchrony dominated; however, the onset of this sustained beta oscillation occurred later during eye-hand decoupled vs. coupled reaches. As the task progressed, there was a switch to low-frequency and gamma-dominated responses, specifically for decoupled reaches. More importantly, we observed local field potential activity to be a stronger predictor of task (coupled vs. decoupled) and state (planning vs. execution) than single-unit activity alone. Our results provide further insight into the computations of SPL for visuomotor transformations and highlight the necessity of accounting for the decoupled eye-hand nature of a motor task when interpreting movement control research data.
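The comparison rests on band-limited LFP power (10–20 Hz beta vs. 30–40 Hz gamma) across conditions, and on using LFP features to predict task and state. A minimal sketch of band-power features feeding a simple cross-validated classifier; the bands, epoch lengths, toy data, and classifier choice are illustrative assumptions, not the paper's analysis:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def band_power(epochs, fs, band):
    """Mean power of band-pass-filtered epochs; `epochs` is (n_trials, n_samples)."""
    b, a = butter(3, band, btype="bandpass", fs=fs)
    return (filtfilt(b, a, epochs, axis=1) ** 2).mean(axis=1)

# toy LFP epochs for two task conditions (coupled vs. decoupled reaches)
fs, n_trials, n_samples = 1000.0, 60, 1000
rng = np.random.default_rng(5)
coupled = rng.standard_normal((n_trials, n_samples))
decoupled = rng.standard_normal((n_trials, n_samples)) * 1.2   # toy power difference

# features: beta and gamma band power per trial; labels: 0 = coupled, 1 = decoupled
X = np.column_stack([
    np.r_[band_power(coupled, fs, (10, 20)), band_power(decoupled, fs, (10, 20))],
    np.r_[band_power(coupled, fs, (30, 40)), band_power(decoupled, fs, (30, 40))],
])
y = np.r_[np.zeros(n_trials), np.ones(n_trials)]
print("cross-validated accuracy:", cross_val_score(LogisticRegression(), X, y, cv=5).mean())
```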