Arjen Alink
Cognition and Brain Sciences Unit
Publication
Featured research published by Arjen Alink.
The Journal of Neuroscience | 2010
Arjen Alink; Caspar M. Schwiedrzik; Axel Kohler; Wolf Singer; Lars Muckli
In this functional magnetic resonance imaging study we tested whether the predictability of stimuli affects responses in primary visual cortex (V1). The results of this study indicate that visual stimuli evoke smaller responses in V1 when their onset or motion direction can be predicted from the dynamics of surrounding illusory motion. We conclude from this finding that the human brain anticipates forthcoming sensory input, allowing predictable visual stimuli to be processed with less neural activation at early stages of cortical processing.
Journal of Vision | 2013
Thomas A. Carlson; David A. Tovar; Arjen Alink; Nikolaus Kriegeskorte
Human object recognition is remarkably efficient. In recent years, significant advances have been made in our understanding of how the brain represents visual objects and organizes them into categories. Recent studies using pattern-analysis methods have characterized a representational space of objects in human and primate inferior temporal cortex in which object exemplars are discriminable and cluster according to category (e.g., faces and bodies). In the present study, we examined how category structure in object representations emerges in the first 1000 ms of visual processing. Participants viewed 24 object exemplars with a planned categorical structure comprising four levels, ranging from highly specific (individual exemplars) to highly abstract (animate vs. inanimate), while their brain activity was recorded with magnetoencephalography (MEG). We used a sliding-time-window decoding approach to decode, on a moment-to-moment basis, which exemplar participants were viewing and that exemplar's category. We found that exemplar and category membership could be decoded from the neuromagnetic recordings shortly after stimulus onset (<100 ms), with peak decodability following thereafter. Latencies for peak decodability varied systematically with the level of category abstraction, with more abstract categories emerging later, indicating that the brain constructs category representations hierarchically. In addition, we examined the stationarity of the activity patterns that encode object category information and show that these patterns vary over time, suggesting that the brain might use flexible, time-varying codes to represent visual object categories.
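The sliding-time-window decoding approach described above can be illustrated with a minimal sketch: train and cross-validate a classifier separately at each timepoint of simulated sensor data. All names, sizes, and the injected signal are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated MEG data: trials x sensors x timepoints, with a class
# difference injected after a simulated "stimulus onset" (timepoint 50).
n_trials, n_sensors, n_times = 120, 30, 100
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)           # two stimulus categories
X[y == 1, :, 50:] += 0.5                        # signal emerges at t = 50

# Sliding-window decoding: cross-validate a classifier at each timepoint.
accuracy = np.empty(n_times)
for t in range(n_times):
    accuracy[t] = cross_val_score(
        LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5
    ).mean()

# Decodability is near chance before onset and above chance after it.
print(accuracy[:50].mean(), accuracy[50:].mean())
```

Plotting `accuracy` against time yields the kind of decodability time course from which onset and peak latencies are read off.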
The Journal of Neuroscience | 2012
Bernhard P. Staresina; Richard N. Henson; Nikolaus Kriegeskorte; Arjen Alink
The essence of episodic memory is our ability to reexperience past events in great detail, even in the absence of external stimulus cues. Does the phenomenological reinstatement of past experiences go along with reinstating unique neural representations in the brain? And if so, how is this accomplished by the medial temporal lobe (MTL), a brain region intimately linked to episodic memory? Computational models suggest that such reinstatement (also termed “pattern completion”) in cortical regions is mediated by the hippocampus, a key region of the MTL. Although recent functional magnetic resonance imaging studies demonstrated reinstatement of coarse item properties like stimulus category or task context across different brain regions, it has not yet been shown whether reinstatement can be observed at the level of individual, discrete events—arguably the defining feature of episodic memory—nor whether MTL structures like the hippocampus support this “true episodic” reinstatement. Here we show that neural activity patterns for unique word-scene combinations encountered during encoding are reinstated in human parahippocampal cortex (PhC) during retrieval. Critically, this reinstatement occurs when word-scene combinations are successfully recollected (even though the original scene is not visually presented) and does not encompass other stimulus domains (such as word-color associations). Finally, the degree of PhC reinstatement across retrieval events correlated with hippocampal activity, consistent with a role of the hippocampus in coordinating pattern completion in cortical regions.
Proceedings of the National Academy of Sciences of the United States of America | 2013
Bernhard P. Staresina; Arjen Alink; Nikolaus Kriegeskorte; Richard N. Henson
Significance: How is new information converted into a memory trace? Here, we used functional neuroimaging to assess what happens to representations of new events after we first experience them. We found that a particular part of the medial temporal lobe, a brain region known to be critical for intact memory, spontaneously reactivates these events even when we are engaged in unrelated activities. Indeed, the extent to which such automatic reactivation occurs seems directly related to later memory performance. This finding shows that we can now study the dynamics of memory processes for specific experiences during the “offline” periods that follow the initial learning phase.

How are new experiences transformed into memories? Recent findings have shown that activation in brain regions involved in the initial task performance reemerges during postlearning rest, suggesting that “offline activity” might be important for this transformation. It is unclear, however, whether such offline activity indeed reflects reactivation of individual learning experiences, whether the amount of event-specific reactivation is directly related to later memory performance, and what brain regions support such event-specific reactivation. Here, we used functional magnetic resonance imaging to assess whether event-specific reactivation occurs spontaneously during an active, postlearning delay period in the human brain. Applying representational similarity analysis, we found that successful recall of individual study events was predicted by the degree of their endogenous reactivation during the delay period. Within the medial temporal lobe, this reactivation was observed in the entorhinal cortex. Beyond the medial temporal lobe, event-specific reactivation was found in the retrosplenial cortex.
Controlling for the levels of blood oxygen level-dependent activation and the serial position during encoding, the data suggest that offline reactivation might be a key mechanism for bolstering episodic memory beyond initial study processes. These results open a unique avenue for the systematic investigation of reactivation and consolidation of episodic memories in humans.
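The representational-similarity logic behind "event-specific reactivation" can be sketched as follows: correlate each event's encoding-phase pattern with each delay-period pattern, and compare same-event (diagonal) to different-event (off-diagonal) similarity. Sizes, the reinstatement strength, and the simulated data are illustrative assumptions, not the study's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy representational-similarity sketch: does the delay-period pattern
# for each study event resemble that event's own encoding pattern more
# than other events' patterns?
n_events, n_voxels = 20, 200
encoding = rng.normal(size=(n_events, n_voxels))
# Delay-period activity = noisy reinstatement of each encoding pattern.
delay = 0.4 * encoding + rng.normal(size=(n_events, n_voxels))

# Pearson correlation between every encoding and every delay pattern.
enc_z = (encoding - encoding.mean(1, keepdims=True)) / encoding.std(1, keepdims=True)
del_z = (delay - delay.mean(1, keepdims=True)) / delay.std(1, keepdims=True)
similarity = enc_z @ del_z.T / n_voxels         # events x events

# Event-specific reactivation: same-event vs different-event similarity.
reactivation = np.diag(similarity).mean()
baseline = similarity[~np.eye(n_events, dtype=bool)].mean()
print(reactivation, baseline)
```

A same-event advantage over the off-diagonal baseline is the signature of event-specific (rather than merely category-level) reinstatement.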
The Journal of Neuroscience | 2008
Arjen Alink; Wolf Singer; Lars Muckli
The brain is capable of integrating motion information arising from visual and auditory input. Such cross-modal integration can help to stabilize the motion percept. However, if motion information differs between sensory modalities, it can also result in an illusory auditory motion percept. This phenomenon is referred to as the cross-modal dynamic capture (CDC) illusion. We used functional magnetic resonance imaging to investigate whether early visual and auditory motion areas are involved in the generation of this illusion. Among the trials containing conflicting audiovisual motion, we compared the trials in which CDC occurred to those in which it did not and used a region of interest approach to see whether the auditory motion complex (AMC) and the visual motion area hMT/V5+ were affected by this illusion. Our results show that CDC reduces activation in bilateral auditory motion areas while increasing activity in bilateral hMT/V5+. Interestingly, our data show that the CDC illusion is preceded by an enhanced activation that is most dominantly present in the ventral intraparietal sulcus. Moreover, we assessed the effect of motion coherency, which was found to enhance activation in bilateral hMT/V5+ as well as in an area adjacent to the right AMC. Together, our results show that audiovisual integration occurs in early motion areas. Furthermore, it seems that the cognitive state of subjects before stimulus onset plays an important role in the generation of multisensory illusions.
NeuroImage | 2016
Alexander Walther; Hamed Nili; Naveed Ejaz; Arjen Alink; Nikolaus Kriegeskorte; Jörn Diedrichsen
Representational similarity analysis of activation patterns has become an increasingly important tool for studying brain representations. The dissimilarity between two patterns is commonly quantified by the correlation distance or the accuracy of a linear classifier. However, there are many different ways to measure pattern dissimilarity and little is known about their relative reliability. Here, we compare the reliability of three classes of dissimilarity measure: classification accuracy, Euclidean/Mahalanobis distance, and Pearson correlation distance. Using simulations and four real functional magnetic resonance imaging (fMRI) datasets, we demonstrate that continuous dissimilarity measures are substantially more reliable than classification accuracy. The difference in reliability can be explained by two characteristics of classifiers: discretization and susceptibility of the discriminant function to shifts of the pattern ensemble between imaging runs. Reliability can be further improved for all measures through multivariate noise normalization. Finally, unlike conventional distance measures, cross-validated distances provide unbiased estimates of pattern dissimilarity on a ratio scale, thus providing an interpretable zero point. Overall, our results indicate that the cross-validated Mahalanobis distance is preferable to both classification accuracy and correlation distance for characterizing representational geometries.
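The key property of a cross-validated distance, its interpretable zero point, can be shown with a minimal sketch: compute the distance between condition patterns on independent runs, so noise does not inflate the estimate. Sizes and signal levels are illustrative assumptions; with whitened (noise-normalized) patterns this cross-validated Euclidean distance corresponds to a cross-validated Mahalanobis distance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two conditions with a small true pattern difference, measured with
# independent noise in each of several imaging runs (sizes illustrative).
n_runs, n_voxels = 8, 50
signal_a = rng.normal(size=n_voxels)
signal_b = signal_a + 0.3 * rng.normal(size=n_voxels)
a = signal_a + 0.5 * rng.normal(size=(n_runs, n_voxels))
b = signal_b + 0.5 * rng.normal(size=(n_runs, n_voxels))

def crossnobis(a, b):
    """Leave-one-run-out cross-validated squared distance per voxel.

    Multiplying pattern differences from independent folds means noise
    averages out instead of adding a positive bias."""
    deltas = a - b                               # runs x voxels
    n = len(deltas)
    d = 0.0
    for k in range(n):
        train = np.delete(deltas, k, axis=0).mean(axis=0)
        d += train @ deltas[k]                   # independent folds
    return d / (n * deltas.shape[1])

d_ab = crossnobis(a, b)                          # positive: true difference
d_aa = crossnobis(a, a.copy())                   # exactly zero: no difference
print(d_ab, d_aa)
```

A non-cross-validated squared distance between noisy run means would instead be strictly positive even for identical conditions, which is the bias the paper describes.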
Frontiers in Psychology | 2013
Arjen Alink; Alexandra Krugliak; Alexander Walther; Nikolaus Kriegeskorte
The orientation of a large grating can be decoded from V1 functional magnetic resonance imaging (fMRI) data, even at low resolution (3-mm isotropic voxels). This finding has suggested that columnar-level neuronal information might be accessible to fMRI at 3T. However, orientation decodability might alternatively arise from global orientation-preference maps. Such global maps across V1 could result from bottom-up processing, if the preferences of V1 neurons were biased toward particular orientations (e.g., radial from fixation, or cardinal, i.e., vertical or horizontal). Global maps could also arise from local recurrent or top-down processing, reflecting pre-attentive perceptual grouping, attention spreading, or predictive coding of global form. Here we investigate whether fMRI orientation decoding with 2-mm voxels requires (a) globally coherent orientation stimuli and/or (b) global-scale patterns of V1 activity. We used opposite-orientation gratings (balanced about the cardinal orientations) and spirals (balanced about the radial orientation), along with novel patch-swapped variants of these stimuli. The two stimuli of a patch-swapped pair have opposite orientations everywhere (like their globally coherent parent stimuli). However, the two stimuli appear globally similar, each being a patchwork of opposite orientations. We find that all stimulus pairs are robustly decodable, demonstrating that fMRI orientation decoding does not require globally coherent orientation stimuli. Furthermore, decoding remained robust after spatial high-pass filtering for all stimuli, showing that fine-grained components of the fMRI patterns reflect visual orientations. Consistent with previous studies, we found evidence for global radial and vertical preference maps in V1.
However, these were weak or absent for patch-swapped stimuli, suggesting that global preference maps depend on globally coherent orientations and might arise through recurrent or top-down processes related to the perception of global form.
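The spatial high-pass filtering step mentioned above, removing coarse-scale pattern components before decoding, can be sketched by subtracting a Gaussian-smoothed copy of each trial pattern. The data, sizes, and smoothing kernel here are illustrative assumptions, not the study's actual preprocessing.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Toy check that decoding can rest on fine-grained pattern components:
# build 2-D "voxel" patterns whose class difference is fine-scale,
# then high-pass filter by subtracting a Gaussian-smoothed copy.
n_trials, size = 80, 16
fine_signal = rng.normal(size=(size, size))     # fine-grained class map
X = rng.normal(size=(n_trials, size, size))
y = rng.integers(0, 2, size=n_trials)
X[y == 1] += 0.4 * fine_signal

def spatial_highpass(patterns, sigma=3.0):
    """Remove coarse spatial structure from each trial pattern."""
    smooth = np.stack([gaussian_filter(p, sigma) for p in patterns])
    return patterns - smooth

X_hp = spatial_highpass(X).reshape(n_trials, -1)
acc = cross_val_score(LogisticRegression(max_iter=1000), X_hp, y, cv=5).mean()
print(acc)
```

If decoding survives this filtering, as it did for all stimulus pairs in the study, the discriminative information cannot reside only in coarse-scale (global-map) components.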
Cerebral Cortex | 2012
Lucia Melloni; Sara van Leeuwen; Arjen Alink; Notger G. Müller
Whether an object captures our attention depends on its bottom-up salience, that is, how different it is compared with its neighbors, and top-down control, that is, our current inner goals. At which neuronal stage they interact to guide behavior is still unknown. In a functional magnetic resonance imaging study, we found evidence for a hierarchy of saliency maps in human early visual cortex (V1 to hV4) and identified where bottom-up saliency interacts with top-down control: V1 represented pure bottom-up signals, V2 was only responsive to top-down modulations, and in hV4 bottom-up saliency and top-down control converged. Two distinct cerebral networks exerted top-down control: distractor suppression engaged the left intraparietal sulcus, while target enhancement involved the frontal eye field and lateral occipital cortex. Hence, attentional selection is implemented in integrated maps in visual cortex, which provide precise topographic information about target and distractor locations, thus allowing for successful visual search.
Human Brain Mapping | 2012
Arjen Alink; Felix Euler; Nikolaus Kriegeskorte; Wolf Singer; Axel Kohler
The aim of this functional magnetic resonance imaging (fMRI) study was to identify human brain areas that are sensitive to the direction of auditory motion. Such directional sensitivity was assessed in a hypothesis-free manner by analyzing fMRI response patterns across the entire brain volume using a spherical-searchlight approach. In addition, we assessed directional sensitivity in three predefined brain areas that have been associated with auditory motion perception in previous neuroimaging studies. These were the primary auditory cortex, the planum temporale and the visual motion complex (hMT/V5+). Our whole-brain analysis revealed that the direction of sound-source movement could be decoded from fMRI response patterns in the right auditory cortex and in a high-level visual area located in the right lateral occipital cortex. Our region-of-interest-based analysis showed that the decoding of the direction of auditory motion was most reliable with activation patterns of the left and right planum temporale. Auditory motion direction could not be decoded from activation patterns in hMT/V5+. These findings provide further evidence for the planum temporale playing a central role in supporting auditory motion perception. In addition, our findings suggest a cross-modal transfer of directional information to high-level visual cortex in healthy humans.
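The searchlight logic, decoding from every local neighborhood of voxels and mapping where accuracy exceeds chance, can be sketched in one dimension for clarity. The data, neighborhood radius, and informative region are illustrative assumptions; a real spherical searchlight iterates over 3-D spheres in the brain volume.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Minimal searchlight sketch on a 1-D "volume": decode the two motion
# directions from every local neighborhood of voxels.
n_trials, n_voxels, radius = 60, 40, 3
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 2, size=n_trials)            # motion direction label
X[y == 1, 10:16] += 0.8                          # informative region

searchlight_acc = np.zeros(n_voxels)
for center in range(n_voxels):
    lo, hi = max(0, center - radius), min(n_voxels, center + radius + 1)
    searchlight_acc[center] = cross_val_score(
        LinearDiscriminantAnalysis(), X[:, lo:hi], y, cv=5
    ).mean()

# Accuracy peaks where the directional information lives.
print(searchlight_acc.argmax())
```

Mapping `searchlight_acc` back onto voxel positions yields the kind of whole-volume information map from which direction-sensitive regions are identified.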
PLOS ONE | 2007
Grit Hein; Arjen Alink; Andreas Kleinschmidt; Notger G. Müller
Why is it hard to divide attention between dissimilar activities, such as reading and listening to a conversation? We used functional magnetic resonance imaging (fMRI) to study interference between simple auditory and visual decisions, independently of motor competition. Overlapping activity for auditory and visual tasks performed in isolation was found in lateral prefrontal regions, middle temporal cortex and parietal cortex. When the visual stimulus occurred during processing of the tone, the activation it evoked in prefrontal and middle temporal cortex was suppressed. Additionally, reduced activity was seen in modality-specific visual cortex. These results paralleled impaired awareness of the visual event. Even without competing motor responses, a simple auditory decision interferes with visual processing on different neural levels, including prefrontal cortex, middle temporal cortex and visual regions.