Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Pawel J. Matusz is active.

Publication


Featured research published by Pawel J. Matusz.


Neuropsychologia | 2016

The multisensory function of the human primary visual cortex

Micah M. Murray; Antonia Thelen; Gregor Thut; Vincenzo Romei; Roberto Martuzzi; Pawel J. Matusz

It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient hard evidence supporting this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERP/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex.


Cognition | 2015

Multi-modal distraction: Insights from children's limited attention

Pawel J. Matusz; Hannah Broadbent; Jessica Ferrari; Benjamin Forrest; Rebecca Merkley; Gaia Scerif

How does the multi-sensory nature of stimuli influence information processing? Cognitive systems with limited selective attention can elucidate these processes. Six-year-olds, 11-year-olds and 20-year-olds engaged in a visual search task that required them to detect a pre-defined coloured shape under conditions of low or high visual perceptual load. On each trial, a peripheral distractor that could be either compatible or incompatible with the current target colour was presented either visually, auditorily or audiovisually. Unlike unimodal distractors, audiovisual distractors elicited reliable compatibility effects across the two levels of load in adults and in the older children, but high visual load significantly reduced distraction for all children, especially the youngest participants. This study provides the first demonstration that multi-sensory distraction has powerful effects on selective attention: Adults and older children alike allocate attention to potentially relevant information across multiple senses. However, poorer attentional resources can, paradoxically, shield the youngest children from the deleterious effects of multi-sensory distraction. Furthermore, we highlight how developmental research can enrich the understanding of distinct mechanisms controlling adult selective attention in multi-sensory environments.
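
For context (the abstract itself does not spell this out), the compatibility effect in distraction paradigms of this kind is conventionally quantified as the reaction-time cost incurred when the distractor is incompatible with the current target, assuming mean correct reaction times per condition:

$$\text{compatibility effect} = \overline{RT}_{\text{incompatible}} - \overline{RT}_{\text{compatible}}$$

A larger difference indicates stronger distractor interference; a value near zero under high perceptual load corresponds to the reduced distraction the abstract describes.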


Frontiers in Integrative Neuroscience | 2015

Top-down control and early multisensory processes: chicken vs. egg

Rosanna De Meo; Micah M. Murray; Stephanie Clarke; Pawel J. Matusz

Traditional views contend that behaviorally-relevant multisensory interactions occur relatively late during stimulus processing and subsequent to influences of (top-down) attentional control. In contrast, work from the last 15 years shows that information from different senses is integrated in the brain also during the initial 100 ms after stimulus onset and within low-level cortices. Critically, many of these early-latency multisensory interactions (hereafter eMSI) directly impact behavior. The prevalence of eMSI substantially advances our understanding of how unified perception and goal-related behavior emerge. However, it also raises important questions about the dependency of the eMSI on top-down, goal-based attentional control mechanisms that bias information processing toward task-relevant objects (hereafter top-down control). To date, this dependency remains controversial, because eMSI can occur independently of top-down control, making it plausible for (some) multisensory processes to directly shape perception and behavior. In other words, top-down control is not necessary for these early effects to occur and to link them with perception (see Figure 1A). This issue epitomizes the fundamental question regarding direct links between sensation, perception, and behavior (direct perception), and also extends it in a crucial way to incorporate the multisensory nature of everyday experience. At the same time, the emerging framework must strive to also incorporate the variety of higher-order control mechanisms that likely influence multisensory stimulus responses but which are not based on task-relevance. This article presents a critical perspective on the importance of top-down control for eMSI: in other words, who is controlling whom?

Figure 1(A): Depiction of the ways in which top-down attentional control and bottom-up multisensory processes may influence direct perception in multisensory contexts.


Current Biology | 2014

Multisensory context portends object memory

Antonia Thelen; Pawel J. Matusz; Micah M. Murray

Multisensory processes facilitate perception of currently-presented stimuli and can likewise enhance later object recognition. Memories for objects originally encountered in a multisensory context can be more robust than those for objects encountered in an exclusively visual or auditory context [1], overturning the assumption that memory performance is best when encoding and recognition contexts remain constant [2]. Here, we used event-related potentials (ERPs) to provide the first evidence for direct links between multisensory brain activity at one point in time and subsequent object discrimination abilities. Across two experiments we found that individuals showing a benefit and those impaired during later object discrimination could be predicted by their brain responses to multisensory stimuli upon their initial encounter. These effects were observed despite the multisensory information being meaningless, task-irrelevant, and presented only once. We provide critical insights into the advantages associated with multisensory interactions; they are not limited to the processing of current stimuli, but likewise encompass the ability to determine the benefit of one's memories for object recognition in later, unisensory contexts.


Psychophysiology | 2013

Top-down control of audiovisual search by bimodal search templates

Pawel J. Matusz; Martin Eimer

To test whether the attentional selection of targets defined by a combination of visual and auditory features is guided in a modality-specific fashion or by control processes that are integrated across modalities, we measured attentional capture by visual stimuli during unimodal visual and audiovisual search. Search arrays were preceded by spatially uninformative visual singleton cues that matched the current target-defining visual feature. Participants searched for targets defined by a visual feature, or by a combination of visual and auditory features (e.g., red targets accompanied by high-pitch tones). Spatial cueing effects indicative of attentional capture were reduced during audiovisual search, and cue-triggered N2pc components were attenuated and delayed. This reduction of cue-induced attentional capture effects during audiovisual search provides new evidence for the multimodal control of selective attention.


European Journal of Neuroscience | 2015

The role of auditory cortices in the retrieval of single-trial auditory-visual object memories

Pawel J. Matusz; Antonia Thelen; Sarah Amrein; Eveline Geiser; Jacques Anken; Micah M. Murray

Single‐trial encounters with multisensory stimuli affect both memory performance and early‐latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single‐trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event‐related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d’) were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single‐trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long‐term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time.
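
As a point of reference (not part of the abstract itself), the sensitivity index d′ used above is conventionally derived from the hit rate H (repeated sounds correctly judged "old") and the false-alarm rate FA (initial sounds incorrectly judged "old") via the inverse of the standard normal cumulative distribution function, $\Phi^{-1}$:

$$d' = \Phi^{-1}(H) - \Phi^{-1}(FA)$$

Higher d′ for sounds initially paired with a semantically congruent image thus reflects better old/new discrimination, independent of response bias.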


Experimental Brain Research | 2016

The COGs (context, object, and goals) in multisensory processing

Sanne ten Oever; Vincenzo Romei; Nienke van Atteveldt; Salvador Soto-Faraco; Micah M. Murray; Pawel J. Matusz

Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have been traditionally studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.


Current Biology | 2017

The Dual Nature of Early-Life Experience on Somatosensory Processing in the Human Infant Brain

Nathalie L. Maitre; Alexandra P. Key; Olena Chorna; James C. Slaughter; Pawel J. Matusz; Mark T. Wallace; Micah M. Murray

Every year, 15 million preterm infants are born, and most spend their first weeks in neonatal intensive care units (NICUs) [1]. Although essential for the support and survival of these infants, NICU sensory environments are dramatically different from those in which full-term infants mature and thus likely impact the development of functional brain organization [2]. Yet the integrity of sensory systems determines effective perception and behavior [3, 4]. In neonates, touch is a cornerstone of interpersonal interactions and sensory-cognitive development [5-7]. NICU treatments used to improve neurodevelopmental outcomes rely heavily on touch [8]. However, we understand little of how brain maturation at birth (i.e., prematurity) and quality of early-life experiences (e.g., supportive versus painful touch) interact to shape the development of the somatosensory system [9]. Here, we identified the spatial, temporal, and amplitude characteristics of cortical responses to light touch that differentiate them from sham stimuli in full-term infants. We then utilized this data-driven analytical framework to show that the degree of prematurity at birth determines the extent to which brain responses to light touch (but not sham) are attenuated at the time of discharge from the hospital. Building on these results, we showed that, when controlling for prematurity and analgesics, supportive experiences (e.g., breastfeeding, skin-to-skin care) are associated with stronger brain responses, whereas painful experiences (e.g., skin punctures, tube insertions) are associated with reduced brain responses to the same touch stimuli. Our results provide crucial insights into the mechanisms through which common early perinatal experiences may shape the somatosensory scaffolding of later perceptual, cognitive, and social development.


Human Brain Mapping | 2016

Contextual Factors Multiplex to Control Multisensory Processes

Beatriz R. Sarmiento; Pawel J. Matusz; Daniel Sanabria; Micah M. Murray

This study analyzed high‐density event‐related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task‐irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory‐visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross‐modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non‐linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top‐down attentional control that further modulates cross‐modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context‐based control over multisensory processing, whose influences multiplex across finer and broader time scales.


Current Biology | 2015

Neuroplasticity: Unexpected Consequences of Early Blindness

Micah M. Murray; Pawel J. Matusz; Amir Amedi

A pair of recent studies shows that congenital blindness can have significant consequences for the functioning of the visual system after sight restoration, particularly if that restoration is delayed.

Collaboration


Dive into Pawel J. Matusz's collaborations.

Top Co-Authors

Jakub Traczyk
University of Social Sciences and Humanities

Rebecca Merkley
University of Western Ontario

Roberto Martuzzi
École Polytechnique Fédérale de Lausanne