
Publications


Featured research published by Daniel Sanabria.


Brain Research | 2006

Selective temporal attention enhances the temporal resolution of visual perception: Evidence from a temporal order judgment task

Ángel Correa; Daniel Sanabria; Charles Spence; Pío Tudela; Juan Lupiáñez

We investigated whether attending to a particular point in time affects temporal resolution in a task in which participants judged which of two visual stimuli had been presented first. The results showed that temporal resolution can be improved by attending to the relevant moment as indicated by the temporal cue. This novel finding is discussed in terms of the differential effects of spatial and temporal attention on temporal resolution.


Chemical Senses | 2008

Olfactory Discrimination: When Vision Matters?

M. Luisa Demattè; Daniel Sanabria; Charles Spence

Many previous studies have attempted to investigate the effect of visual cues on olfactory perception in humans. The majority of this research has only looked at the modulatory effect of color, which has typically been explained in terms of multisensory perceptual interactions. However, such crossmodal effects may equally well relate to interactions taking place at a higher level of information processing. In fact, it is well-known that semantic knowledge can have a substantial effect on people's olfactory perception. In the present study, we therefore investigated the influence of visual cues, consisting of color patches and/or shapes, on people's olfactory discrimination performance. Participants had to make speeded odor discrimination responses (lemon vs. strawberry) while viewing a red or yellow color patch, an outline drawing of a strawberry or lemon, or a combination of these color and shape cues. Even though participants were instructed to ignore the visual stimuli, our results demonstrate that the accuracy of their odor discrimination responses was influenced by the visual distractors. This result shows that both color and shape information are taken into account during speeded olfactory discrimination, even when such information is completely task irrelevant, hinting at the automaticity of such higher-level visual-olfactory crossmodal interactions.


PLOS ONE | 2013

Cognitive performance and heart rate variability: the influence of fitness level.

Antonio Luque-Casado; Mikel Zabala; Esther Morales; Manuel Mateo-March; Daniel Sanabria

In the present study, we investigated the relation between cognitive performance and heart rate variability as a function of fitness level. We measured the effect of three cognitive tasks (the psychomotor vigilance task, a temporal orienting task, and a duration discrimination task) on the heart rate variability of two groups of participants: a high-fit group and a low-fit group. Two major novel findings emerged from this study. First, the lowest values of heart rate variability were found during performance of the duration discrimination task, compared to the other two tasks. Second, the results showed a decrement in heart rate variability as a function of time on task, although only in the low-fit group. Moreover, the high-fit group showed overall faster reaction times than the low-fit group in the psychomotor vigilance task, while there were no significant differences in performance between the two groups of participants in the other two cognitive tasks. In sum, our results highlighted the influence of cognitive processing on heart rate variability. Importantly, both behavioral and physiological results suggested that the main benefit obtained as a result of fitness level appeared to be associated with processes involving sustained attention.


Neuroscience Letters | 2005

Intramodal perceptual grouping modulates multisensory integration: Evidence from the crossmodal dynamic capture task

Daniel Sanabria; Salvador Soto-Faraco; Jason S. Chan; Charles Spence

We investigated the extent to which intramodal visual perceptual grouping influences the multisensory integration (or grouping) of auditory and visual motion information. Participants discriminated the direction of motion of two sequentially presented sounds (moving leftward or rightward), while simultaneously trying to ignore a task-irrelevant visual apparent motion stream. The principles of perceptual grouping were used to vary the direction and extent of apparent motion within the irrelevant modality (vision). The results demonstrate that the multisensory integration of motion information can be modulated by the perceptual grouping taking place unimodally within vision, suggesting that unimodal perceptual grouping processes precede multisensory integration. The present study therefore illustrates how intramodal and crossmodal perceptual grouping processes interact to determine how the information in complex multisensory environments is parsed.


Experimental Brain Research | 2005

Spatiotemporal interactions between audition and touch depend on hand posture

Daniel Sanabria; Salvador Soto-Faraco; Charles Spence

We report two experiments designed to assess the consequences of posture change on audiotactile spatiotemporal interactions. In Experiment 1, participants had to discriminate the direction of an auditory stream (consisting of the sequential presentation of two tones from different spatial positions) while attempting to ignore a task-irrelevant tactile stream (consisting of the sequential presentation of two vibrations, one to each of the participant’s hands). The tactile stream presented to the participants’ hands was either spatiotemporally congruent or incongruent with respect to the sounds. A significant decrease in performance in incongruent trials compared with congruent trials was demonstrated when the participants adopted an uncrossed-hands posture but not when their hands were crossed over the midline. In Experiment 2, we investigated the ability of participants to discriminate the direction of two sequentially presented tactile stimuli (one presented to each hand) as a function of the presence of congruent vs incongruent auditory distractors. Here, the crossmodal effect was stronger in the crossed-hands posture than in the uncrossed-hands posture. These results demonstrate the reciprocal nature of audiotactile interactions in spatiotemporal processing, and highlight the important role played by body posture in modulating such crossmodal interactions.


Experimental Brain Research | 2007

Comparing intramodal and crossmodal cuing in the endogenous orienting of spatial attention

Ana B. Chica; Daniel Sanabria; Juan Lupiáñez; Charles Spence

The endogenous orienting of spatial attention has been studied with both informative central cues and informative peripheral cues. Studies using central cues are difficult to compare with studies that have used uninformative peripheral cues due to the differences in stimulus presentation. Moreover, informative peripheral cues attract both endogenous and exogenous attention, thus making it difficult to disentangle the contribution of each process to any behavioural results observed. In the present study, we used an informative peripheral cue (either tactile or visual) that predicted that the target would appear (in different blocks of trials) on either the same or the opposite side as the cue. By using this manipulation, both expected and unexpected trials could either be exogenously cued or uncued, thus making it possible to isolate expectancy effects from cuing effects. Our aim was to compare the endogenous orienting of spatial attention to tactile (Experiment 1) and to visual targets (Experiment 2) under conditions of intramodal and crossmodal spatial cuing. The results suggested that the endogenous orienting of spatial attention should not be considered a purely supramodal phenomenon, given that significantly larger expectancy effects were observed in the intramodal cuing conditions than in the crossmodal cuing conditions in both experiments.


Biological Psychology | 2016

Heart rate variability and cognitive processing: The autonomic response to task demands

Antonio Luque-Casado; José C. Perales; David Cárdenas; Daniel Sanabria

This study investigated variations in heart rate variability (HRV) as a function of cognitive demands. Participants completed an execution condition including the psychomotor vigilance task, a working memory task and a duration discrimination task. The control condition consisted of oddball versions (participants had to detect the rare event) of the tasks from the execution condition, designed to control for the effect of the task parameters (stimulus duration and stimulus rate) on HRV. The NASA-TLX questionnaire was used as a subjective measure of cognitive workload across tasks and conditions. Three major findings emerged from this study. First, HRV varied as a function of task demands (with the lowest values in the working memory task). Second, and crucially, we found similar HRV values when comparing each of the tasks with its oddball control equivalent, and a significant decrement in HRV as a function of time on task. Finally, the NASA-TLX results showed larger cognitive workload in the execution condition than in the oddball control condition, and score variations as a function of task. Taken together, our results suggest that HRV is highly sensitive to the overall demands of sustained attention, over and above the influence of the other cognitive processes suggested by previous literature. In addition, our study highlights a potential dissociation between objective and subjective measures of mental workload, with important implications in applied settings.


Neuropsychologia | 2013

Temporal orienting of attention is interfered by concurrent working memory updating.

Mariagrazia Capizzi; Ángel Correa; Daniel Sanabria

A previous dual-task study (Capizzi, Sanabria, & Correa, 2012) showed that temporal orienting of attention was disrupted by performing a concurrent working memory task, while sequential effects were preserved. Here, we recorded event-related potentials (ERPs) during single- and dual-task performance to investigate how this behavioural dissociation would be expressed in neural activity measures. The single-task condition required participants to respond to a visual target stimulus that could be anticipated on the basis of a highly predictive temporal cue. The dual-task condition introduced a concurrent working memory task, in which colour information had to be updated on every trial. The behavioural results replicated our previous findings of impaired temporal orienting, but preserved sequential effects, under dual-task relative to single-task conditions. The ERP results showed that temporal orienting and sequential effects both modulated the cue-locked preparatory contingent negative variation (CNV) and the target-locked N2 amplitude and P3 latency under single-task, but not under dual-task, conditions. In contrast to temporal orienting, sequential effects were also observed at the early target-locked P1 and N1 potentials. Crucially, only the P1 modulation survived dual-task interference. These findings provide novel electrophysiological evidence that performance of a concurrent working memory task may interfere in a selective way with neural activity specifically linked to temporal orienting of attention.


Experimental Brain Research | 2004

Bouncing or streaming? Exploring the influence of auditory cues on the interpretation of ambiguous visual motion.

Daniel Sanabria; Ángel Correa; Juan Lupiáñez; Charles Spence

When looking at two identical objects moving toward each other on a two-dimensional visual display, two different events can be perceived: the objects can either be seen to bounce off each other, or else to stream through one another. Previous research has shown that the large bias normally seen toward the streaming percept can be modulated by the presentation of an auditory event at the moment of coincidence. However, previous behavioral research on this crossmodal effect has always relied on subjective report. In the present experiment, we used a novel experimental design to provide a more objective/implicit measure of the effect of an auditory cue on visual motion perception. In our study, two disks moved toward each other, with the point of coincidence hidden behind an occluder. When emerging from behind the occluder, the disks (one red, the other blue) could either follow the same trajectory (streaming) or else move in the opposite direction (bouncing). Participants made speeded discrimination responses regarding the side from which one of the disks emerged from behind the occluder. Participants responded more rapidly on streaming trials when no sound was presented and on bouncing trials when the sound was presented at the moment of coincidence. These results provide the first empirical demonstration of the auditory modulation of an ambiguous visual motion display using an implicit/objective behavioral measure of perception.


Cognitive, Affective, & Behavioral Neuroscience | 2004

When does visual perceptual grouping affect multisensory integration?

Daniel Sanabria; Salvador Soto-Faraco; Jason S. Chan; Charles Spence

Several studies have shown that the direction in which a visual apparent motion stream moves can influence the perceived direction of an auditory apparent motion stream (an effect known as crossmodal dynamic capture). However, little is known about the role that intramodal perceptual grouping processes play in the multisensory integration of motion information. The present study was designed to investigate the time course of any modulation of the cross-modal dynamic capture effect by the nature of the perceptual grouping taking place within vision. Participants were required to judge the direction of an auditory apparent motion stream while trying to ignore visual apparent motion streams presented in a variety of different configurations. Our results demonstrate that the cross-modal dynamic capture effect was influenced more by visual perceptual grouping when the conditions for intramodal perceptual grouping were set up prior to the presentation of the audiovisual apparent motion stimuli. However, no such modulation occurred when the visual perceptual grouping manipulation was established at the same time as or after the presentation of the audiovisual stimuli. These results highlight the importance of the unimodal perceptual organization of sensory information to the manifestation of multisensory integration.

Collaboration


Dive into Daniel Sanabria's collaborations.

Top Co-Authors

Florentino Huertas

Universidad Católica de Valencia San Vicente Mártir
