Network


Latest external collaboration at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Brianna Beck is active.

Publication


Featured research published by Brianna Beck.


Experimental Brain Research | 2016

Viewing the body modulates both pain sensations and pain responses.

Brianna Beck; Elisabetta Làdavas; Patrick Haggard

Viewing the body can influence pain perception, even when vision is non-informative about the noxious stimulus. Prior studies used either continuous pain rating scales or pain detection thresholds, which cannot distinguish whether viewing the body changes the discriminability of noxious heat intensities or merely shifts reported pain levels. In Experiment 1, participants discriminated two intensities of heat-pain stimulation. Noxious stimuli were delivered to the hand in darkness immediately after participants viewed either their own hand or a non-body object appearing in the same location. The visual condition varied randomly between trials. Discriminability of the noxious heat intensities (d′) was lower after viewing the hand than after viewing the object, indicating that viewing the hand reduced the information about stimulus intensity available within the nociceptive system. In Experiment 2, the hand and the object were presented in separate blocks of trials. Viewing the hand shifted perceived pain levels irrespective of actual stimulus intensity, biasing responses toward ‘high pain’ judgments. In Experiment 3, participants saw the noxious stimulus as it approached and touched their hand or the object. Seeing the pain-inducing event counteracted the reduction in discriminability found when viewing the hand alone. These findings show that viewing the body can affect both perceptual processing of pain and responses to pain, depending on the visual context. Many factors modulate pain; our study highlights the importance of distinguishing modulations of perceptual processing from modulations of response bias.
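
For reference, the discriminability index d′ reported here is conventionally computed from hit and false-alarm rates using standard signal detection theory; the formulas below are the usual definitions, offered as a sketch rather than the paper's exact analysis:

\[ d' = \Phi^{-1}(H) - \Phi^{-1}(F), \qquad c = -\tfrac{1}{2}\left[\Phi^{-1}(H) + \Phi^{-1}(F)\right] \]

where H is the proportion of high-intensity trials judged 'high pain', F the proportion of low-intensity trials judged 'high pain', \(\Phi^{-1}\) the inverse standard normal cumulative distribution function, and c the response criterion. A lower d′ after viewing the hand indicates reduced sensitivity to stimulus intensity, whereas a shift in c (as in Experiment 2) reflects a change in response bias rather than in sensitivity.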


Psychological Science | 2017

Choosing, Doing, and Controlling: Implicit Sense of Agency Over Somatosensory Events

Khatereh Borhani; Brianna Beck; Patrick Haggard

Sense of agency—a feeling of control over one’s actions and their outcomes—might include at least two components: free choice over which outcome to pursue and motoric control over the action causing the outcome. We orthogonally manipulated locus of outcome choice (free or instructed choice) and motoric control (active or passive movement), while measuring the perceived temporal attraction between actions and outcomes (temporal binding) as an implicit marker of agency. Participants also rated stimulus intensity so that we could measure sensory attenuation, another possible implicit marker of agency. Actions caused higher or lower levels of either painful heat or mild electrotactile stimulation. We found that both motoric control and outcome choice contributed to outcome binding. Moreover, free choice, relative to instructed choice, attenuated the perceived magnitude of high-intensity outcomes, but only when participants made an active movement. Thus, choosing, not just doing, influences temporal binding and sensory attenuation, though in different ways. Our results show that these implicit measures of agency are sensitive to both voluntary motor commands and instrumental control over action outcomes.


PLOS ONE | 2015

The Enfacement Illusion Is Not Affected by Negative Facial Expressions

Brianna Beck; Flavia Cardini; Elisabetta Làdavas; Caterina Bertini

Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.


Cortex | 2015

Dissociable routes for personal and interpersonal visual enhancement of touch

Brianna Beck; Caterina Bertini; Patrick Haggard; Elisabetta Làdavas

Seeing a hand can enhance tactile acuity on the hand, even when tactile stimulation is not visible. This visual enhancement of touch (VET) occurs both when participants see their own hand (personal VET) and when they see another person's hand (interpersonal VET). Interpersonal VET occurs irrespective of where the viewed hand appears, while personal VET is eliminated when visual and proprioceptive signals about the location of one's own hand are incongruent. This suggests that the neural mechanisms for VET may differ according to ownership of the seen hand. We used continuous theta-burst transcranial magnetic stimulation (TMS) to disrupt either the human ventral intraparietal area (hVIP), which integrates tactile, proprioceptive, and visual information about one's own body, or the extrastriate body area (EBA), which processes visual body information irrespective of ownership. Participants then judged the orientation of tactile gratings applied to their hand while viewing images of their own hand, another person's hand, or a non-body object on a screen placed over their actual hand. Disrupting the hVIP attenuated personal VET but did not affect interpersonal VET, suggesting the hVIP is only involved in VET when one's own hand is seen. Disrupting the EBA reduced both personal and interpersonal VET, suggesting it is common to both routes.


PLOS ONE | 2013

Observed touch on a non-human face is not remapped onto the human observer's own face.

Brianna Beck; Caterina Bertini; Cristina Scarpazza; Elisabetta Làdavas

Visual remapping of touch (VRT) is a phenomenon in which seeing a human face being touched enhances detection of tactile stimuli on the observer's own face, especially when the observed face expresses fear. This study tested whether VRT would occur when seeing touch on monkey faces and whether it would be similarly modulated by facial expressions. Human participants detected near-threshold tactile stimulation on their own cheeks while watching fearful, happy, and neutral human or monkey faces being concurrently touched or merely approached by fingers. We predicted minimal VRT for neutral and happy monkey faces but greater VRT for fearful monkey faces. The results with human faces replicated previous findings, demonstrating stronger VRT for fearful expressions than for happy or neutral expressions. However, there was no VRT (i.e., no difference in accuracy between touch and no-touch trials) for any of the monkey faces, regardless of facial expression, suggesting that touch on a non-human face is not remapped onto the somatosensory system of the human observer.


Social Cognitive and Affective Neuroscience | 2018

The social buffering of pain by affective touch: a laser-evoked potential study in romantic couples

Mariana von Mohr; Charlotte Krahé; Brianna Beck; Aikaterini Fotopoulou

Pain is modulated by social context. Recent neuroimaging studies have shown that romantic partners can provide a potent form of social support during pain. However, such studies have only focused on passive support, finding a relatively late-onset modulation of pain-related neural processing. In this study, we examined for the first time dynamic touch by one's romantic partner as an active form of social support. Specifically, 32 couples provided social, active, affective (vs. active but neutral) touch, matched to the properties of a specific C-tactile afferent pathway, to their romantic partners, who then received laser-induced pain. We measured subjective pain ratings and early N1 and later N2-P2 laser-evoked potentials (LEPs) to noxious stimulation, as well as individual differences in adult attachment style. We found that affective touch from one's partner reduces subjective pain ratings and similarly attenuates LEPs at both earlier (N1) and later (N2-P2) stages of cortical processing. Adult attachment style did not affect LEPs, but attachment anxiety had a moderating role on pain ratings. This is the first study to show early neural modulation of pain by active partner touch, and we discuss these findings in relation to the affective and social modulation of sensory salience.


Scientific Reports | 2017

Visual area V5/hMT+ contributes to perception of tactile motion direction: a TMS study

Tomohiro Amemiya; Brianna Beck; Vincent Walsh; Hiroaki Gomi; Patrick Haggard

Human imaging studies have reported activations associated with tactile motion perception in visual motion area V5/hMT+, primary somatosensory cortex (SI) and posterior parietal cortex (PPC; Brodmann areas 7/40). However, such studies cannot establish whether these areas are causally involved in tactile motion perception. We delivered double-pulse transcranial magnetic stimulation (TMS) while moving a single tactile point across the fingertip, and used signal detection theory to quantify perceptual sensitivity to motion direction. TMS over both SI and V5/hMT+, but not the PPC site, significantly reduced tactile direction discrimination. Our results show that V5/hMT+ plays a causal role in tactile direction processing, and strengthen the case for V5/hMT+ serving multimodal motion perception. Further, our findings are consistent with a serial model of cortical tactile processing, in which higher-order perceptual processing depends upon information received from SI. By contrast, our results do not provide clear evidence that the PPC site we targeted (Brodmann areas 7/40) contributes to tactile direction perception.


Cortex | 2017

Corrigendum to “Dissociable routes for personal and interpersonal visual enhancement of touch” [Cortex 73 (2015) 289–297]

Brianna Beck; Caterina Bertini; Patrick Haggard; Elisabetta Làdavas

A typographical error was found in the second paragraph of section 2.3, Procedure (p. 291). A sentence in that paragraph begins, “Next, cTBS (3 pulses at 5 Hz, repeated at 50 Hz intervals) was delivered over the target site...” This is incorrect. Instead, the sentence should state, “Next, cTBS (3 pulses at 50 Hz, repeated at 5 Hz intervals) was delivered over the target site...”. The authors would like to apologize for any inconvenience caused.


Seeing and Perceiving | 2012

Interpersonal multisensory stimulation and emotion: The impact of threat-indicative facial expressions on enfacement

Brianna Beck; Caterina Bertini; Elisabetta Làdavas

Prior studies have identified an ‘enfacement effect’ in which participants incorporate another’s face into their self-face representation after observing that face touched repeatedly in synchrony with touch on their own face (Sforza et al., 2010; Tsakiris, 2008). The degree of self-face/other-face merging is positively correlated with participants’ trait-level empathy scores (Sforza et al., 2010) and affects judgments of the other’s personality (Paladino et al., 2010), suggesting that enfacement also modulates higher-order representations of ‘self’ and ‘other’ involved in social and emotional evaluations. To test this hypothesis, we varied not only whether visuo-tactile stimulation was synchronous or asynchronous but also whether the person being touched in the video displayed an emotional expression indicative of threat, either fear or anger. We hypothesized that participants would incorporate the faces of fearful others more than the faces of angry others after a shared visuo-tactile experience because of a potentially stronger representation of the sight of fear in somatosensory cortices compared to the sight of anger (Cardini et al., 2012). Instead, we found that the enfacement effect (i.e., greater self-face/other-face merging following synchronous compared to asynchronous visuo-tactile stimulation) was abolished if the other person displayed fear but remained if they expressed anger. This nonetheless suggests that enfacement operates on an evaluative self-representation as well as a physical one because the effect changes with the emotional content of the other’s face. Further research into the neural mechanism behind the enfacement effect is needed to determine why sight of fear diminishes it rather than enhancing it.


Cognition | 2017

Having control over the external world increases the implicit sense of agency

Brianna Beck; Steven Di Costa; Patrick Haggard

Collaboration


Dive into Brianna Beck's collaboration.

Top Co-Authors


Patrick Haggard

University College London


Antonio Cataldo

University College London


James Critchlow

University College London


Lieke de Boer

University College London
