Publication


Featured research published by Bencie Woll.


Language | 1989

Sign Language: the study of deaf people and their language

Jim Kyle; Bencie Woll

Contents:
Acknowledgements
Introduction
1. The deaf community
2. British Sign Language
3. Historical aspects of BSL
4. Sign language acquisition
5. The building blocks of sign language
6. The structure of signs
7. Sign morphology and syntax: the grammar of BSL
8. Comparing sign languages
9. Learning and using BSL
10. The psychology of sign
11. Sign language interpreting
12. Sign language in schools
13. Which sign language?
14. Developments for sign language
Appendix
References
Subject index
Index of signs in the text
Index of names in the text


Trends in Cognitive Sciences | 2008

The signing brain: the neurobiology of sign language

Mairéad MacSweeney; Cheryl M. Capek; Ruth Campbell; Bencie Woll

Most of our knowledge about the neurobiological bases of language comes from studies of spoken languages. By studying signed languages, we can determine whether what we have learnt so far is characteristic of language per se or whether it is specific to languages that are spoken and heard. Overwhelmingly, lesion and neuroimaging studies indicate that the neural systems supporting signed and spoken language are very similar: both involve a predominantly left-lateralised perisylvian network. Recent studies have also highlighted processing differences between languages in these different modalities. These studies provide rich insights into language and communication processes in deaf and hearing people.


Neuroreport | 2000

Silent speechreading in the absence of scanner noise: an event-related fMRI study.

Mairéad MacSweeney; Edson Amaro; Gemma A. Calvert; Ruth Campbell; Anthony S. David; Philip McGuire; Steven Williams; Bencie Woll; Michael Brammer

In a previous study we used functional magnetic resonance imaging (fMRI) to demonstrate activation in auditory cortex during silent speechreading. Since image acquisition during fMRI generates acoustic noise, this pattern of activation could have reflected an interaction between background scanner noise and the visual lip-read stimuli. In this study we employed an event-related fMRI design which allowed us to measure activation during speechreading in the absence of acoustic scanner noise. In the experimental condition, hearing subjects were required to speechread random numbers from a silent speaker. In the control condition subjects watched a static image of the same speaker with mouth closed and were required to subvocally count an intermittent visual cue. A single volume of images was collected to coincide with the estimated peak of the blood oxygen level dependent (BOLD) response to these stimuli across multiple baseline and experimental trials. Silent speechreading led to greater activation in lateral temporal cortex relative to the control condition. This indicates that activation of auditory areas during silent speechreading is not a function of acoustic scanner noise and confirms that silent speechreading engages similar regions of auditory cortex to those engaged by listening to speech.


NeuroImage | 2004

Dissociating linguistic and nonlinguistic gestural communication in the brain

Mairéad MacSweeney; Ruth Campbell; Bencie Woll; Vincent Giampietro; Anthony S. David; Philip McGuire; Gemma A. Calvert; Michael Brammer

Gestures of the face, arms, and hands are components of signed languages used by Deaf people. Signaling codes, such as the racecourse betting code known as Tic Tac, are also made up of such gestures. Tic Tac lacks the phonological structure of British Sign Language (BSL) but is similar in terms of its visual and articulatory components. Using fMRI, we compared the neural correlates of viewing a gestural language (BSL) and a manual-brachial code (Tic Tac) relative to a low-level baseline task. We compared three groups: Deaf native signers, hearing native signers, and hearing nonsigners. None of the participants had any knowledge of Tic Tac. All three groups activated an extensive frontal-posterior network in response to both types of stimuli. Superior temporal cortex, including the planum temporale, was activated bilaterally in response to both types of gesture in all groups, irrespective of hearing status. The engagement of these traditionally auditory processing regions was greater in Deaf than hearing participants. These data suggest that the planum temporale may be responsive to visual movement in both deaf and hearing people, yet when hearing is absent early in development, the visual processing role of this region is enhanced. Greater activation for BSL than Tic Tac was observed in signers, but not in nonsigners, in the left posterior superior temporal sulcus and gyrus, extending into the supramarginal gyrus. This suggests that the left posterior perisylvian cortex is of fundamental importance to language processing, regardless of the modality in which it is conveyed.


Handbücher zur Sprach- und Kommunikationswissenschaft | 2012

Sign language: an international handbook

Roland Pfau; Markus Steinbach; Bencie Woll

Sign language linguists show here that all the questions relevant to the linguistic investigation of spoken languages can be asked about sign languages. Conversely, questions raised by sign language linguists, even if spoken language researchers have not yet asked them, should also be asked of spoken languages. The HSK handbook Sign Language aims to provide a concise and comprehensive overview of the state of the art in sign language linguistics. It includes 44 chapters, written by leading researchers in the field, that address issues in language typology, sign language grammar, psycho- and neurolinguistics, sociolinguistics, and language documentation and transcription. Crucially, all topics are presented in a way that makes them accessible to linguists who are not familiar with sign language linguistics.


NeuroImage | 2008

Phonological processing in deaf signers and the impact of age of first language acquisition.

Mairéad MacSweeney; Dafydd Waters; Michael Brammer; Bencie Woll; Usha Goswami

Just as words can rhyme, the signs of a signed language can share structural properties, such as location. Linguistic description at this level is termed phonology. We report that a left-lateralised fronto-parietal network is engaged during phonological similarity judgements made in both English (rhyme) and British Sign Language (BSL; location). Since these languages operate in different modalities, these data suggest that the neural network supporting phonological processing is, to some extent, supramodal. Activation within this network was however modulated by language (BSL/English), hearing status (deaf/hearing), and age of BSL acquisition (native/non-native). The influence of language and hearing status suggests an important role for the posterior portion of the left inferior frontal gyrus in speech-based phonological processing in deaf people. This, we suggest, is due to increased reliance on the articulatory component of speech when the auditory component is absent. With regard to age of first language acquisition, non-native signers activated the left inferior frontal gyrus more than native signers during the BSL task, and also during the task performed in English, which both groups acquired late. This is the first neuroimaging demonstration that age of first language acquisition has implications not only for the neural systems supporting the first language, but also for networks supporting languages learned subsequently.


The Lancet | 1996

Why do mothers cradle babies on their left?

Jechil S. Sieratzki; Bencie Woll

Many explanations have been put forward for the observed preference of mothers to cradle babies on the left side. These include handedness, the importance of the maternal heartbeat, left breast sensitivity, socio-psychological factors, and advantages in monitoring the infant. We propose that protection and facilitation of affective communication is at the core of cradling, and we explore the relation between left-cradling and the role of the right hemisphere in early mother-infant interaction. Left-cradling not only directs maternal communication to the infant's right hemisphere but also facilitates affective feedback to the maternal right brain. The underlying neuro-linguistic mechanisms proposed in this article may be important in the early course of child language development and may also serve to illuminate our understanding of the evolution of human language.


Journal of Cognitive Neuroscience | 2002

Neural Correlates of British Sign Language Comprehension: Spatial Processing Demands of Topographic Language

Mairéad MacSweeney; Bencie Woll; Ruth Campbell; Gemma A. Calvert; Philip McGuire; Anthony S. David; Andrew Simmons; Michael Brammer

In all signed languages used by deaf people, signs are executed in sign space in front of the body. Some signed sentences use this space to map detailed real-world spatial relationships directly. Such sentences can be considered to exploit sign space topographically. Using functional magnetic resonance imaging, we explored the extent to which increasing the topographic processing demands of signed sentences was reflected in the differential recruitment of brain regions in deaf and hearing native signers of British Sign Language. When BSL signers performed a sentence anomaly judgement task, the occipito-temporal junction was activated bilaterally to a greater extent for topographic than nontopographic processing. The differential role of movement in the processing of the two sentence types may account for this finding. In addition, enhanced activation was observed in the left inferior and superior parietal lobules during processing of topographic BSL sentences. We argue that the left parietal lobe is specifically involved in processing the precise configuration and location of hands in space to represent objects, agents, and actions. Importantly, no differences in these regions were observed when hearing people heard and saw English translations of these sentences. Despite the high degree of similarity in the neural systems underlying signed and spoken languages, exploring the linguistic features which are unique to each of these broadens our understanding of the systems involved in language comprehension.


Neuropsychologia | 2008

Cortical circuits for silent speechreading in deaf and hearing people

Cheryl M. Capek; Mairéad MacSweeney; Bencie Woll; Dafydd Waters; Philip McGuire; Anthony S. David; Michael Brammer; Ruth Campbell

This fMRI study explored the functional neural organisation of seen speech in congenitally deaf native signers and hearing non-signers. Both groups showed extensive activation in perisylvian regions for speechreading words compared to viewing the model at rest. In contrast to earlier findings, activation in left middle and posterior portions of superior temporal cortex, including regions within the lateral sulcus and the superior and middle temporal gyri, was greater for deaf than hearing participants. This activation pattern survived covarying for speechreading skill, which was better in deaf than hearing participants. Furthermore, correlational analysis showed that regions of activation related to speechreading skill varied with the hearing status of the observers. Deaf participants showed a positive correlation between speechreading skill and activation in the middle/posterior superior temporal cortex. In hearing participants, however, more posterior and inferior temporal activation (including fusiform and lingual gyri) was positively correlated with speechreading skill. Together, these findings indicate that activation in the left superior temporal regions for silent speechreading can be modulated by both hearing status and speechreading skill.


Nature Communications | 2013

Dissociating cognitive and sensory neural plasticity in human superior temporal cortex

Velia Cardin; Eleni Orfanidou; Jerker Rönnberg; Cheryl M. Capek; Mary Rudner; Bencie Woll

Disentangling the effects of sensory and cognitive factors on neural reorganization is fundamental for establishing the relationship between plasticity and functional specialization. Auditory deprivation in humans provides a unique insight into this problem, because the origin of the anatomical and functional changes observed in deaf individuals is not only sensory, but also cognitive, owing to the implementation of visual communication strategies such as sign language and speechreading. Here, we describe a functional magnetic resonance imaging study of individuals differing in auditory deprivation and sign language experience. We find that sensory and cognitive experience cause plasticity in anatomically and functionally distinguishable substrates. This suggests that after plastic reorganization, cortical regions adapt to process a different type of input signal, but preserve the nature of the computation they perform, at both a sensory and a cognitive level.

Collaboration


Dive into Bencie Woll's collaborations.

Top Co-Authors

Gary Morgan (City University London)
Ruth Campbell (University College London)
Cheryl M. Capek (University College London)
Dafydd Waters (University College London)
Joanna Atkinson (University College London)