
Publication


Featured research published by Ying Choon Wu.


Journal of Cognitive Neuroscience | 2005

Right Hemisphere Activation of Joke-related Information: An Event-related Brain Potential Study

Seana Coulson; Ying Choon Wu

Two studies tested the hypothesis that the right hemisphere engages in relatively coarse semantic coding that aids high-level language tasks such as joke comprehension. Scalp-recorded event-related brain potentials (ERPs) were collected as healthy adults read probe words (CRAZY) preceded either by jokes or nonfunny controls (Everyone had so much fun jumping into the swimming pool, we decided to put in a little water/platform). Probes were related to the meaning of the jokes, but not the controls. In Experiment 1a, with central presentation, probes following jokes (related) elicited less negative ERPs 300–700 msec post-onset (N400) than did probes following nonfunny controls (unrelated). This finding suggests related probes were primed by the jokes. In addition, unrelated probes elicited a larger anterior positivity 700–900 msec than did related probes, as irrelevant stimuli impacted control processes invoked by task demands. In Experiment 1b, probes (CRAZY) were preceded only by sentence-final words from jokes (water) or controls (platform). No ERP effects were observed in Experiment 1b, suggesting the N400 priming effect and the anterior positivity observed in Experiment 1a reflect semantic activations at the discourse level. To assess hemispheric differences in semantic activations, in Experiment 2, ERPs were recorded as participants read probe words presented in their left and right visual fields (LVF and RVF, respectively). Probes elicited a smaller N400 component when preceded by jokes than controls. This N400 priming effect was larger with presentation to the LVF, suggesting joke-relevant information was more active in the right hemisphere. The anterior positivity was observed with RVF but not LVF presentation, suggesting an important role for the left hemisphere in controlled retrieval in language comprehension.


Brain and Language | 2007

How iconic gestures enhance communication: An ERP study

Ying Choon Wu; Seana Coulson

EEG was recorded as adults watched short segments of spontaneous discourse in which the speaker's gestures and utterances contained complementary information. Videos were followed by one of four types of picture probes: cross-modal related probes were congruent with both speech and gestures; speech-only related probes were congruent with information in the speech, but not the gesture; and two sorts of unrelated probes were created by pairing each related probe with a different discourse prime. Event-related potentials (ERPs) elicited by picture probes were measured within the time windows of the N300 (250-350 ms post-stimulus) and N400 (350-550 ms post-stimulus). Cross-modal related probes elicited smaller N300 and N400 than speech-only related ones, indicating that pictures were easier to interpret when they corresponded with gestures. N300 and N400 effects were not due to differences in the visual complexity of each probe type, since the same cross-modal and speech-only picture probes elicited N300 and N400 with similar amplitudes when they appeared as unrelated items. These findings extend previous research on gesture comprehension by revealing how iconic co-speech gestures modulate conceptualization, enabling listeners to better represent visuo-spatial aspects of the speaker's meaning.


Psychonomic Bulletin & Review | 2007

Iconic gestures prime related concepts: An ERP study

Ying Choon Wu; Seana Coulson

To assess priming by iconic gestures, we recorded EEG (at 29 scalp sites) in two experiments while adults watched short, soundless videos of spontaneously produced, cospeech iconic gestures followed by related or unrelated probe words. In Experiment 1, participants classified the relatedness between gestures and words. In Experiment 2, they attended to stimuli, and performed an incidental recognition memory test on words presented during the EEG recording session. Event-related potentials (ERPs) time-locked to the onset of probe words were measured, along with response latencies and word recognition rates. Although word relatedness did not affect reaction times or recognition rates, contextually related probe words elicited less-negative ERPs than did unrelated ones between 300 and 500 msec after stimulus onset (N400) in both experiments. These findings demonstrate sensitivity to semantic relations between iconic gestures and words in brain activity engendered during word comprehension.


Neuroreport | 2010

Gestures modulate speech processing early in utterances.

Ying Choon Wu; Seana Coulson

Electroencephalogram was recorded as healthy adults viewed short videos of spontaneous discourse in which a speaker used depictive gestures to complement information expressed through speech. Event-related potentials were computed time-locked to content words in the speech stream and to subsequent related and unrelated picture probes. Gestures modulated event-related potentials to content words co-timed with the first gesture in a discourse segment, relative to the same words presented with static freeze frames of the speaker. Effects were observed 200–550 ms after speech onset, a time interval associated with semantic processing. Gestures also increased sensitivity to picture probe relatedness. Effects of gestures on picture probe and spoken word analysis were inversely correlated, suggesting that gestures differentially impact verbal and image-based processes.


Brain and Language | 2011

Are Depictive Gestures like Pictures? Commonalities and Differences in Semantic Processing.

Ying Choon Wu; Seana Coulson

Conversation is multi-modal, involving both talk and gesture. Does understanding depictive gestures engage processes similar to those recruited in the comprehension of drawings or photographs? Event-related brain potentials (ERPs) were recorded from neurotypical adults as they viewed spontaneously produced depictive gestures preceded by congruent and incongruent contexts. Gestures were presented either dynamically in short, soundless video-clips, or statically as freeze frames extracted from gesture videos. In a separate ERP experiment, the same participants viewed related or unrelated pairs of photographs depicting common real-world objects. Both object photos and gesture stimuli elicited less negative ERPs from 400 to 600 ms post-stimulus when preceded by matching versus mismatching contexts (dN450). Object photos and static gesture stills also elicited less negative ERPs between 300 and 400 ms post-stimulus (dN300). Findings demonstrate commonalities between the conceptual integration processes underlying the interpretation of iconic gestures and other types of image-based representations of the visual world.


Acta Psychologica | 2014

Co-speech iconic gestures and visuo-spatial working memory.

Ying Choon Wu; Seana Coulson

Three experiments tested the role of verbal versus visuo-spatial working memory in the comprehension of co-speech iconic gestures. In Experiment 1, participants viewed congruent discourse primes in which the speaker's gestures matched the information conveyed by his speech, and incongruent ones in which the semantic content of the speaker's gestures diverged from that in his speech. Discourse primes were followed by picture probes that participants judged as being either related or unrelated to the preceding clip. Performance on this picture probe classification task was faster and more accurate after congruent than incongruent discourse primes. The effect of discourse congruency on response times was linearly related to measures of visuo-spatial, but not verbal, working memory capacity, as participants with greater visuo-spatial WM capacity benefited more from congruent gestures. In Experiments 2 and 3, participants performed the same picture probe classification task under conditions of high and low loads on concurrent visuo-spatial (Experiment 2) and verbal (Experiment 3) memory tasks. Effects of discourse congruency and verbal WM load were additive, while effects of discourse congruency and visuo-spatial WM load were interactive. Results suggest that congruent co-speech gestures facilitate multi-modal language comprehension, and indicate an important role for visuo-spatial WM in these speech-gesture integration processes.


Psychological Science | 2015

Iconic Gestures Facilitate Discourse Comprehension in Individuals With Superior Immediate Memory for Body Configurations

Ying Choon Wu; Seana Coulson

To understand a speaker’s gestures, people may draw on kinesthetic working memory (KWM)—a system for temporarily remembering body movements. The present study explored whether sensitivity to gesture meaning was related to differences in KWM capacity. KWM was evaluated through sequences of novel movements that participants viewed and reproduced with their own bodies. Gesture sensitivity was assessed through a priming paradigm. Participants judged whether multimodal utterances containing congruent, incongruent, or no gestures were related to subsequent picture probes depicting the referents of those utterances. Individuals with low KWM were primarily inhibited by incongruent speech-gesture primes, whereas those with high KWM showed facilitation—that is, they were able to identify picture probes more quickly when preceded by congruent speech and gestures than by speech alone. Group differences were most apparent for discourse with weakly congruent speech and gestures. Overall, speech-gesture congruency effects were positively correlated with KWM abilities, which may help listeners match spatial properties of gestures to concepts evoked by speech.


Psychophysiology | 2005

Meaningful gestures: electrophysiological indices of iconic gesture comprehension.

Ying Choon Wu; Seana Coulson


PLOS ONE | 2014

A Psychometric Measure of Working Memory Capacity for Configured Body Movement

Ying Choon Wu; Seana Coulson


Cognitive Science | 2015

Visuo-spatial Working Memory and the Comprehension of Iconic Gestures.

Ying Choon Wu; Bonnie Chinh; Seana Coulson

Collaboration


Dive into Ying Choon Wu's collaborations.

Top Co-Authors

Seana Coulson

University of California

Bonnie Chinh

University of California
