Hiu Mei Chow
University of Hong Kong
Publication
Featured research published by Hiu Mei Chow.
Journal of Vision | 2016
Hiu Mei Chow; Li Jingling; Chia-huei Tseng
Collinearity and eye of origin were recently discovered to guide attention: Target search is impaired if the target overlaps with a collinear structure (Jingling & Tseng, 2013) but enhanced if the target is an ocular singleton (Zhaoping, 2008). Both effects are proposed to occur in V1, and here we study their interaction. In our 9 × 9 search display (Experiment 1), all columns consisted of horizontal bars except for one randomly selected column that contained orthogonal bars (collinear distractor). All columns were presented to one eye except for a randomly selected column that was presented to the other eye (ocular distractor). The target could be located on a distractor column (collinear congruent [CC]/ocular congruent [OC]) or not (collinear incongruent [CI]/ocular incongruent [OI]). We expected the best search performance for OC + CI targets and the worst for OI + CC targets; the other combinations would depend on the relative strength of collinearity and ocular information in guiding attention. As expected, we observed collinear impairment, but surprisingly, we did not observe any search advantage for OC targets. Our subsequent experiments confirmed that OC search impairment also occurred when color-defined columns (Experiment 2), ocular singletons (Experiments 4 and 5), and noncollinear columns (Experiment 5) were used instead of collinear columns. However, the ocular effect disappeared when paired with luminance-defined columns (Experiments 3A and 3B). Although our results agree well with earlier findings that eye-of-origin information guides attention, they suggest that the previous understanding of the search advantage for ocular singleton targets may have been oversimplified.
bioRxiv | 2018
Hsin-Ni Ho; Hiu Mei Chow; Sayaka Tsunokake; Warrick Roseboom
The brain constantly faces the challenge of whether and how to combine available information sources to estimate the properties of an object explored by hand. Thermal referral (TR) is a phenomenon that demonstrates how the thermal and tactile modalities coordinate to resolve inconsistencies in spatial and thermal information. When the middle three fingers of one hand are thermally stimulated but only the outer two fingers are heated (or cooled), thermal uniformity is perceived across all three fingers. This illusory experience of thermal uniformity in TR compensates for the discontinuity in thermal sensation across the sites in contact. The neural locus of TR is unclear. While TR reflects the diffuse nature of the thermoceptive system, its similarities to perceptual filling-in and its facilitative role in object perception also suggest that TR might involve inference processes associated with object perception. To clarify the position of this thermo-tactile interaction in the sensory processing hierarchy, we used perceptual adaptation and Bayesian decision modelling techniques. Our results indicate that TR adaptation takes place at a peripheral stage where information about temperature inputs is still preserved for each finger, and that the thermo-tactile interaction occurs after this stage. We also show that the temperature integration across the three fingers in TR is consistent with precision-weighted averaging, that is, Bayesian cue combination. Altogether, our findings suggest that in the sensory processing hierarchy of thermal touch, thermal adaptation occurs prior to thermo-tactile integration, which combines thermal and tactile information into a unified percept that facilitates object recognition.

Significance Statement: Thermal touch refers to the perception of the temperature of objects in contact with the skin and is key to object recognition based on thermal cues. While object perception is an inference process involving multisensory inputs, thermal referral (TR) is an illusion demonstrating how the brain's interpretation of object temperature can deviate from physical reality. Here we used TR to explore the processing hierarchy of thermal touch. We show that adaptation of thermal perception occurs prior to the integration of thermal information across tactile locations. Further, we show that TR results from simple averaging of thermal sensations across locations. Our results illuminate the flexibility of the processing that underlies thermo-tactile interactions and facilitates object exploration and identification in our complicated natural environment.
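The precision-weighted averaging invoked above is the standard Bayesian cue-combination rule for independent Gaussian estimates. As a minimal sketch in generic notation (not the authors' fitted model), combining per-finger temperature estimates T_i with noise variances sigma_i^2 gives:

\hat{T} = \sum_i w_i T_i, \qquad w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2}, \qquad \sigma_{\hat{T}}^2 = \Big( \sum_j 1/\sigma_j^2 \Big)^{-1}

Each cue is weighted by its relative precision (inverse variance), and the combined estimate has lower variance than any individual cue, which is the usual behavioural signature of optimal integration.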
Scientific Reports | 2018
Chia-huei Tseng; Hiu Mei Chow; Yuen Ki Ma; Jie Ding
Learning in a multisensory world is challenging, as information from different sensory dimensions may be inconsistent and confusing. By adulthood, learners optimally integrate bimodal (e.g. audio-visual, AV) stimulation using both low-level (e.g. temporal synchrony) and high-level (e.g. semantic congruency) properties of the stimuli to boost learning outcomes. However, it is unclear how this capacity emerges and develops. To address this question, we examined whether preverbal infants are capable of utilizing high-level properties in grammar-like rule acquisition. In three experiments, we habituated pre-linguistic infants to an audio-visual (AV) temporal sequence that followed a grammar-like rule (A-A-B). We varied the cross-modal semantic congruence of the AV stimuli (Exp 1: congruent syllables/faces; Exp 2: incongruent syllables/shapes; Exp 3: incongruent beeps/faces) while all other low-level properties (e.g. temporal synchrony, sensory energy) were held constant. Eight- to ten-month-old infants learned the grammar-like rule only from congruent AV stimulus pairs (Exp 1), not from incongruent AV pairs (Exps 2 and 3). Our results show that, similar to adults, preverbal infants' learning is influenced by a high-level multisensory integration gating system, pointing to a perceptual origin of the bimodal learning advantage that had not previously been acknowledged.
Journal of Vision | 2017
Vivian M. Ciaramitaro; Hiu Mei Chow; Luke G. Eglington
We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers had to attend to two consecutive intervals of sounds and report which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid sequential visual presentation (two consecutive intervals of streams of visual letters) and had to report which interval contained a particular color (low load, demanding fewer attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, thus improving auditory thresholds. Auditory detection thresholds were lower (that is, auditory sensitivity was improved) for both amplitude- and frequency-modulated sounds when observers engaged in the less demanding (compared to the more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality (cross-modal attention) can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms of visual and auditory attention.
Psychological Science | 2013
Chia-huei Tseng; Hiu Mei Chow; Lothar Spillmann
The perception of verticality is critical for balance control and interaction with the world. But this complex process fails badly under certain circumstances—usually as the result of an illusion. Here, we report on a real-world example of how the brain fails to disregard body position on a moving mountain tram and adopts an inappropriate frame of reference, which prompts passengers to perceive skyscrapers leaning by as much as 30°. To elucidate the sensory origin of this misperception, we conducted field experiments on the moving tram to systematically disentangle the contributions of four sensory systems known to affect verticality perception, namely, vestibular, tactile, proprioceptive, and visual cues. Our results refute the intuitive assumption that the perceived tilt of the buildings is based on visual error signals and demonstrate instead that a unified percept of verticality is a product of the synergistic interaction among multiple sensory systems and the contextual information available in the real world.
Journal of Vision | 2013
Hiu Mei Chow; Li Jingling; Chia-huei Tseng
Developmental Science | 2016
Angeline Sin Mei Tsui; Yuen Ki Ma; Anna Ho; Hiu Mei Chow; Chia-huei Tseng
Consciousness and Cognition | 2015
Hiu Mei Chow; Chia-huei Tseng
Journal of Vision | 2018
Hiu Mei Chow; Vivian M. Ciaramitaro
International Journal of Psychology | 2018
Xinmei Deng; Chen Cheng; Hiu Mei Chow; Xuechen Ding