
Publication


Featured research published by Weina Zhu.


Scientific Reports | 2015

Dense sampling reveals behavioral oscillations in rapid visual categorization.

Jan Drewes; Weina Zhu; Andreas Wutz; David Melcher

Perceptual systems must create discrete objects and events out of a continuous flow of sensory information. Previous studies have demonstrated oscillatory effects in the behavioral outcome of low-level visual tasks, suggesting a cyclic nature of visual processing as the solution. To investigate whether these effects extend to more complex tasks, a stream of “neutral” photographic images (not containing targets) was rapidly presented (20 ms/image). Embedded in the stream were one or two presentations of a randomly selected target image (vehicles or animals). Subjects reported the perceived target category. On dual-presentation trials, the inter-stimulus interval (ISI) varied systematically from 0 to 600 ms. At a randomized time before the first target presentation, the screen was flashed with the intent of creating a phase reset in the visual system. Sorting trials by the temporal distance between flash and first target presentation revealed strong oscillations in behavioral performance, peaking at 5 Hz. On dual-target trials, longer ISIs led to reduced performance, implying a temporal integration window for object category discrimination. The “animal” trials exhibited a significant oscillatory component around 5 Hz. Our results indicate that oscillatory effects are not mere fringe effects relevant only to simple stimuli, but arise from the core mechanisms of visual processing and may well extend into real-life scenarios.
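
The core of this analysis, sorting trials by the flash-to-target interval, binning accuracy over that interval, and testing the binned time course for a spectral peak, can be sketched in a few lines. The following Python sketch uses hypothetical variable names and parameters and is not the authors' actual pipeline:

    import numpy as np

    def behavioral_spectrum(lag_ms, correct, bin_ms=20.0):
        # Bin single-trial accuracy by flash-to-target lag (ms), detrend the
        # binned time course, and return its amplitude spectrum.
        lag_ms = np.asarray(lag_ms, float)
        correct = np.asarray(correct, float)

        # Average accuracy within fixed-width lag bins.
        edges = np.arange(lag_ms.min(), lag_ms.max() + bin_ms, bin_ms)
        idx = np.clip(np.digitize(lag_ms, edges) - 1, 0, len(edges) - 2)
        counts = np.bincount(idx, minlength=len(edges) - 1)
        sums = np.bincount(idx, weights=correct, minlength=len(edges) - 1)
        acc = sums / np.maximum(counts, 1)

        # Remove the slow trend so the spectrum reflects oscillation, not drift.
        t = edges[:-1] + bin_ms / 2.0
        acc = acc - np.polyval(np.polyfit(t, acc, 2), t)

        # Amplitude spectrum of the binned accuracy time course.
        freqs = np.fft.rfftfreq(len(acc), d=bin_ms / 1000.0)  # in Hz
        amp = np.abs(np.fft.rfft(acc)) / len(acc)
        return freqs, amp

    # The behavioral peak would then be read off as, e.g.:
    # freqs, amp = behavioral_spectrum(lag_ms, correct)
    # peak_hz = freqs[1:][np.argmax(amp[1:])]  # skip the DC component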


PLOS ONE | 2014

Smaller Is Better: Drift in Gaze Measurements due to Pupil Dynamics

Jan Drewes; Weina Zhu; Yingzhou Hu; Xintian Hu

Camera-based eye trackers are the mainstay of eye movement research and countless practical applications of eye tracking. Recently, a significant impact of changes in pupil size on gaze position as measured by camera-based eye trackers has been reported. In an attempt to improve the understanding of the magnitude and population-wise distribution of the pupil-size dependent shift in reported gaze position, we present the first collection of binocular pupil drift measurements recorded from 39 subjects. The pupil-size dependent shift varied greatly between subjects (from 0.3 to 5.2 deg of deviation, mean 2.6 deg), but also between the eyes of individual subjects (0.1 to 3.0 deg difference, mean difference 1.0 deg). We observed a wide range of drift directions, mostly downward and nasal. We demonstrate two methods to partially compensate for the pupil-based shift using separate calibrations in pupil-constricted and pupil-dilated conditions, and evaluate an improved method of compensation based on individual look-up tables, achieving up to 74% compensation.
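
Such a look-up-table correction amounts to measuring, during calibration, how the reported gaze shifts as a function of pupil size while the true fixation target is known, and then subtracting the interpolated offset at runtime. A minimal sketch under these assumptions (illustrative names, not the paper's actual procedure):

    import numpy as np

    class PupilDriftLUT:
        # Per-eye look-up table mapping pupil size to a gaze offset (deg),
        # built from calibration samples spanning constricted to dilated pupils.
        def __init__(self, pupil_cal, offset_cal):
            order = np.argsort(pupil_cal)
            self.pupil = np.asarray(pupil_cal, float)[order]
            self.offset = np.asarray(offset_cal, float)[order]  # shape (n, 2): x/y offsets

        def correct(self, pupil_size, gaze_deg):
            # Subtract the interpolated pupil-size-dependent offset from a gaze sample.
            dx = np.interp(pupil_size, self.pupil, self.offset[:, 0])
            dy = np.interp(pupil_size, self.pupil, self.offset[:, 1])
            return np.asarray(gaze_deg, float) - np.array([dx, dy])

    # Usage sketch: offsets are measured while the subject fixates a known target,
    # e.g. lut = PupilDriftLUT(pupil_cal, gaze_cal - target_position)
    #      corrected = lut.correct(current_pupil_size, current_gaze)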


PLOS ONE | 2016

Time for Awareness: The Influence of Temporal Properties of the Mask on Continuous Flash Suppression Effectiveness.

Weina Zhu; Jan Drewes; David Melcher

Visual processing is not instantaneous; instead, our conscious perception depends on the integration of sensory input over time. In the case of Continuous Flash Suppression (CFS), masks are flashed to one eye, suppressing awareness of stimuli presented to the other eye. One potential explanation of CFS is that it depends, at least in part, on the flashing mask continually interrupting visual processing before the stimulus reaches awareness. We investigated the temporal features of masks in two ways. First, we measured the suppression effectiveness of a wide range of masking frequencies (0–32 Hz), using both complex (faces/houses) and simple (closed/open geometric shapes) stimuli. Second, we varied whether the different frequencies were interleaved within blocks or separated into homogeneous blocks, in order to see if suppression was stronger or weaker when the frequency remained constant across trials. We found that break-through contrast differed dramatically between masking frequencies, with mask effectiveness following a skewed-normal curve peaking around 6 Hz and little or no masking at low and high temporal frequencies. The peak frequency was similar for trial-randomized and block-randomized conditions. In terms of stimulus type, we found no significant difference in peak frequency between the stimulus groups (complex/simple, face/house, closed/open). These findings suggest that temporal factors play a critical role in perceptual awareness, perhaps due to interactions between mask frequency and the time frame of visual processing.
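
The reported frequency tuning, mask effectiveness following a skewed-normal curve peaking near 6 Hz, corresponds to fitting a skew-normal function to mean break-through contrast per masking frequency. A rough SciPy-based sketch with illustrative starting values (not the authors' fitting code):

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import skewnorm

    def skewed_peak(f, amplitude, location, scale, shape, baseline):
        # Skew-normal curve over masking frequency, plus a constant baseline.
        return baseline + amplitude * skewnorm.pdf(f, shape, loc=location, scale=scale)

    def fit_frequency_tuning(freqs_hz, contrast):
        # freqs_hz: tested masking frequencies; contrast: mean break-through contrast.
        p0 = [np.ptp(contrast), 6.0, 4.0, 2.0, np.min(contrast)]  # rough starting values
        params, _ = curve_fit(skewed_peak, freqs_hz, contrast, p0=p0, maxfev=20000)
        # Locate the peak of the fitted curve on a fine frequency grid.
        grid = np.linspace(np.min(freqs_hz), np.max(freqs_hz), 1000)
        peak_hz = grid[np.argmax(skewed_peak(grid, *params))]
        return params, peak_hz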


PLOS ONE | 2013

Animal Detection in Natural Images: Effects of Color and Image Database

Weina Zhu; Jan Drewes; Karl R. Gegenfurtner

The visual system has a remarkable ability to extract categorical information from complex natural scenes. In order to elucidate the role of low-level image features for the recognition of objects in natural scenes, we recorded saccadic eye movements and event-related potentials (ERPs) in two experiments, in which human subjects had to detect animals in previously unseen natural images. We used a new natural image database (ANID) that is free of some of the potential artifacts that have plagued the widely used COREL images. Color and grayscale images picked from the ANID and COREL databases were used. In all experiments, color images induced a greater N1 EEG component at earlier time points than grayscale images. We suggest that this influence of color in animal detection may be masked by later processes when measuring reaction times. The ERP results of go/nogo and forced-choice tasks were similar to those reported earlier. The non-animal stimuli induced a larger N1 than animal stimuli in both the COREL and ANID databases. This result indicates that ultra-fast processing of animal images is possible irrespective of the particular database. With the ANID images, the difference between color and grayscale images is more pronounced than with the COREL images. The earlier use of the COREL images might have led to an underestimation of the contribution of color. Therefore, we conclude that the ANID image database is better suited for the investigation of the processing of natural scenes than other databases commonly used.
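
The N1 comparison boils down to averaging EEG epochs per condition and measuring the amplitude and latency of the most negative deflection in a post-stimulus window. A bare-bones sketch of that step (hypothetical time window and array layout, not the authors' analysis code):

    import numpy as np

    def n1_peak(epochs, times, window=(0.120, 0.220)):
        # epochs: array (n_trials, n_timepoints), single-channel voltage in microvolts
        # times:  array (n_timepoints,), seconds relative to image onset
        # window: search window for the N1 (most negative deflection), in seconds
        erp = epochs.mean(axis=0)                  # average across trials
        in_win = (times >= window[0]) & (times <= window[1])
        i = np.argmin(erp[in_win])                 # most negative point in window
        return erp[in_win][i], times[in_win][i]    # amplitude (microvolts), latency (s)

    # Comparing conditions, e.g. color vs. grayscale images:
    # amp_color, lat_color = n1_peak(color_epochs, times)
    # amp_gray,  lat_gray  = n1_peak(gray_epochs,  times)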


Frontiers in Human Neuroscience | 2016

Differential Visual Processing of Animal Images, with and without Conscious Awareness

Weina Zhu; Jan Drewes; Nicholas Peatfield; David Melcher

The human visual system can quickly and efficiently extract categorical information from a complex natural scene. The rapid detection of animals in a scene is one compelling example of this phenomenon, and it suggests the automatic processing of at least some types of categories with little or no attentional requirements (Li et al., 2002, 2005). The aim of this study is to investigate whether the remarkable capability to categorize complex natural scenes exists in the absence of awareness, based on recent reports that “invisible” stimuli, which do not reach conscious awareness, can still be processed by the human visual system (Pasley et al., 2004; Williams et al., 2004; Fang and He, 2005; Jiang et al., 2006, 2007; Kaunitz et al., 2011a). In two experiments, we recorded event-related potentials (ERPs) in response to animal and non-animal/vehicle stimuli in both aware and unaware conditions in a continuous flash suppression (CFS) paradigm. Our results indicate that even in the “unseen” condition, the brain responds differently to animal and non-animal/vehicle images, consistent with rapid activation of animal-selective feature detectors prior to, or outside of, suppression by the CFS mask.


Vision Research | 2014

Dissociation between spatial and temporal integration mechanisms in Vernier fusion

Jan Drewes; Weina Zhu; David Melcher

The visual system constructs a percept of the world across multiple spatial and temporal scales. This raises the questions of whether different scales involve separate integration mechanisms and whether spatial and temporal factors are linked via spatio-temporal reference frames. We investigated this using Vernier fusion, a phenomenon in which the features of two Vernier stimuli presented in close spatio-temporal proximity are fused into a single percept. With increasing spatial offset, perception changes dramatically from a single percept into apparent motion and later, at larger offsets, into two separately perceived stimuli. We tested the link between spatial and temporal integration by presenting two successive Vernier stimuli at varying spatial and temporal offsets. The second Vernier had either the same or the opposite offset as the first. We found that the type of percept depended not only on spatial offset, as reported previously, but interacted with the temporal parameter as well. At temporal separations around 30-40 ms the majority of trials were perceived as motion, while above 70 ms predominantly two separate stimuli were reported. The dominance of the second Vernier varied systematically with temporal offset, peaking around 40 ms ISI. Same-offset conditions showed increasing amounts of perceived separation at large ISIs, but little dependence on spatial offset. As subjects did not always completely fuse stimuli, we separated trials by reported percept (single/fusion, motion, double/segregation). We found systematic indications of spatial fusion even on trials in which subjects perceived temporal segregation. These findings imply that spatial integration/fusion may occur even when the stimuli are perceived as temporally separate entities, suggesting that the mechanisms responsible for temporal segregation and spatial integration may not be mutually exclusive.
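
Separating trials by reported percept amounts to tallying, for each ISI, how often each category (single/fusion, motion, double/segregation) was reported. A small sketch of that tabulation (illustrative labels, not the authors' code):

    import numpy as np

    PERCEPTS = ("fusion", "motion", "segregation")

    def percept_proportions(isi_ms, reports):
        # isi_ms:  array of inter-stimulus intervals (ms), one per trial
        # reports: array of percept labels ("fusion", "motion", "segregation")
        isi_ms = np.asarray(isi_ms)
        reports = np.asarray(reports)
        table = {}
        for isi in np.unique(isi_ms):
            at_isi = reports[isi_ms == isi]
            table[float(isi)] = {p: float(np.mean(at_isi == p)) for p in PERCEPTS}
        return table

    # e.g. table[40.0]["motion"] gives the share of motion reports at a 40 ms ISI.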


Journal of Vision | 2014

Reduced ERP amplitudes for animal stimuli in the absence of conscious awareness

Weina Zhu; Jan Drewes; Karl R. Gegenfurtner

(1) … and easily extracted; (2) animal stimuli deviate from non-animal stimuli around 150 ms after stimulus onset (ERPs; Thorpe, Fize et al., 1996). Questions: (1) Does this remarkable capability function in the absence of awareness? (2) Are there any differences between animal and non-animal stimuli in the suppressed condition (are animals special)? Paradigm: a CFS (continuous flash suppression) break-through paradigm (Experiment 1) and a plain CFS paradigm (Experiments 2 and 3, EEG) were used.


Journal of Vision | 2018

The edge of awareness: Mask spatial density, but not color, determines optimal temporal frequency for continuous flash suppression

Jan Drewes; Weina Zhu; David Melcher


Journal of Vision | 2017

Mechanisms of suppression: How the classic Mondrian beats noise in CFS masking

Weina Zhu; Jan Drewes; David Melcher


Journal of Vision | 2017

Long vs. short integrators: resting state alpha frequency predicts individual differences in temporal integration

Jan Drewes; Weina Zhu; Evelyn Muschter; David Melcher

Collaboration


Dive into Weina Zhu's collaborations.

Top Co-Authors

Xintian Hu

Kunming Institute of Zoology

Yingzhou Hu

Kunming Institute of Zoology
