Publication


Featured research published by Jingjing Yang.


IEEE/ICME International Conference on Complex Medical Engineering | 2011

Developing a logistic regression model with cross-correlation for motor imagery signal recognition

Siuly; Yan Li; Jinglong Wu; Jingjing Yang

Classification of motor imagery (MI)-based electroencephalogram (EEG) signals is a key issue for the development of brain-computer interface (BCI) systems. The objective of this study is to develop an algorithm that can distinguish two categories of MI EEG signals. In this paper, we propose a new classification algorithm for two-class MI signal recognition in BCIs. The proposed scheme develops a novel cross-correlation-based feature extractor, which is aided by a logistic regression model. The method is tested on dataset IVa of BCI Competition III, which contains two-class MI data for five subjects. Performance is computed using k-fold cross-validation (k=10) on the testing set for each subject. The results of this study are compared with eight recently reported methods from the literature and demonstrate that our proposed method outperforms all eight in terms of average classification accuracy.
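
The pipeline described above, cross-correlation features fed into a logistic regression classifier and evaluated with 10-fold cross-validation, can be sketched briefly. The sketch below is illustrative only: the choice of reference signal, the particular summary statistics, and the synthetic data are assumptions, not the authors' exact recipe.

```python
# Minimal sketch of a cross-correlation + logistic regression pipeline
# like the one described above. The reference-signal choice and the
# summary statistics are assumptions for illustration.
import numpy as np
from scipy.signal import correlate
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_samples = 100, 500           # synthetic two-class MI "EEG"
X_raw = rng.standard_normal((n_trials, n_samples))
y = rng.integers(0, 2, n_trials)         # class labels (e.g. left vs. right hand)

reference = X_raw[0]                     # a reference trial (assumption)

def cross_corr_features(trial, ref):
    """Summarize the cross-correlogram of a trial against the reference."""
    cc = correlate(trial, ref, mode="full")
    return [cc.mean(), np.median(cc), cc.std(), cc.max(), cc.min()]

X = np.array([cross_corr_features(t, reference) for t in X_raw])

# k-fold cross-validation with k=10, as in the abstract
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean accuracy: {scores.mean():.3f}")
```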


PLOS ONE | 2013

Effects of auditory stimuli in the horizontal plane on audiovisual integration: an event-related potential study.

Weiping Yang; Qi-fang Li; Tatsuya Ochi; Jingjing Yang; Yulin Gao; Xiaoyu Tang; Satoshi Takahashi; Jinglong Wu

This article investigates whether auditory stimuli in the horizontal plane, particularly those originating from behind the participant, affect audiovisual integration, using behavioral and event-related potential (ERP) measurements. Visual stimuli were presented directly in front of the participants; auditory stimuli were presented at one location in an equidistant horizontal plane at the front (0°, the fixation point), right (90°), back (180°), or left (270°) of the participants; and audiovisual stimuli combining the visual stimulus with an auditory stimulus from one of the four locations were presented simultaneously. These stimuli were presented randomly with equal probability, and participants were asked to attend to the visual stimulus and respond promptly only to visual target stimuli (a unimodal visual target stimulus or the visual target of an audiovisual stimulus). A significant facilitation of reaction times and hit rates was obtained following audiovisual stimulation, irrespective of whether the auditory stimuli were presented in front of or behind the participant. However, no significant interactions were found between visual stimuli and auditory stimuli from the right or left. Two main ERP components related to audiovisual integration were found: first, auditory stimuli from the front location produced an effect over the right temporal and right occipital areas at approximately 160–200 milliseconds; second, auditory stimuli from the back produced an effect over the parietal and occipital areas at approximately 360–400 milliseconds. Our results confirmed that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but no integration occurred when auditory stimuli were presented to the right or left, suggesting that the human brain might be more sensitive to information received from behind than from either side.


PLOS ONE | 2015

Effects of sound frequency on audiovisual integration: An event-related potential study

Weiping Yang; Jingjing Yang; Yulin Gao; Xiaoyu Tang; Yanna Ren; Satoshi Takahashi; Jinglong Wu

A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. The task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area from 190–210 ms for 0.5 kHz auditory stimuli, from 170–200 ms for 1 kHz stimuli, from 140–200 ms for 2.5 kHz stimuli, and from 100–200 ms for 5 kHz stimuli. These findings suggest that a higher-frequency sound paired with a visual stimulus might be processed or integrated earlier, even though the auditory stimuli were task-irrelevant. Furthermore, audiovisual integration at late latency (300–340 ms), with a fronto-central topography, was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirm that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a visual signal combined with auditory stimuli of different frequencies.


Neuroscience Letters | 2013

Modulation of auditory stimulus processing by visual spatial or temporal cue: an event-related potentials study.

Xiaoyu Tang; Chunlin Li; Qi Li; Yulin Gao; Weiping Yang; Jingjing Yang; Soushirou Ishikawa; Jinglong Wu

Utilizing the high temporal resolution of event-related potentials (ERPs), we examined how visual spatial or temporal cues modulate auditory stimulus processing. A visual spatial cue (VSC) induces orienting of attention to spatial locations; a visual temporal cue (VTC) induces orienting of attention to temporal intervals. Participants were instructed to respond to auditory targets. Behavioral responses to auditory stimuli following the VSC were faster and more accurate than those following the VTC. The VSC and VTC had the same effect on the auditory N1 (150-170 ms after stimulus onset). The mean amplitude of the auditory P1 (90-110 ms) was larger in the VSC condition than in the VTC condition, and the mean amplitude of the late positivity (300-420 ms) was larger in the VTC condition than in the VSC condition. These findings suggest that the modulation of auditory stimulus processing by visually induced spatial versus temporal orienting of attention is different but partially overlapping.
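
The component measures in this abstract (P1 at 90-110 ms, N1 at 150-170 ms, late positivity at 300-420 ms) are mean amplitudes over fixed time windows, the standard ERP measure. Below is a minimal sketch of that measure; the sampling rate, epoch limits, and synthetic waveform are assumptions for illustration, not details from the paper.

```python
# Minimal sketch of the "mean amplitude in a time window" ERP measure
# used above (e.g. P1 at 90-110 ms). Sampling rate, epoch limits, and
# the synthetic waveform are assumptions for illustration.
import numpy as np

fs = 500                                  # sampling rate in Hz (assumption)
t = np.arange(-0.1, 0.5, 1 / fs)          # epoch: -100 to 500 ms
rng = np.random.default_rng(1)
erp = rng.standard_normal(t.size)         # stand-in for an averaged waveform (µV)

def mean_amplitude(waveform, times, start_s, end_s):
    """Average voltage inside [start_s, end_s], the usual component measure."""
    window = (times >= start_s) & (times <= end_s)
    return waveform[window].mean()

p1 = mean_amplitude(erp, t, 0.090, 0.110)   # P1: 90-110 ms
n1 = mean_amplitude(erp, t, 0.150, 0.170)   # N1: 150-170 ms
print(f"P1 mean amplitude: {p1:.2f} µV, N1: {n1:.2f} µV")
```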


Neuroscience Letters | 2015

The temporal reliability of sound modulates visual detection: An event-related potential study

Qi Li; Yan Wu; Jingjing Yang; Jinglong Wu; Tetsuo Touge

Utilizing the high temporal resolution of event-related potentials (ERPs), we examined the effects of the temporal reliability of sounds on visual detection. Significantly faster reaction times to visual target stimuli were observed when reliable temporal information was provided by a task-irrelevant auditory stimulus. Three main ERP components related to the effects of auditory temporal reliability were found: the first at 180-240 ms over a wide central area, the second at 300-400 ms over an anterior area, and the third at 300-380 ms over bilateral temporal areas. Our results support the hypothesis that temporal reliability affects visual detection and indicate that auditory facilitation of visual detection is partly due to the spread of attention and thus results from implicit temporal linking of auditory and visual information at a relatively late processing stage.


Neuroreport | 2014

Effects of ipsilateral and bilateral auditory stimuli on audiovisual integration: a behavioral and event-related potential study

Yulin Gao; Qi Li; Weiping Yang; Jingjing Yang; Xiaoyu Tang; Jinglong Wu

We used event-related potential (ERP) measures to compare the effects of ipsilateral and bilateral auditory stimuli on audiovisual (AV) integration. Behavioral results showed that responses to visual stimuli accompanied by either type of auditory stimulus were faster than responses to visual stimuli alone, and that perceptual sensitivity (d′) for visual detection was enhanced when visual stimuli were accompanied by ipsilateral auditory stimuli. Furthermore, ERP components related to AV integration were identified over the occipital areas at ∼180–200 ms, during early-stage sensory processing, for ipsilateral auditory stimuli, and over the frontocentral areas at ∼300–320 ms, during late-stage cognitive processing, for both ipsilateral and bilateral auditory stimuli. Our results confirmed that AV integration was also elicited by bilateral auditory stimuli, but only at the later, cognitive stage of processing in the visual detection task. Integration during early-stage sensory processing was observed only for ipsilateral auditory stimuli, suggesting that the integration of AV information in the human brain might be particularly sensitive to ipsilaterally presented AV stimuli.
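
Perceptual sensitivity d′ as used here is the standard signal-detection index: the difference between the z-transformed hit and false-alarm rates. A minimal sketch follows; the example rates are hypothetical, not values from the paper.

```python
# Minimal sketch of the signal-detection sensitivity index d' reported
# above: d' = z(hit rate) - z(false-alarm rate). The example rates are
# hypothetical, not values from the paper.
from scipy.stats import norm

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index from hit and false-alarm proportions."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical comparison: visual detection with vs. without an
# ipsilateral sound, at a matched false-alarm rate.
print(d_prime(0.92, 0.05))   # with ipsilateral sound
print(d_prime(0.85, 0.05))   # visual only; lower d' = poorer detection
```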


International Conference on Complex Medical Engineering | 2012

Effect of cue-target interval on endogenous attention in Go/No-Go task: evidence from an event-related potentials study

Xiaoyu Tang; Yulin Gao; Weiping Yang; Chunlin Li; Jingjing Yang; Ishikawa Soushirou; Ming Zhang; Jinglong Wu

Previous research found that the interstimulus interval (ISI) affects the amplitude and latency of the P300 component in go/no-go tasks and oddball paradigms. Here we combined the cue-target paradigm with a go/no-go task to investigate whether the cue-target interval (the time between cue offset and target onset) affects the amplitude or latency of the P300 when a central cue completely predicts the target location, inducing purely endogenous attention. The results showed that the latency of the P300 did not change between the short (600 ms) and long (1800 ms) cue-target interval conditions, indicating that endogenous attention governed target processing, so the effect of different intervals on target processing was not significant. However, the mean amplitude of the P300 (300-600 ms after target onset) increased with increasing cue-target interval, supporting the view that a temporal factor, whether the target-target interval or the cue-target interval, may determine the amplitude of the P300.


International Conference on Complex Medical Engineering | 2012

Effects of spatial location on bisensory audiovisual integration in horizontal meridian

Qi Li; Jingjing Yang; Jinglong Wu

Crossmodal spatial integration between auditory and visual stimuli is a common phenomenon in space perception. In the present study, the effects of spatial location on bisensory audiovisual integration in the horizontal meridian were investigated. Behavioral measures of audiovisual integration were compared at -30°, 0° and 30° in the horizontal direction. Our results showed that responses to audiovisual stimuli presented at the center location (0°) were faster and more accurate than those presented at the peripheral locations (±30°). The results suggest that bimodal audiovisual integration depends on the spatial location of the audiovisual stimuli.


IEEE/ICME International Conference on Complex Medical Engineering | 2011

Task-irrelevant auditory stimuli affect audiovisual integration in a visual attention task: Evidence from event-related potentials

Jingjing Yang; Qi Li; Yulin Gao; Jinglong Wu

Integration of information from multiple senses is fundamental to perception and cognition, but the neural activity underlying multimodal audiovisual integration remains unclear. This study used event-related potentials (ERPs) to demonstrate that onset-synchronous, task-irrelevant auditory stimuli affect audiovisual integration. The behavioral results showed that responses to audiovisual target stimuli were faster than those to unimodal visual target stimuli. ERPs were recorded in response to unimodal auditory (A), unimodal visual (V) and bimodal (AV) stimuli, and cross-modal interactions were estimated using the additive [AV − (A + V)] model. Four ERP components related to audiovisual integration were observed: (1) over central and occipital areas at around 100 to 160 ms; (2) over central and occipital areas at around 160 to 200 ms; (3) over occipital areas at around 200 to 240 ms; and (4) over frontal-central areas at around 280 to 320 ms. These findings characterize the main neural activity of audiovisual integration. In addition, our study provides evidence that multimodal integration can occur even when the auditory stimulus is task-irrelevant.
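
The additive model mentioned here estimates cross-modal interactions by subtracting the sum of the unimodal ERPs from the bimodal ERP; any reliable deviation of [AV − (A + V)] from zero marks a super- or sub-additive interaction. Below is a minimal NumPy sketch with synthetic condition averages; the array shapes and data are assumptions, not the study's recordings.

```python
# Minimal sketch of the additive model [AV - (A + V)] used above to
# estimate cross-modal interactions. The synthetic waveforms stand in
# for condition-averaged ERPs (time x channels).
import numpy as np

rng = np.random.default_rng(2)
n_times, n_channels = 300, 32
erp_a  = rng.standard_normal((n_times, n_channels))   # auditory-only average
erp_v  = rng.standard_normal((n_times, n_channels))   # visual-only average
erp_av = erp_a + erp_v + 0.5 * rng.standard_normal((n_times, n_channels))

# Any reliable deviation of this difference wave from zero indicates a
# super- or sub-additive audiovisual interaction.
interaction = erp_av - (erp_a + erp_v)
print(interaction.shape, np.abs(interaction).mean())
```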


International Conference on Complex Medical Engineering | 2013

Effect of cue-target interval on audiovisual stimuli processing in endogenous spatial attention: An event-related potentials study

Xiaoyu Tang; Chunlin Li; Qi Li; Yulin Gao; Weiping Yang; Jingjing Yang; Ishikawa Soushirou; Satoshi Takahashi; Jinglong Wu

Previous studies have indicated that the interstimulus interval (ISI) affects visual and auditory stimulus processing. Utilizing the high temporal resolution of event-related potentials (ERPs), we combined an endogenous cue-target paradigm, in which a central cue completely predicts the target location, with a go/no-go task to investigate whether the ISI affects audiovisual (AV) stimulus processing. The results showed that the ISI had an effect on AV stimulus processing. Specifically, the mean amplitude of the late positivity components (220-260 ms and 400-500 ms) was larger in the long-ISI (1800 ms) condition than in the short-ISI (600 ms) condition, while the late negativity component (300-340 ms) was larger in the short-ISI condition than in the long-ISI condition. The ISI had no effect on the early ERP components (P1 and N1) elicited by the AV stimuli. The ERP results suggest that the ISI affects AV stimulus processing as reflected in the later components, but not in the earlier ones.

Collaboration


Dive into Jingjing Yang's collaboration.

Top Co-Authors

Jinglong Wu
Beijing Institute of Technology

Qi Li
Changchun University of Science and Technology

Xiujun Li
Changchun University of Science and Technology

Chunlin Li
Capital Medical University

Yan Wu
Changchun University of Science and Technology