Publication


Featured research published by Jon Touryan.


Frontiers in Neuroscience | 2014

Estimating endogenous changes in task performance from EEG

Jon Touryan; Gregory Apker; Brent J. Lance; Scott E. Kerick; Anthony J. Ries; Kaleb McDowell

Brain wave activity is known to correlate with decrements in behavioral performance as individuals enter states of fatigue, boredom, or low alertness. Many BCI technologies are adversely affected by these changes in user state, limiting their application and constraining their use to relatively short temporal epochs where behavioral performance is likely to be stable. Incorporating a passive BCI that detects when the user is performing poorly at a primary task and adapts accordingly may prove to increase overall user performance. Here, we explore the potential for extending an established method to generate continuous estimates of behavioral performance from ongoing neural activity; evaluating the extended method by applying it to the original task domain, simulated driving; and generalizing the method by applying it to a BCI-relevant perceptual discrimination task. Specifically, we used EEG log power spectra and sequential forward floating selection (SFFS) to estimate endogenous changes in behavior in both a simulated driving task and a perceptual discrimination task. For the driving task, the average correlation coefficient between the actual and estimated lane deviation was 0.37 ± 0.22 (μ ± σ). For the perceptual discrimination task, we generated estimates of accuracy, reaction time, and button press duration for each participant. The correlation coefficients between the actual and estimated behavior were similar for these three metrics (accuracy = 0.25 ± 0.37, reaction time = 0.33 ± 0.23, button press duration = 0.36 ± 0.30). These findings illustrate the potential for modeling time-on-task decrements in performance from concurrent measures of neural activity.
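The feature-selection step named above, sequential forward floating selection (SFFS), greedily adds the best feature and then conditionally prunes. A minimal sketch, not the authors' implementation: it is generic over a caller-supplied subset-scoring function, and the toy score and "informative" feature indices are purely hypothetical.

```python
def sffs(n_features, score, k):
    """Sequential forward floating selection: add the feature that most
    improves score(subset), then drop features while any removal strictly
    improves the score. Assumes a well-behaved score function."""
    selected = []
    while len(selected) < k:
        # Forward step: add the single best remaining feature.
        remaining = [f for f in range(n_features) if f not in selected]
        selected.append(max(remaining, key=lambda f: score(selected + [f])))
        # Floating (backward) step: prune while it strictly helps.
        improved = True
        while improved and len(selected) > 1:
            improved = False
            for f in list(selected):
                trial = [g for g in selected if g != f]
                if score(trial) > score(selected):
                    selected, improved = trial, True
                    break
    return selected

# Toy example: features 0 and 2 are the (hypothetical) informative ones.
informative = {0, 2}
toy_score = lambda s: len(set(s) & informative) - 0.1 * len(set(s) - informative)
print(sorted(sffs(4, toy_score, 2)))
```

In the study the score would instead measure how well a regression on the selected log-power features tracks behavior.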


Journal of Cognitive Neuroscience | 2014

Isolating the neural mechanisms of interference during continuous multisensory dual-task performance

Ryan W. Kasper; Hubert Cecotti; Jon Touryan; Miguel P. Eckstein; Barry Giesbrecht

The need to engage in multiple tasks simultaneously is often encountered in everyday experience, but coordinating between two or more tasks can lead to impaired performance. Typical investigations of multitasking impairments have focused on the performance of two tasks presented in close temporal proximity on discrete trials; however, such paradigms do not match well with the continuous performance situations more typically encountered outside the laboratory. As a result, the stages of information processing that are affected during multisensory continuous dual tasks and how these changes in processing relate to behavior remain unclear. To address these issues, participants were presented simultaneous rapid visual and auditory stimulus sequences under three conditions: attend visual only, attend auditory only, and dual attention (attend both visual and auditory). Performance, measured in terms of response time and perceptual sensitivity (d′), revealed dual-task impairments only in the auditory task. Neural activity, measured by the ERP technique, revealed that both early stage sensory processing and later cognitive processing of the auditory task were affected by dual-task performance, but similar stages of processing of the visual task were not. Critically, individual differences in neural activity at both early and late stages of information processing accurately rank-ordered individuals based on the observed difference in behavioral performance between the single and dual attention conditions. These results reveal relationships between behavioral performance and the neural correlates of both early and late stage information processing that provide key insights into the complex interplay between the brain and behavior when multiple tasks are performed continuously.
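Perceptual sensitivity d′, the behavioral measure reported above, comes from standard signal detection theory: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch (textbook formula, not code from the study), using the log-linear (+0.5) correction as an assumed way to keep extreme rates off 0 and 1:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate), with the
    log-linear (+0.5) correction so rates never hit exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: 45 hits / 5 misses, 5 false alarms / 45 correct rejections.
print(round(d_prime(45, 5, 5, 45), 2))
```

When hit and false-alarm rates are equal, d′ is zero: the observer cannot distinguish signal from noise.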


international conference on augmented cognition | 2013

Translation of EEG-Based Performance Prediction Models to Rapid Serial Visual Presentation Tasks

Jon Touryan; Gregory Apker; Scott E. Kerick; Brent J. Lance; Anthony J. Ries; Kaleb McDowell

Brain wave activity is known to correlate with decrements in behavior brought on by fatigue, boredom or low levels of alertness. Being able to predict these behavioral changes from the neural activity via electroencephalography (EEG) is an area of ongoing interest. In this study we used an established approach to predict time-on-task decrements in behavior for both a realistic driving simulator and a difficult perceptual discrimination task, utilized in many brain-computer interface applications. The goal was to quantify how well EEG-based models of behavior, developed for a driving paradigm, extend to this non-driving task. Similar to previous studies, we were able to predict time-on-task behavioral effects from the EEG power spectrum for a number of participants in both the driving and perception tasks.


PLOS ONE | 2016

The Impact of Task Demands on Fixation-Related Brain Potentials during Guided Search

Anthony J. Ries; Jon Touryan; Barry Ahrens; Patrick J. Connolly

Recording synchronous data from EEG and eye-tracking provides a unique methodological approach for measuring the sensory and cognitive processes of overt visual search. Using this approach we obtained fixation-related potentials (FRPs) during a guided visual search task, specifically focusing on the lambda and P3 components. An outstanding question is whether the lambda and P3 FRP components are influenced by concurrent task demands. We addressed this question by obtaining simultaneous eye-movement and electroencephalographic (EEG) measures during a guided visual search task while parametrically modulating working memory load using an auditory N-back task. Participants performed the guided search task alone, while ignoring binaurally presented digits, or while using the auditory information in a 0, 1, or 2-back task. The results showed increased reaction time and decreased accuracy in both the visual search and N-back tasks as a function of auditory load. Moreover, high auditory task demands increased the P3 but not the lambda latency, while the amplitude of both lambda and P3 was reduced during high auditory task demands. The results show that both early and late stages of visual processing indexed by FRPs are significantly affected by concurrent task demands imposed by auditory working memory.
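The auditory N-back task used above to modulate working memory load requires responding when the current digit matches the one heard N items earlier. A minimal target-labeling sketch; note the 0-back convention here (match the first item) is an assumption, since studies often designate a fixed target digit instead:

```python
def n_back_targets(stream, n):
    """Mark each position in the stimulus stream as a target (True) when
    the item matches the one presented n steps earlier. For n == 0 the
    first item serves as the fixed reference (an assumed convention)."""
    if n == 0:
        ref = stream[0]
        return [item == ref for item in stream]
    return [i >= n and stream[i] == stream[i - n] for i in range(len(stream))]

# Hypothetical digit stream: positions 2 and 5 are 2-back targets.
print(n_back_targets([3, 1, 3, 3, 1, 3], 2))
```

Raising N increases how many items must be held and updated in memory, which is what drives the load manipulation.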


Biological Psychology | 2016

Common EEG features for behavioral estimation in disparate, real-world tasks

Jon Touryan; Brent J. Lance; Scott E. Kerick; Anthony J. Ries; Kaleb McDowell

In this study we explored the potential for capturing the behavioral dynamics observed in real-world tasks from concurrent measures of EEG. In doing so, we sought to develop models of behavior that would enable the identification of common cross-participant and cross-task EEG features. To accomplish this we had participants perform both simulated driving and guard duty tasks while we recorded their EEG. For each participant we developed models to estimate their behavioral performance during both tasks. Sequential forward floating selection was used to identify the montage of independent components for each model. Linear regression was then used on the combined power spectra from these independent components to generate a continuous estimate of behavior. Our results show that oscillatory processes, evidenced in EEG, can be used to successfully capture slow fluctuations in behavior in complex, multi-faceted tasks. The average correlation coefficients between the actual and estimated behavior were 0.548 ± 0.117 and 0.701 ± 0.154 for the driving and guard duty tasks, respectively. Interestingly, through a simple clustering approach we were able to identify a number of common components, both neural and eye-movement related, across participants and tasks. We used these component clusters to quantify the relative influence of common versus participant-specific features in the models of behavior. These findings illustrate the potential for estimating complex behavioral dynamics from concurrent measures of EEG using a finite library of universal features.
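The numbers reported above are Pearson correlations between actual and estimated behavior, where the estimate comes from linear regression on component power spectra. A sketch of that evaluation step for the single-predictor case only (the study's models were multivariate), with hypothetical series:

```python
from math import sqrt

def fit_line(x, y):
    """Least-squares slope and intercept for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: fit behavior from one feature, then correlate
# the actual and estimated series.
actual = [1.0, 3.0, 4.0, 7.0]
feature = [0.0, 1.0, 2.0, 3.0]
slope, intercept = fit_line(feature, actual)
estimated = [slope * f + intercept for f in feature]
print(round(pearson_r(actual, estimated), 3))
```

With many spectral predictors the fit generalizes to multiple regression, but the final actual-versus-estimated correlation is computed the same way.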


international conference on augmented cognition | 2013

Integration of Automated Neural Processing into an Army-Relevant Multitasking Simulation Environment

Jon Touryan; Anthony J. Ries; Paul Weber; Laurie Gibson

Brain-computer interface technology has experienced a rapid evolution over recent years. Recent studies have demonstrated the feasibility of detecting the presence or absence of targets in visual imagery from the neural response alone. Classification accuracy persists even when the imagery is presented rapidly. While this capability offers significant promise for applications that require humans to process large volumes of imagery, it remains unclear how well this approach will translate to more real-world scenarios. To explore the viability of automated neural processing in an Army-relevant operational context, we designed and built a simulation environment based on a ground vehicle crewstation. Here, we describe the process of integrating and testing the automated neural processing capability within this simulation environment. Our results indicate the potential for significant benefits to be realized by incorporating brain-computer interface technology into future Army systems.


International Journal of Psychophysiology | 2018

The fixation-related lambda response: Effects of saccade magnitude, spatial frequency, and ocular artifact removal

Anthony J. Ries; David Slayback; Jon Touryan

Fixation-related potentials (FRPs) enable examination of electrophysiological signatures of visual perception under naturalistic conditions, providing a neural snapshot of the fixated scene. The most prominent FRP component, commonly referred to as the lambda response, is a large deflection over occipital electrodes (O1, Oz, O2) peaking 80-100 ms post-fixation, reflecting afferent input to visual cortex. The lambda response is affected by bottom-up stimulus features and the size of the preceding saccade; however, prior research has not adequately controlled for these influences in free viewing paradigms. The current experiment (N = 16, 1 female) addresses these concerns by systematically manipulating spatial frequency in a free-viewing task requiring a range of saccade sizes. Given the close temporal proximity between saccade-related activity and the onset of the lambda response, we evaluate how removing independent components (IC) associated with ocular motion artifacts affects lambda response amplitude. Our results indicate that removing ocular artifact ICs based on the covariance with gaze position did not significantly affect the amplitude of this occipital potential. Moreover, the results showed that spatial frequency and saccade magnitude each produce significant effects on lambda amplitude, where amplitude decreased with increasing spatial frequency and increased as a function of saccade size for small and medium-sized saccades. The amplitude differences between spatial frequencies were maintained across all saccade magnitudes, suggesting these effects are produced from distinctly different and uncorrelated mechanisms. The current results will inform future analyses of the lambda potential in natural scenes where saccade magnitudes and spatial frequencies ultimately vary.
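An FRP such as the lambda response is obtained by averaging EEG epochs time-locked to fixation onset. A minimal single-channel sketch of that averaging step; the sampling rate and epoch window here are placeholders, not the study's parameters:

```python
def fixation_related_potential(eeg, fixation_onsets, fs, tmin=-0.1, tmax=0.3):
    """Average single-channel EEG epochs time-locked to fixation onsets
    (sample indices). Each epoch is baseline-corrected by its mean
    pre-fixation value; the window spans tmin..tmax seconds."""
    pre = round(-tmin * fs)   # samples before fixation onset
    post = round(tmax * fs)   # samples after fixation onset
    epochs = []
    for onset in fixation_onsets:
        if onset - pre < 0 or onset + post > len(eeg):
            continue  # skip epochs that run off the recording
        ep = eeg[onset - pre: onset + post]
        baseline = sum(ep[:pre]) / pre
        epochs.append([s - baseline for s in ep])
    if not epochs:
        return []
    # Point-by-point average across epochs.
    return [sum(col) / len(epochs) for col in zip(*epochs)]

# Toy signal at a (hypothetical) 10 Hz rate, fixations at samples 2 and 4.
print(fixation_related_potential(list(range(8)), [2, 4], fs=10))
```

In practice this is done per channel, after artifact handling such as the IC removal evaluated in the study.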


Frontiers in Human Neuroscience | 2017

Isolating Discriminant Neural Activity in the Presence of Eye Movements and Concurrent Task Demands

Jon Touryan; Vernon J. Lawhern; Patrick M. Connolly; Nima Bigdely-Shamlo; Anthony J. Ries

A growing number of studies use the combination of eye-tracking and electroencephalographic (EEG) measures to explore the neural processes that underlie visual perception. In these studies, fixation-related potentials (FRPs) are commonly used to quantify early and late stages of visual processing that follow the onset of each fixation. However, FRPs reflect a mixture of bottom-up (sensory-driven) and top-down (goal-directed) processes, in addition to eye movement artifacts and unrelated neural activity. At present there is little consensus on how to separate this evoked response into its constituent elements. In this study we sought to isolate the neural sources of target detection in the presence of eye movements and over a range of concurrent task demands. Here, participants were asked to identify visual targets (Ts) amongst a grid of distractor stimuli (Ls), while simultaneously performing an auditory N-back task. To identify the discriminant activity, we used independent components analysis (ICA) for the separation of EEG into neural and non-neural sources. We then further separated the neural sources, using a modified measure-projection approach, into six regions of interest (ROIs): occipital, fusiform, temporal, parietal, cingulate, and frontal cortices. Using activity from these ROIs, we identified target from non-target fixations in all participants at a level similar to other state-of-the-art classification techniques. Importantly, we isolated the time course and spectral features of this discriminant activity in each ROI. In addition, we were able to quantify the effect of cognitive load on both fixation-locked potential and classification performance across regions. Together, our results show the utility of a measure-projection approach for separating task-relevant neural activity into meaningful ROIs within more complex contexts that include eye movements.


2016 IEEE Second Workshop on Eye Tracking and Visualization (ETVIS) | 2016

The use of eye metrics to index cognitive workload in video games

Rohit Mallick; David Slayback; Jon Touryan; Anthony J. Ries; Brent J. Lance

Eye tracking metrics may provide unobtrusive measures of cognitive states such as workload and fatigue and can serve as useful inputs into future human computer interface technologies. To further explore the usefulness of eye tracking for the estimation of cognitive state, the current experiment evaluated saccade, fixation, and pupil-based measures to identify which metrics reliably indexed cognitive workload in a dynamic, unconstrained task (Tetris). In line with previous studies, our results show that some eye movement features are correlated with changes in workload, manipulated here via task difficulty. Among these were blink duration, saccade velocity, and tonic pupil dilation.


Journal of Vision | 2015

Eye movement correlates of behavioral performance in a simulated guard duty task

Jon Touryan; Anthony J. Ries

Eye movement patterns, including the pupillary response, have been shown to correlate with cognitive states such as mental workload or time-on-task fatigue. Likewise, these signals have been shown to directly correlate with behavior in both simple and complex tasks. In this study we explored the link between eye movement patterns and performance in a simulated guard duty task. Here, subjects viewed a sequence of photographs of individuals along with their corresponding identification card (ID) to determine the validity of each ID. The number of IDs awaiting verification randomly fluctuated during the task and was visually represented to the subject in the form of a dynamic queue to vary task load. Behavioral performance, EEG, gaze position and pupil diameter were measured continuously throughout the duration of the task. Using an established saccade detection algorithm, we were able to generate a number of features from the eye movement data. Through regression analysis, we found that both the length of the queue and reaction time were significantly correlated with eye movement features, including pupil diameter, saccade and blink frequency. In a similar fashion, we used Independent Component Analysis (ICA), combined with linear regression, to identify the EEG features most predictive of behavioral performance. Interestingly, a number of the most predictive EEG features were also eye movement related. In line with previous studies, our results demonstrate that a significant amount of task-relevant information can be extracted from patterns of eye movements. Meeting abstract presented at VSS 2015.
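The abstract above does not name its "established saccade detection algorithm"; one common family is velocity-threshold identification (I-VT), which flags samples whose eye velocity exceeds a cutoff. A minimal 1-D sketch under that assumption, with hypothetical sampling rate and threshold:

```python
def detect_saccades(gaze, fs=500.0, vel_thresh=100.0):
    """Velocity-threshold (I-VT style) saccade detection on 1-D gaze
    position in degrees, sampled at fs Hz. Consecutive samples whose
    point-to-point velocity (deg/s) exceeds vel_thresh are grouped into
    saccade intervals; returns (start, end) sample pairs, end exclusive."""
    vel = [abs(gaze[i + 1] - gaze[i]) * fs for i in range(len(gaze) - 1)]
    saccades, start = [], None
    for i, v in enumerate(vel):
        if v > vel_thresh and start is None:
            start = i                      # saccade begins
        elif v <= vel_thresh and start is not None:
            saccades.append((start, i + 1))  # saccade ends
            start = None
    if start is not None:                  # saccade still open at end of data
        saccades.append((start, len(gaze)))
    return saccades

# Hypothetical trace: fixation, a fast 4-degree shift, fixation again.
print(detect_saccades([0.0, 0.0, 0.0, 2.0, 4.0, 4.0, 4.0]))
```

Real pipelines typically add 2-D velocity, smoothing, and minimum-duration criteria, and derive features like saccade frequency and blink rate from the detected events.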

Collaboration


Dive into Jon Touryan's collaborations.

Top Co-Authors

Patrick J. Connolly (United States Geological Survey)
Laurie Gibson (Science Applications International Corporation)
Chun-Hsiang Chuang (National Chiao Tung University)
Shao-Wei Lu (National Chiao Tung University)
Hubert Cecotti (University of California)