Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Andrew T. Duchowski is active.

Publication


Featured research published by Andrew T. Duchowski.


Behavior Research Methods, Instruments, & Computers | 2002

A breadth-first survey of eye-tracking applications

Andrew T. Duchowski

Eye-tracking applications are surveyed in a breadth-first manner, reporting on work from the following domains: neuroscience, psychology, industrial engineering and human factors, marketing/advertising, and computer science. Following a review of traditionally diagnostic uses, emphasis is placed on interactive applications, differentiating between selective and gaze-contingent approaches.


Applied Ergonomics | 2002

Using virtual reality technology for aircraft visual inspection training: presence and comparison studies

Jeenal Vora; Santosh Nair; Anand K. Gramopadhye; Andrew T. Duchowski; Brian J. Melloy; Barbara G. Kanki

The aircraft maintenance industry is a complex system consisting of several interrelated human and machine components. Recognizing this, the Federal Aviation Administration (FAA) has pursued human factors research; in the maintenance arena, this research has focused on the aircraft inspection process and the aircraft inspector. Training has been identified as the primary intervention strategy for improving the quality and reliability of aircraft inspection, and if training is to succeed, it is critical that aircraft inspectors be provided with appropriate training tools and environments. In response to this need, the paper outlines the development of a virtual reality (VR) system for aircraft inspection training. VR has generated much excitement but little formal proof of its usefulness, and since VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. To address this issue, this research measured the degree of immersion and presence felt by subjects in a virtual environment simulator. Specifically, it conducted two controlled studies using the VR system developed for the visual inspection task of an aft cargo bay at the VR Lab of Clemson University. Beyond assembling the visual inspection virtual environment, a significant goal of this project was to explore how subjective presence affects task performance. The results indicated that the system scored highly on measures of the presence felt by subjects. As a next logical step, the study then compared VR to an existing PC-based aircraft inspection simulator. The results showed that the VR system performed better than, and was preferred over, the PC-based training tool.


Eye Tracking Research & Applications | 2008

Longitudinal evaluation of discrete consecutive gaze gestures for text entry

Jacob O. Wobbrock; James S. Rubinstein; Michael W. Sawyer; Andrew T. Duchowski

Eye-typing performance results are reported from controlled studies comparing an on-screen keyboard and EyeWrite, a new on-screen gestural input alternative. Results from the first pilot study suggest a learning curve that novice users must overcome to gain proficiency with EyeWrite (requiring practice with its letter-like gestural alphabet). Results from the second, longitudinal study indicate that EyeWrite's inherent multi-saccade handicap (4.52 saccades per character, frequency-weighted average) is sufficient for the on-screen keyboard to edge out EyeWrite in speed. Eye-typing speeds with EyeWrite approach 5 wpm on average (8 wpm attainable by proficient users), whereas keyboard users achieve about 7 wpm on average (in line with previous results). However, EyeWrite users leave significantly fewer uncorrected errors in the final text, with no significant difference in the number of errors corrected during entry, indicating a speed-accuracy trade-off. Subjective results indicate that participants consider EyeWrite significantly faster, easier to use, and less likely to cause ocular fatigue than the on-screen keyboard. In addition, EyeWrite consumes much less screen real estate than an on-screen keyboard, giving it practical advantages for eye-based text entry.
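
For readers unfamiliar with the metrics above, the sketch below shows how eye-typing speed and a frequency-weighted saccade count are conventionally computed. The formulas follow standard text-entry evaluation practice; the example values and helper names are assumptions for illustration, not code or data from the paper.

```python
# Hypothetical sketch of standard eye-typing metrics (assumptions,
# not the paper's own analysis code).
def words_per_minute(transcribed: str, seconds: float) -> float:
    """WPM = ((|T| - 1) / seconds) * 60 / 5, with a 'word' defined as
    five characters including spaces (the usual text-entry convention)."""
    return ((len(transcribed) - 1) / seconds) * (60.0 / 5.0)

def weighted_saccades_per_char(saccades: dict[str, int],
                               freq: dict[str, float]) -> float:
    """Frequency-weighted average: each letter's saccade count is
    weighted by its relative frequency in the target language."""
    return sum(freq[c] * saccades[c] for c in saccades)

# Example: 24 characters entered in one minute is roughly 4.6 wpm,
# in the neighborhood of the ~5 wpm reported for EyeWrite.
print(words_per_minute("the quick brown fox jump", 60.0))
```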


Communications of the ACM | 2003

Focusing on the essential: considering attention in display design

Patrick Baudisch; Douglas DeCarlo; Andrew T. Duchowski; Wilson S. Geisler

Attentive displays address the mismatch between available rendering power and computer display resolution by concentrating detail where the user attends. The five examples presented here illustrate a common goal with very different approaches to achieving it.


ACM Transactions on Multimedia Computing, Communications, and Applications | 2007

Foveated gaze-contingent displays for peripheral LOD management, 3D visualization, and stereo imaging

Andrew T. Duchowski; Arzu Çöltekin

Advancements in graphics hardware have allowed the development of hardware-accelerated imaging displays. This article reviews techniques for real-time simulation of arbitrary visual fields over still images and video. The goal is to provide the vision science and perceptual graphics communities with techniques for investigating fundamental processes of visual perception. Classic gaze-contingent displays used for these purposes are reviewed, and, for the first time, a pixel shader is introduced for display of a high-resolution window over a peripherally degraded stimulus. The pixel shader advances the current state of the art by allowing real-time processing of still or streamed images, obviating the need for preprocessing or storage.
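
The shader itself is not reproduced in the abstract; as a rough CPU-side illustration of the underlying idea, the following sketch blends a sharp image with a blurred copy according to distance from the gaze point. The window radius, blur strength, and falloff are assumptions, not the article's parameters.

```python
# A minimal software analogue of a gaze-contingent display: sharp
# foveal window, blurred periphery. All parameter values are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

def foveate(image: np.ndarray, gaze_xy: tuple[float, float],
            fovea_radius: float = 60.0, sigma: float = 4.0) -> np.ndarray:
    """image: H x W grayscale float array; gaze_xy: (x, y) in pixels."""
    h, w = image.shape
    blurred = gaussian_filter(image, sigma=sigma)      # peripheral stimulus
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])  # distance from gaze
    # Smooth falloff: 1 inside the foveal window, 0 far in the periphery.
    weight = np.clip(1.0 - (dist - fovea_radius) / fovea_radius, 0.0, 1.0)
    return weight * image + (1.0 - weight) * blurred

frame = np.random.rand(480, 640)
out = foveate(frame, gaze_xy=(320, 240))  # re-run per gaze sample
```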


International Conference on Biometrics: Theory, Applications and Systems | 2008

Adapting Starburst for Elliptical Iris Segmentation

Wayne J. Ryan; Damon L. Woodard; Andrew T. Duchowski; Stan Birchfield

Fitting an ellipse to the iris boundaries accounts for the projective distortions present in off-axis images of the eye and provides the contour fitting necessary for the dimensionless mapping used in leading iris recognition algorithms. Previous iris segmentation efforts have either focused on fitting circles to the pupillary and limbic boundaries or on assigning labels to image pixels. This paper approaches the iris segmentation problem by adapting the Starburst algorithm to locate pupillary and limbic feature pixels used to fit a pair of ellipses. The approach is evaluated by comparing the fits to ground truth. Two metrics are used in the evaluation: the first based on the algebraic distance between ellipses, the second based on ellipse chamfer images. Results are compared to segmentations produced by ND_IRIS over randomly selected images from the Iris Challenge Evaluation database. Statistical evidence shows significant improvement of Starburst's elliptical fits over the circular fits on which ND_IRIS relies.
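
For orientation, here is a minimal sketch of the Starburst idea as adapted in this setting: cast rays outward from a seed point inside the pupil, keep the strongest intensity gradient along each ray as a boundary feature, and fit an ellipse to the collected features. The ray count, search radius, and use of OpenCV's cv2.fitEllipse are assumptions, not the authors' implementation.

```python
# Hedged sketch of Starburst-style feature detection plus ellipse fitting.
import numpy as np
import cv2

def starburst_features(gray: np.ndarray, seed: tuple[int, int],
                       n_rays: int = 36, max_r: int = 120) -> np.ndarray:
    feats = []
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        dx, dy = np.cos(theta), np.sin(theta)
        rs = np.arange(1, max_r)                      # sample along the ray
        xs = (seed[0] + rs * dx).astype(int)
        ys = (seed[1] + rs * dy).astype(int)
        ok = (xs >= 0) & (xs < gray.shape[1]) & (ys >= 0) & (ys < gray.shape[0])
        vals = gray[ys[ok], xs[ok]].astype(float)
        if len(vals) < 2:
            continue
        i = int(np.argmax(np.abs(np.diff(vals))))     # strongest gradient
        feats.append((xs[ok][i], ys[ok][i]))
    return np.array(feats, dtype=np.float32)

# Fit an ellipse (center, axes, angle) to the boundary features;
# cv2.fitEllipse needs at least five points.
# pts = starburst_features(gray, seed=(cx, cy))
# (xc, yc), (major, minor), angle = cv2.fitEllipse(pts)
```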


Computer Vision and Image Understanding | 2005

Editorial: special issue: eye detection and tracking

Qiang Ji; Harry Wechsler; Andrew T. Duchowski; Myron Flickner

As one of the most salient features of the human face, the eyes play an important role in interpreting and understanding a person's desires, needs, and emotional states. Robust, nonintrusive eye detection and tracking are crucial for human-computer interaction, attentive user interfaces, and understanding a user's affective state. In addition, the unique geometric, photometric, and motion characteristics of the eyes provide important visual cues for face detection, face recognition, and facial expression understanding.

The special issue sought the latest original research focusing on real-time, nonintrusive eye detection and tracking, with minimal or no calibration and under natural head movement. Of the 23 submissions we received, after two rounds of rigorous review we accepted eight high-quality papers representing different areas of eye detection and tracking. The accepted papers can be grouped into three categories: literature review of the state of the art in eye and gaze detection and tracking, remote gaze tracking under natural head movement, and improved eye detection and tracking techniques.

For the literature review, Morimoto and Mimica [1] present an in-depth and quantitative review of the state of the art in remote eye gaze trackers, discussing the strengths and weaknesses of the alternatives available today. For real-time, nonintrusive gaze estimation and tracking under natural head movement, three papers are included in this issue. Yoo and Chun [2] propose a multi-camera, multi-light, nonintrusive, real-time system for estimating gaze under head movement; their system exploits the perspective invariance of the cross-ratio to efficiently and accurately estimate and track eye gaze under large head movement. Noureddin et al. [3] present another system for remote gaze tracking under natural head movement, consisting of two cameras (an eye camera and a face camera) and a mirror that rotates to keep the eye camera tracking the eyes in real time in the presence of head motion. Instead of using multiple cameras, Wang et al. [4] introduce a novel approach to measuring eye gaze with a monocular camera; the unique contribution of their work lies in remote eye detection and gaze estimation under minor head movement using only one camera, exploiting the geometric and perspective properties of the iris.
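
The cross-ratio invariance that Yoo and Chun's method exploits can be illustrated numerically: for four collinear points, the cross-ratio is preserved by any projective transformation, including the camera's perspective projection. The projective map and point values below are arbitrary assumptions chosen for illustration, not values from their paper.

```python
# Numeric illustration of cross-ratio invariance under a projective map.
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio of four collinear points given as scalars along the line."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

pts = np.array([0.0, 1.0, 3.0, 6.0])
# A projective (Moebius) map of the line: x -> (2x + 1) / (x + 4)
proj = (2 * pts + 1) / (pts + 4)
print(cross_ratio(*pts), cross_ratio(*proj))  # both 1.25, up to float error
```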


Behavior Research Methods, Instruments, & Computers | 2002

3-D eye movement analysis

Andrew T. Duchowski; Eric Medlin; Nathan Cournia; Hunter A. Murphy; Anand K. Gramopadhye; Santosh Nair; Jeenal Vora; Brian J. Melloy

This paper presents a novel three-dimensional (3-D) eye movement analysis algorithm for binocular eye tracking within virtual reality (VR). The user's gaze direction, head position, and orientation are tracked in order to allow recording of the user's fixations within the environment. Although the linear signal analysis approach is itself not new, its application to eye movement analysis in three dimensions advances traditional two-dimensional approaches, since it takes into account the six degrees of freedom of head movement and is resolution independent. Results indicate that the 3-D eye movement analysis algorithm can successfully be used for analysis of visual process measures in VR. Process measures not only can corroborate performance measures, but also can lead to discoveries of the reasons for performance improvements. In particular, analysis of users' eye movements in VR can potentially lead to further insights into the underlying cognitive processes of VR subjects.
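
One way to make the 3-D analysis concrete is a velocity-threshold classifier over gaze direction vectors rather than screen coordinates, which is head-pose aware and resolution independent as the abstract notes. The sketch below is a hedged illustration of that style of analysis, not the authors' algorithm; the threshold value is an assumption.

```python
# I-VT-style fixation detection on 3-D gaze direction vectors (a sketch).
import numpy as np

def fixation_mask(gaze_dirs: np.ndarray, timestamps: np.ndarray,
                  vel_threshold_deg: float = 130.0) -> np.ndarray:
    """gaze_dirs: N x 3 unit vectors in world coordinates (already composed
    with head position/orientation); timestamps in seconds.
    Returns a boolean mask, True where a sample belongs to a fixation."""
    v1, v2 = gaze_dirs[:-1], gaze_dirs[1:]
    cos = np.clip(np.sum(v1 * v2, axis=1), -1.0, 1.0)
    theta = np.degrees(np.arccos(cos))           # angle between samples
    velocity = theta / np.diff(timestamps)       # angular velocity, deg/s
    mask = velocity < vel_threshold_deg          # slow movement => fixation
    return np.concatenate([mask, mask[-1:]])     # pad back to N samples
```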


Eye Tracking Research & Applications | 2000

Binocular eye tracking in virtual reality for inspection training

Andrew T. Duchowski; Vinay Shivashankaraiah; Tim Rawls; Anand K. Gramopadhye; Brian J. Melloy; Barbara G. Kanki

This paper describes the development of a binocular eye tracking Virtual Reality system for aircraft inspection training. The aesthetic appearance of the environment is driven by standard graphical techniques augmented by realistic texture maps of the physical environment. A "virtual flashlight" is provided to simulate a tool used by inspectors. The user's gaze direction, head position, and orientation are tracked to allow recording of the user's gaze locations within the environment. These gaze locations, or scanpaths, are calculated as gaze/polygon intersections, enabling comparison of fixated points with stored locations of artificially generated defects located in the environment interior. Recorded scanpaths provide a means of comparing the performance of experts to that of novices, thereby gauging the effects of training.
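
Gaze/polygon intersection as described above reduces to intersecting the gaze ray with the triangles of the environment; a standard way to do this is the Moeller-Trumbore test, sketched below. This is an assumption about the general technique, not the paper's own code.

```python
# Standard Moeller-Trumbore ray/triangle intersection (hedged sketch).
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return distance t along the gaze ray to the triangle, or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                 # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv
    if u < 0.0 or u > 1.0:             # outside barycentric range
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv
    return t if t > eps else None      # intersection in front of the eye
```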


Virtual Reality Software and Technology | 2001

Binocular eye tracking in VR for visual inspection training

Andrew T. Duchowski; Eric Medlin; Anand K. Gramopadhye; Brian J. Melloy; Santosh Nair

This paper presents novel software techniques for binocular eye tracking within Virtual Reality and discusses their application to aircraft inspection training. The aesthetic appearance of the environment is driven by standard graphical techniques augmented by realistic texture maps of the physical environment. The user's gaze direction, head position, and orientation are tracked to allow recording of the user's fixations within the environment. Methods are given for (1) integration of the eye tracker into a Virtual Reality framework, (2) stereo calculation of the user's 3D gaze vector, (3) a new 3D calibration technique developed to estimate the user's inter-pupillary distance post-facto, and (4) a new technique for eye movement analysis in 3-space. The 3D eye movement analysis technique is an improvement over traditional 2D approaches since it takes into account the six degrees of freedom of head movement and is resolution independent. Results indicate that although the current signal analysis approach is somewhat noisy and tends to underestimate the number of identified fixations, recorded eye movements provide valuable human factors process measures complementing the performance statistics used to gauge training effectiveness.
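
Item (2), the stereo calculation of a 3D gaze point from the two eyes' gaze rays, can be sketched as the midpoint of the shortest segment between the rays (their vergence point). This is an assumed formulation for illustration; the paper's exact construction may differ.

```python
# Closest-point-between-rays sketch for stereo (binocular) gaze.
import numpy as np

def vergence_point(o_l, d_l, o_r, d_r):
    """o_l/o_r: 3-vector eye positions; d_l/d_r: unit gaze directions.
    Solves for the closest points on the two rays and averages them."""
    w = o_l - o_r
    a, b, c = np.dot(d_l, d_l), np.dot(d_l, d_r), np.dot(d_r, d_r)
    d, e = np.dot(d_l, w), np.dot(d_r, w)
    denom = a * c - b * b
    if abs(denom) < 1e-9:              # near-parallel rays (gaze at infinity)
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    return 0.5 * ((o_l + s * d_l) + (o_r + t * d_r))
```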

Collaboration


Dive into Andrew T. Duchowski's collaborations.

Top Co-Authors

Krzysztof Krejtz
University of Social Sciences and Humanities

Izabela Krejtz
University of Social Sciences and Humanities

Anna Niedzielska
University of Social Sciences and Humanities