Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where George A. Geri is active.

Publication


Featured research published by George A. Geri.


SID Symposium Digest of Technical Papers | 2007

P-39: Perceptual Tests of the Temporal Response of a Shuttered LCoS Projector

Marc Winterbottom; George A. Geri; Craig Eidman; Byron J. Pierce

Perceptual motion blur was studied using imagery presented on an LCoS projector equipped with a mechanical shutter to reduce pixel hold-time. Perceptual measures of image blur were obtained with a simple test stimulus, as well as with imagery similar to that used in Air Force flight simulation and training. Measured pixel hold-time was found to accurately predict perceived blur.


The Visual Computer | 1990

Computer image generation for flight simulators: the Gabor approach

Yehoshua Y. Zeevi; Moshe Porat; George A. Geri

A formalism for image representation in the combined frequency-position space is presented using the generalized Gabor approach. This approach uses elementary functions to which the human visual system is particularly sensitive and which are efficient for the analysis and synthesis of visual imagery. The formalism is also compatible with the implementation of a variable-resolution system wherein image information is nonuniformly distributed across the visual field in accordance with the human visual system's ability to process it. When used with a gaze-slaved visual display system, imagery generated using the techniques described here affords a combination of high resolution and wide field-of-view. This combination is particularly important in high-fidelity, computer-generated visual environments as required, for instance, in flight simulators.
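The Gabor elementary functions this abstract refers to can be illustrated with a minimal sketch: a 2-D sinusoidal carrier under a Gaussian envelope, onto which an image patch is projected to obtain one analysis coefficient. The parameter names and values below are illustrative only and are not taken from the paper.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, phase=0.0):
    """2-D Gabor elementary function: a cosine carrier under a Gaussian envelope.

    size: kernel width/height in pixels (odd, so the peak sits at the center)
    wavelength: carrier period in pixels; theta: carrier orientation in radians
    sigma: standard deviation of the Gaussian envelope in pixels
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates so the carrier oscillates along direction theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength + phase)
    return envelope * carrier

# Analysis step: project an image patch onto one elementary function.
patch = np.random.default_rng(0).standard_normal((33, 33))
k = gabor_kernel(33, wavelength=8.0, theta=0.0, sigma=5.0)
coefficient = float(np.sum(patch * k))
```

A full Gabor representation would use a bank of such kernels at several positions, orientations, and scales; a variable-resolution scheme like the one described would allocate more kernels near the gaze point than in the periphery.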


SID Symposium Digest of Technical Papers | 2006

P‐76: Perceptual Tests of the Temporal Properties of a Shuttered LCD Projector

Marc Winterbottom; George A. Geri; Bill Morgan; Craig Eidman; James Gaska; Byron J. Pierce

Perceptual motion blur was studied in imagery presented on an LCD projector equipped with a mechanical shutter to reduce pixel hold-time. Perceptual measures of image blur were obtained with both a simple test stimulus and real-world imagery. Both were found to correlate well with the measured pixel hold-time.


SID Symposium Digest of Technical Papers | 2008

59.5L: Late‐News Paper: Evaluation of a Prototype Grating‐Light‐Valve Laser Projector for Flight Simulation Applications

Marc Winterbottom; James Gaska; George A. Geri; Barbara T. Sweet

An evaluation of a prototype grating light valve laser projector indicates it has properties well-suited to flight-simulation applications. Full-field luminance and contrast, spatial resolution, temporal resolution, and color stability were equal to or better than those of CRT projectors typically used in flight-simulator applications. In addition, this projector is capable of providing refresh rates greater than 60 Hz. The higher refresh rates eliminate perceived flicker, and greatly reduce (120 Hz) or eliminate (240 Hz) motion artifacts over the range of target speeds tested.


SID Symposium Digest of Technical Papers | 2005

P-33: Identification of Simulated Targets as a Function of Target and Background Blur

George A. Geri; Shama C. Akhtar; Jennifer Winner; Byron J. Pierce

Simulated imagery was used to determine the effect of target and background blur on target identification performance. An interaction between these two variables was found, indicating that greater image detail may not improve performance, and hence may not be required in applications for which high-resolution databases are not readily available.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1998

The Implications of Image Collimation for Flight Simulator Training

Byron J. Pierce; George A. Geri

There is some question as to whether non-collimated (i.e., real) imagery viewed at one meter or less provides sufficiently realistic visual cues to support out-the-window flight simulator training. As a first step toward answering this question, we have obtained perceived size and velocity estimates using both simple stimuli in a controlled laboratory setting and full simulator imagery in an apparatus consisting of optically combined collimated and real-image displays. In the size study it was found that real imagery appeared 15-30% smaller than collimated imagery. In the velocity studies, the laboratory data showed that the perceived velocity of real imagery was less than that of collimated imagery. No perceived velocity effects were found with the simulator imagery. Results support the position that for training tasks requiring accurate perception of spatial and temporal aspects of the simulated visual environment, misperceptions of size, but not velocity, need to be considered when real-image displays are used.


SID Symposium Digest of Technical Papers | 2003

P-23: Effect of Display Line Rate and Antialiasing on the Recognition of Aircraft Aspect Angle

Marc Winterbottom; George A. Geri; Byron J. Pierce

Increasing display line rate did not improve aspect-angle recognition performance beyond the level predicted by measured display resolution. Image antialiasing improved performance even though it did not increase the measured spatial resolution. Finally, the threshold for aspect-angle recognition was found to be consistent with that obtained for other visual tasks dependent on target spatial detail.


Journal of The Optical Society of America A-optics Image Science and Vision | 2008

Oculomotor contribution to the change in perceived speed with viewing distance

George A. Geri; Byron J. Pierce; Robert Patterson

An array of moving circular stimuli was used to determine whether perceived speed is affected by the oculomotor responses associated with changes in viewing distance. The perceived speed of stimuli viewed at either 0.33 or 1.33 m was compared to the perceived speed of a similar stimulus viewed at a distance of 5.5 m. In addition, a control condition was run in which changes in perceived speed were compared for monocular viewing of the 0.33 m and 5.5 m stimuli. In the binocular condition, there were statistically significant decreases in perceived speed of about 11% for the 0.33 m viewing distance, and about 6.5% for the 1.33 m viewing distance. There was no significant decrease in perceived speed in the monocular condition. This latter finding, along with the similar appearance of the near and far stimuli in the monocular condition, suggests that ocular vergence (as opposed to accommodation or vergence-accommodation) was the primary determinant of the change in perceived speed with changes in binocular viewing distance. Although the change in perceived speed with fixation distance was relatively small, the data from all observers were in the direction of speed constancy. Thus, to the extent that vergence is a cue to egocentric distance, the present data suggest that egocentric distance is used to scale the perceived speed of targets moving at different distances from the observer.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2001

Low-Altitude Flight Performance as a Measure of Flight Simulator Fidelity

Marc Winterbottom; George A. Geri; Byron J. Pierce; Nichole M. Harris

Low-altitude flight performance was used to evaluate the effectiveness of the texture density cues used in high fidelity flight simulators. Observers were asked to maintain a constant above ground level (AGL) altitude over textured terrain whose elevation was varied. All combinations of two texture densities (0.13 and 0.43 elements/meter) and three airspeeds (50, 150, and 300 m/sec) were tested. Observers maintained a higher AGL altitude as speed increased, suggesting that higher optical flow rates interfered with the ability to maintain the target AGL altitude. However, when a single airspeed was used in a given block of trials, the effect of airspeed was eliminated, indicating that the original airspeed effect was due to perceptual averaging of the optical flow rate at each airspeed tested. Texture density was a significant factor only in the blocked-airspeed condition, suggesting that the blocking procedure eliminated measurement error that initially obscured the effect of this factor on altitude maintenance.


Proceedings of SPIE | 2016

A software module for implementing auditory and visual feedback on a video-based eye tracking system

Bharat Rosanlall; Izidor Gertner; George A. Geri; Karl F. Arrington

We describe here the design and implementation of a software module that provides both auditory and visual feedback of the eye position measured by a commercially available eye tracking system. The present audio-visual feedback module (AVFM) serves as an extension to the Arrington Research ViewPoint EyeTracker, but it can be easily modified for use with other similar systems. Two modes of audio feedback and one mode of visual feedback are provided in reference to a circular area-of-interest (AOI). Auditory feedback can be either a click tone emitted when the user’s gaze point enters or leaves the AOI, or a sinusoidal waveform with frequency inversely proportional to the distance from the gaze point to the center of the AOI. Visual feedback is in the form of a small circular light patch that is presented whenever the gaze-point is within the AOI. The AVFM processes data that are sent to a dynamic-link library by the EyeTracker. The AVFM’s multithreaded implementation also allows real-time data collection (1 kHz sampling rate) and graphics processing that allow display of the current/past gaze-points as well as the AOI. The feedback provided by the AVFM described here has applications in military target acquisition and personnel training, as well as in visual experimentation, clinical research, marketing research, and sports training.
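The feedback mapping described above can be sketched as a single update step: a visual patch is shown whenever the gaze point lies inside the circular AOI, and a tone frequency is computed inversely proportional to the gaze-to-center distance. The function names, the proportionality constant, and the frequency cap below are assumptions for illustration, not the AVFM's actual API.

```python
import math

def aoi_feedback(gaze, aoi_center, aoi_radius, k=50_000.0, f_max=2000.0):
    """One update step of the feedback loop (illustrative mapping).

    gaze, aoi_center: (x, y) screen coordinates in pixels; aoi_radius in pixels.
    Returns whether the visual patch should be shown and the tone frequency in Hz.
    """
    dist = math.hypot(gaze[0] - aoi_center[0], gaze[1] - aoi_center[1])
    # Visual feedback: a light patch whenever the gaze point is inside the AOI.
    show_patch = dist <= aoi_radius
    # Auditory feedback: sinusoid with frequency inversely proportional to the
    # gaze-to-center distance, capped so it stays in an audible band.
    tone_hz = min(f_max, k / max(dist, 1.0))
    return show_patch, tone_hz

def entered_or_left(prev_inside, inside):
    """Click-tone trigger for the other audio mode: fire on AOI boundary crossings."""
    return prev_inside != inside
```

In a real system this step would run once per eye-tracker sample (1 kHz in the setup described), with the tone frequency fed to an audio synthesizer and the patch drawn by the graphics thread.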

Collaboration


Dive into George A. Geri's collaborations.

Top Co-Authors

Byron J. Pierce, Air Force Research Laboratory
Marc Winterbottom, Air Force Research Laboratory
James Gaska, Air Force Research Laboratory
Yehoshua Y. Zeevi, Technion – Israel Institute of Technology
Craig Eidman, Air Force Research Laboratory
Izidor Gertner, City University of New York
Robert Patterson, Air Force Research Laboratory
Izidor C. Gertner, University of Dayton Research Institute
Paul A. Wetzel, University of Illinois at Chicago
William Morgan, Air Force Research Laboratory