J. Adam Jones
University of Southern California
Publications
Featured research published by J. Adam Jones.
acm symposium on applied perception | 2012
J. Adam Jones; Evan A. Suma; David M. Krum; Mark T. Bolas
As wider field-of-view displays become more common, the question arises as to whether or not data collected on these displays are comparable to those collected with smaller field-of-view displays. This document describes a pilot study that aimed to address these concerns by comparing medium-field distance judgments in a 60° FOV display, a 150° FOV display, and a simulated 60° FOV within the 150° FOV display. The results indicate that participants performed similarly in both the actual and simulated 60° FOV displays. On average, participants in the 150° FOV display improved distance judgments by 13% over the 60° FOV displays.
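The abstract does not detail how the 60° FOV was simulated inside the 150° display; one plausible approach is simply to mask out pixels that fall outside the target field of view. The sketch below assumes a centered rectilinear projection, and the function and parameter names are illustrative rather than taken from the paper.

```python
import numpy as np

def mask_to_fov(frame, display_fov_deg, target_fov_deg):
    """Black out pixels outside a smaller target FOV, simulating a restricted
    field of view on a wider display. Both FOV arguments are (horizontal,
    vertical) pairs in degrees; a centered rectilinear projection is assumed."""
    h, w = frame.shape[:2]
    # Fraction of the image plane subtended by the target FOV on each axis.
    fx = np.tan(np.radians(target_fov_deg[0] / 2)) / np.tan(np.radians(display_fov_deg[0] / 2))
    fy = np.tan(np.radians(target_fov_deg[1] / 2)) / np.tan(np.radians(display_fov_deg[1] / 2))
    # Pixel bounds of the visible window, centered on the optical axis.
    x0, x1 = int(w / 2 * (1 - fx)), int(round(w / 2 * (1 + fx)))
    y0, y1 = int(h / 2 * (1 - fy)), int(round(h / 2 * (1 + fy)))
    masked = np.zeros_like(frame)
    masked[y0:y1, x0:x1] = frame[y0:y1, x0:x1]
    return masked
```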
tests and proofs | 2017
J. Adam Jones; David M. Krum; Mark T. Bolas
In this article, we detail a series of experiments that examines the effect of vertical field-of-view extension and the addition of non-specific peripheral visual stimulation on gait characteristics and distance judgments in a head-worn virtual environment. Specifically, we examined four field-of-view configurations: a common 60° diagonal field of view (48° × 40°), a 60° diagonal field of view with the addition of a luminous white frame in the far periphery, a field of view with an extended upper edge, and a field of view with an extended lower edge. We found that extension of the field of view, either with spatially congruent or spatially non-informative visuals, resulted in improved distance judgments and changes in observed posture. However, these effects were not equal across all field-of-view configurations, suggesting that some configurations may be more appropriate than others when balancing performance, cost, and ergonomics.
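The abstract quotes the common 60° diagonal field of view as 48° × 40°. Under a pinhole (rectilinear) projection, the per-axis split follows from the diagonal FOV and the image-plane aspect ratio alone; the sketch below roughly reproduces those numbers, with the 48:40 aspect ratio taken as an assumption.

```python
import math

def fov_from_diagonal(diag_fov_deg, aspect_w_over_h):
    """Split a diagonal FOV into horizontal and vertical FOV for a
    rectilinear (pinhole) projection with the given width:height aspect."""
    half_diag = math.tan(math.radians(diag_fov_deg / 2))
    diag_len = math.hypot(aspect_w_over_h, 1.0)
    half_w = half_diag * aspect_w_over_h / diag_len   # half-width on the image plane
    half_h = half_diag / diag_len                     # half-height on the image plane
    return 2 * math.degrees(math.atan(half_w)), 2 * math.degrees(math.atan(half_h))

print(fov_from_diagonal(60, 48 / 40))  # approximately (47.8, 40.6) degrees
```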
ieee virtual reality conference | 2014
J. Adam Jones; Lauren Cairco Dukes; Mark T. Bolas
In this document we present a method for calibrating head-mounted displays and other display surfaces using an automated, low-cost camera system. A unique aspect of this method is that the calibration of geometric distortions, field of view, and chromatic aberration is achieved without the need for a priori knowledge of the display system's intrinsic parameters. This method operates by capturing and storing the pixel-space locations of a series of real-world control points. These control points are then used as ground-truth references from which virtual-space transformations can be automatically generated for a display system.
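The abstract does not reproduce the fitting step that turns the stored control-point locations into a correction. A minimal sketch, assuming the correction is modeled as a single planar homography fit with OpenCV (the coordinates below are hypothetical, and the actual method may use a richer warp model):

```python
import numpy as np
import cv2

# Hypothetical correspondences: where the camera saw each physical control
# point (ground truth) versus where the display currently renders it.
observed = np.array([[100, 100], [540,  96], [545, 380], [ 98, 385]], np.float32)
rendered = np.array([[112, 108], [530,  90], [538, 395], [105, 392]], np.float32)

# Fit a planar warp that maps rendered pixels onto the observed references;
# its inverse, applied as a pre-distortion, would correct the display geometry.
H, _ = cv2.findHomography(rendered, observed, 0)
print(H)
```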
collaboration technologies and systems | 2015
J. Adam Jones; Lauren Cairco Dukes; David M. Krum; Mark T. Bolas; Larry F. Hodges
A common technology used to present immersive, collaborative environments is the head-mounted virtual reality (VR) display. However, due to engineering limitations, variability in manufacturing, and person-to-person differences in eye position, virtual environments are often geometrically inaccurate. Correcting these inaccuracies typically requires complicated or interactive calibration procedures. In this document we present a method for calibrating head-mounted displays and other display surfaces using an automated, low-cost camera system. A unique aspect of this method is that the calibration of geometric distortions, field of view, and chromatic aberration is achieved without the need for a priori knowledge of the display system's intrinsic parameters. Since this calibration method can easily measure display distortions, we further extend our work to measure the effect of eye position on the apparent location of imagery presented in a virtual reality head-mounted display. We test a range of reasonable eye positions that may result from person-to-person variations in display placement and interpupillary distances. It was observed that the pattern of geometric distortions introduced by the display's optical system changes substantially as the eye moves from one position to the next. Though many commercial and research VR systems calibrate for interpupillary distance and optical distortions separately, this may be insufficient, as eye position influences distortion characteristics.
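One way to quantify how the distortion pattern changes with eye position is to compare the camera-measured control-point locations captured at two eye (camera) placements. The coordinates below are hypothetical, and the comparison is only a sketch of that idea.

```python
import numpy as np

# Hypothetical pixel locations of the same control-point grid, measured with
# the calibration camera at the nominal eye position and at a shifted one.
pts_nominal = np.array([[120, 110], [320, 108], [520, 112],
                        [118, 300], [322, 302], [522, 298]], float)
pts_shifted = np.array([[117, 111], [321, 107], [526, 115],
                        [114, 302], [323, 301], [529, 295]], float)

# Per-point displacement vectors describe how the distortion pattern changes
# when the eye moves from one position to the other.
delta = pts_shifted - pts_nominal
print(np.linalg.norm(delta, axis=1))  # displacement magnitude per control point (pixels)
```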
ieee virtual reality conference | 2015
Elham Ebrahimi; Bliss M. Altenhoff; Christopher C. Pagano; Sabarish V. Babu; J. Adam Jones
We report the results of an empirical evaluation examining the carryover effects of calibration to one of three perturbations of visual and proprioceptive feedback: i) a Minus condition (-20% gain) in which a visual stylus appeared at 80% of the distance of a physical stylus, ii) a Neutral condition (0% gain) in which the visual stylus was co-located with the physical stylus, and iii) a Plus condition (+20% gain) in which the visual stylus appeared at 120% of the distance of the physical stylus. Feedback was shown to calibrate distance judgments quickly within an immersive virtual environment (IVE), with estimates being farthest after calibrating to visual information appearing nearer (Minus condition) and nearest after calibrating to visual information appearing farther (Plus condition).
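The gain manipulation can be stated compactly: the visual stylus is rendered along the line from the viewpoint to the physical stylus at a scaled distance. The sketch below assumes the scaling is applied from the viewpoint; the paper may define the offset differently.

```python
import numpy as np

def visual_stylus_position(physical_pos, eye_pos, gain):
    """Place the rendered stylus along the eye-to-stylus line, scaled by
    (1 + gain): gain = -0.2 renders it at 80% of the physical distance
    (Minus), 0.0 co-locates it (Neutral), and +0.2 renders it at 120% (Plus)."""
    physical_pos = np.asarray(physical_pos, float)
    eye_pos = np.asarray(eye_pos, float)
    return eye_pos + (1.0 + gain) * (physical_pos - eye_pos)

# Hypothetical example (meters, viewpoint at the origin, stylus 0.5 m ahead).
print(visual_stylus_position([0.0, 0.0, -0.5], [0.0, 0.0, 0.0], -0.2))  # z = -0.4
```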
ieee virtual reality conference | 2014
J. Adam Jones; David M. Krum; Mark T. Bolas
In this document we discuss a study that investigates the effect of eye position on the apparent location of imagery presented in an off-the-shelf head-worn display. We test a range of reasonable eye positions that may result from person-to-person variations in display placement and interpupillary distances. It was observed that the pattern of geometric distortions introduced by the display's optical system changes substantially as the eye moves from one position to the next. These visual displacements can be on the order of several degrees and increase in magnitude toward the peripheral edges of the field of view. Though many systems calibrate for interpupillary distance and optical distortions separately, this may be insufficient, as eye position influences distortion characteristics.
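Because the reported displacements are given in degrees of visual angle rather than pixels, a small conversion is needed when working from pixel measurements. The sketch below assumes a simple rectilinear projection with a known focal length in pixels; all numbers are illustrative.

```python
import math

def offset_in_degrees(eccentricity_px, offset_px, focal_px):
    """Visual angle (degrees) subtended by a pixel displacement `offset_px`
    measured `eccentricity_px` from the optical center, for a rectilinear
    projection with focal length `focal_px` (pixels)."""
    a0 = math.atan(eccentricity_px / focal_px)
    a1 = math.atan((eccentricity_px + offset_px) / focal_px)
    return math.degrees(a1 - a0)

# Illustrative numbers: the same 30-pixel shift near the center and near the
# edge of a display with a ~600-pixel focal length.
print(offset_in_degrees(0, 30, 600))    # about 2.9 degrees
print(offset_in_degrees(500, 30, 600))  # about 1.7 degrees
```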
Archive | 2014
Mark T. Bolas; J. Adam Jones; Ian E. McDowall; Evan A. Suma
acm symposium on applied perception | 2014
Elham Ebrahimi; Bliss M. Altenhoff; Leah S. Hartman; J. Adam Jones; Sabarish V. Babu; Christopher C. Pagano; Timothy A. Davis
symposium on 3d user interfaces | 2016
J. Adam Jones; Darlene E. Edewaard; Richard A. Tyrrell; Larry F. Hodges
Archive | 2014
J. Adam Jones; Mark T. Bolas; David M. Krum