Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gurjot Singh is active.

Publication


Featured research published by Gurjot Singh.


Applied Perception in Graphics and Visualization | 2008

The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception

J. Adam Jones; J. Edward Swan; Gurjot Singh; Eric Kolstad; Stephen R. Ellis

A large number of previous studies have shown that egocentric depth perception tends to be underestimated in virtual reality (VR) - objects appear smaller and farther away than they should. Various theories as to why this might occur have been investigated, but to date the cause is not fully understood. A much smaller number of studies have investigated how depth perception operates in augmented reality (AR), and some of these studies have also indicated a similar underestimation effect. In this paper we report an experiment that further investigates these effects. The experiment compared VR and AR conditions to two real-world control conditions, and studied the effect of motion parallax across all conditions. Our combined VR and AR head-mounted display (HMD) allowed us to develop very careful calibration procedures based on real-world calibration widgets, which cannot be replicated with VR-only HMDs. To our knowledge, this is the first study to directly compare VR and AR conditions as part of the same experiment.


Applied Perception in Graphics and Visualization | 2010

Depth judgment measures and occluding surfaces in near-field augmented reality

Gurjot Singh; J. Edward Swan; J. Adam Jones; Stephen R. Ellis

In this paper we describe an apparatus and experiment that measured depth judgments in augmented reality at near-field distances of 34 to 50 centimeters. The experiment compared perceptual matching, a closed-loop task for measuring depth judgments, with blind reaching, a visually open-loop task for measuring depth judgments. The experiment also studied the effect of a highly salient occluding surface appearing behind, coincident with, and in front of a virtual object. The apparatus and closed-loop matching task were based on previous work by Ellis and Menges. The experiment found maximum average depth judgment errors of 5.5 cm, and found that the blind reaching judgments were less accurate than the perceptual matching judgments. The experiment found that the presence of a highly-salient occluding surface has a complicated effect on depth judgments, but does not lead to systematically larger or smaller errors.


Applied Perception in Graphics and Visualization | 2011

Peripheral visual information and its effect on distance judgments in virtual and augmented environments

J. Adam Jones; J. Edward Swan; Gurjot Singh; Stephen R. Ellis

A frequently observed problem in medium-field virtual environments is the underestimation of egocentric depth. This problem has been described numerous times and with widely varying degrees of severity, and although there has been considerable progress made in modifying observer behavior to compensate for these misperceptions, the question of why these errors exist is still an open issue. This paper presents the findings of a series of experiments, comprising 103 participants, that attempts to identify and quantify the source of a pattern of adaptation and improved depth judgment accuracy over time scales of less than one hour. Taken together, these experiments suggest that peripheral visual information is an important source of information for the calibration of movement within medium-field virtual environments.


IEEE Virtual Reality Conference | 2008

The Effects of Virtual Reality, Augmented Reality, and Motion Parallax on Egocentric Depth Perception

A. Jones; J.E. Swan; Gurjot Singh; Eric Kolstad

A large number of previous studies have shown that egocentric depth perception tends to be underestimated in virtual reality (VR) - objects appear smaller and farther away than they should. Various theories as to why this might occur have been investigated, but to date the cause is not fully understood. A much smaller number of studies have investigated how depth perception operates in augmented reality (AR), and some of these studies have also indicated a similar underestimation effect. In this paper we report an experiment that further investigates these effects. The experiment compared VR and AR conditions to two real-world control conditions, and studied the effect of motion parallax across all conditions. Our combined VR and AR head-mounted display (HMD) allowed us to develop very careful calibration procedures based on real-world calibration widgets, which cannot be replicated with VR-only HMDs. To our knowledge, this is the first study to directly compare VR and AR conditions as part of the same experiment.


IEEE Virtual Reality Conference | 2012

Depth judgments by reaching and matching in near-field augmented reality

Gurjot Singh; J. Edward Swan; J. Adam Jones; Stephen R. Ellis

In this abstract we describe an experiment that measured depth judgments in optical see-through augmented reality (AR) at near-field reaching distances of ~24 to ~56 cm. The 2×2 experiment crossed two depth judgment tasks, perceptual matching and blind reaching, with two different environments, a real-world environment and an augmented reality environment. We designed a task that used a direct reaching gesture at constant percentages of each participant's maximum reach; our task was inspired by previous work by Tresilian and Mon-Williams [6] that found very accurate blind reaching results in a real-world environment.
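A minimal sketch, in Python, of how targets placed at constant percentages of a participant's maximum reach could be computed; the reach fractions and the 60 cm maximum reach are hypothetical illustration values, not parameters from the paper.

# Hypothetical reach fractions; the paper reports only the resulting ~24 to ~56 cm range.
REACH_FRACTIONS = (0.40, 0.55, 0.70, 0.85)

def target_distances(max_reach_cm, fractions=REACH_FRACTIONS):
    """Scale each fraction by the participant's maximum reach to get target distances (cm)."""
    return [round(max_reach_cm * f, 1) for f in fractions]

# Example: a participant with a 60 cm maximum reach gets near-field targets at
# 24.0, 33.0, 42.0, and 51.0 cm.
print(target_distances(60.0))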


International Symposium on Mixed and Augmented Reality | 2015

CI-Spy: Designing A Mobile Augmented Reality System for Scaffolding Historical Inquiry Learning

Gurjot Singh; Doug A. Bowman; David Hicks; David Cline; J. Todd Ogle; Aaron Johnson; Rosemary Zlokas; Thomas W. Tucker; Eric D. Ragan

We present a markerless 3D augmented reality application for virtual accessory try-on around the human arm. The system is based on a Kinect sensor and a multi-layer rendering framework that renders RGB data, depth data, and 3D models of accessories simultaneously. The aim is to support realistic visualization of virtual objects around the human arm, by detecting wrist pose and handling occlusion, for interactive marketing and retail applications such as virtual watch try-on.


IEEE Virtual Reality Conference | 2011

Depth judgment tasks and environments in near-field augmented reality

Gurjot Singh; J. Edward Swan; J. Adam Jones; Stephen R. Ellis

In this poster abstract we describe an experiment that measured depth judgments in optical see-through augmented reality at near-field distances of 34 to 50 centimeters. The experiment compared two depth judgment tasks: perceptual matching, a closed-loop task, and blind reaching, a visually open-loop task. The experiment tested each of these tasks in both a real-world environment and an augmented reality environment, and used a between-subjects design that included 40 participants. The experiment found that matching judgments were very accurate in the real world, with errors on the order of millimeters and very little variance. In contrast, matching judgments in augmented reality showed a linear trend of increasing overestimation with increasing distance, with a mean overestimation of ∼1 cm. With reaching judgments, participants underestimated by ∼4.5 cm in both augmented reality and the real world. We also discovered and solved a calibration problem that arises at near-field distances.
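A minimal sketch, in Python, of how such a signed-error trend could be quantified; the target distances and matching judgments below are hypothetical values chosen only so the mean error lands near the ∼1 cm overestimation noted above, not measurements from the study.

import numpy as np

# Hypothetical target distances and matching judgments (cm); not data from the study.
true_cm = np.array([34.0, 38.0, 42.0, 46.0, 50.0])
judged_cm = np.array([34.4, 38.7, 43.0, 47.3, 51.6])

# Signed error: positive values mean the judged distance overestimates the true distance.
error_cm = judged_cm - true_cm

# Linear trend of error versus distance, and the mean signed error.
slope, intercept = np.polyfit(true_cm, error_cm, 1)
print(f"mean signed error: {error_cm.mean():.2f} cm")        # about 1 cm overestimation
print(f"error grows by about {slope:.3f} cm per cm of distance")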


IEEE Virtual Reality Conference | 2011

Peripheral visual information and its effect on the perception of egocentric depth in virtual and augmented environments

J. Adam Jones; J. Edward Swan; Gurjot Singh; Stephen R. Ellis

A frequently observed problem in virtual environments is the underestimation of egocentric depth. This problem has been described numerous times and with widely varying degrees of severity. Though there has been considerable progress made in modifying observer behavior to compensate for these misperceptions, the question of why these errors exist is still an open issue. The study detailed in this document presents the preliminary findings of a large, between-subjects experiment (N=98) that attempts to identify and quantify the source of a pattern of adaptation and improved accuracy in the absence of explicit feedback found in Jones et al. [1].


ACM Symposium on Applied Perception | 2012

Improvements in visually directed walking in virtual environments cannot be explained by changes in gait alone

J. Adam Jones; J. Edward Swan; Gurjot Singh; Sujan Reddy; Kenneth R. Moser; Chunya Hua; Stephen R. Ellis


IEEE Transactions on Visualization and Computer Graphics | 2015

Matching and Reaching Depth Judgments with Real and Augmented Reality Targets

J. Edward Swan; Gurjot Singh; Stephen R. Ellis

Collaboration


Dive into Gurjot Singh's collaborations.

Top Co-Authors

J. Edward Swan, Mississippi State University
J. Adam Jones, Mississippi State University
Eric Kolstad, Mississippi State University
A. Jones, Mississippi State University
Chunya Hua, Mississippi State University